Sample records for sign language learning

  1. SignMT: An Alternative Language Learning Tool

    ERIC Educational Resources Information Center

    Ditcharoen, Nadh; Naruedomkul, Kanlaya; Cercone, Nick

    2010-01-01

    Learning a second language is very difficult, especially for people with disabilities; a disability may be a barrier to learning and to using information written in text form. We present SignMT, a Thai sign language to Thai text machine translation system, which is able to translate Thai Sign Language into Thai text. In the translation process, SignMT takes into…

  2. Sign language comprehension: the case of Spanish sign language.

    PubMed

    Rodríguez Ortiz, I R

    2008-01-01

    This study aims to answer the question of how much of Spanish Sign Language interpreting deaf individuals really understand. The sample included 36 deaf people (deafness ranging from severe to profound; variety depending on the age at which they learned sign language) and 36 hearing people with good knowledge of sign language (most were interpreters). Sign language comprehension was assessed using passages at secondary school level. After being exposed to the passages, the participants had to tell what they had understood about them, answer a set of related questions, and offer a title for each passage. Sign language comprehension by the deaf participants was quite acceptable, but not as good as that of the hearing signers, who, unlike the deaf participants, were not only late learners of sign language as a second language but had also learned it through formal training.

  3. Discourses of prejudice in the professions: the case of sign languages

    PubMed Central

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-01-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected. PMID:28280057

  4. Facilitating Exposure to Sign Languages of the World: The Case for Mobile Assisted Language Learning

    ERIC Educational Resources Information Center

    Parton, Becky Sue

    2014-01-01

    Foreign sign language instruction is an important, but overlooked area of study. Thus the purpose of this paper was two-fold. First, the researcher sought to determine the level of knowledge and interest in foreign sign language among Deaf teenagers along with their learning preferences. Results from a survey indicated that over a third of the…

  5. Languages Are More than Words: Spanish and American Sign Language in Early Childhood Settings

    ERIC Educational Resources Information Center

    Sherman, Judy; Torres-Crespo, Marisel N.

    2015-01-01

    Capitalizing on preschoolers' inherent enthusiasm and capacity for learning, the authors developed and implemented a dual-language program to enable young children to experience diversity and multiculturalism by learning two new languages: Spanish and American Sign Language. Details of the curriculum, findings, and strategies are shared.

  6. An Investigation into the Relationship of Foreign Language Learning Motivation and Sign Language Use among Deaf and Hard of Hearing Hungarians

    ERIC Educational Resources Information Center

    Kontra, Edit H.; Csizer, Kata

    2013-01-01

    The aim of this study is to point out the relationship between foreign language learning motivation and sign language use among hearing impaired Hungarians. In the article we concentrate on two main issues: first, to what extent hearing impaired people are motivated to learn foreign languages in a European context; second, to what extent sign…

  7. The Road to Language Learning Is Not Entirely Iconic: Iconicity, Neighborhood Density, and Frequency Facilitate Acquisition of Sign Language.

    PubMed

    Caselli, Naomi K; Pyers, Jennie E

    2017-07-01

    Iconic mappings between words and their meanings are far more prevalent than once estimated and seem to support children's acquisition of new words, spoken or signed. We asked whether iconicity's prevalence in sign language overshadows two other factors known to support the acquisition of spoken vocabulary: neighborhood density (the number of lexical items phonologically similar to the target) and lexical frequency. Using mixed-effects logistic regressions, we reanalyzed 58 parental reports of native-signing deaf children's productive acquisition of 332 signs in American Sign Language (ASL; Anderson & Reilly, 2002) and found that iconicity, neighborhood density, and lexical frequency independently facilitated vocabulary acquisition. Despite differences in iconicity and phonological structure between signed and spoken language, signing children, like children learning a spoken language, track statistical information about lexical items and their phonological properties and leverage this information to expand their vocabulary.

  8. Benefits of augmentative signs in word learning: Evidence from children who are deaf/hard of hearing and children with specific language impairment.

    PubMed

    van Berkel-van Hoof, Lian; Hermans, Daan; Knoors, Harry; Verhoeven, Ludo

    2016-12-01

    Augmentative signs may facilitate word learning in children with vocabulary difficulties, for example, children who are Deaf/Hard of Hearing (DHH) and children with Specific Language Impairment (SLI). Although augmentative signs may aid second language learning in populations with typical language development, empirical evidence in favor of this claim is lacking. We aim to investigate whether augmentative signs facilitate word learning for DHH children, children with SLI, and typically developing (TD) children. Whereas previous studies taught children new labels for familiar objects, the present study taught new labels for new objects. In our word learning experiment children were presented with pictures of imaginary creatures and pseudo words. Half of the words were accompanied by an augmentative pseudo sign. The children were tested for their receptive word knowledge. The DHH children benefitted significantly from augmentative signs, but the children with SLI and their TD age-matched peers did not score significantly differently on words from the sign and no-sign conditions. These results suggest that using Sign-Supported Speech in classrooms of bimodal bilingual DHH children may support their spoken language development. The difference between earlier research findings and the present results may be caused by a difference in methodology.

  9. Discourses of prejudice in the professions: the case of sign languages.

    PubMed

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-09-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected.

  10. [Information technology in learning sign language].

    PubMed

    Hernández, Cesar; Pulido, Jose L; Arias, Jorge E

    2015-01-01

    The aim was to develop a technological tool that improves initial sign language learning in hearing-impaired children. The research was conducted in three phases: requirements gathering; design and development of the proposed device; and validation and evaluation of the device. Through the use of information technology, and with the advice of special education professionals, we developed an electronic device that facilitates the learning of sign language in deaf children. It consists mainly of a graphic touch screen, a voice synthesizer, and a voice recognition system. Validation was performed with deaf children at the Filadelfia School in the city of Bogotá. A learning methodology was established that improves learning times through a small, portable, lightweight, and educational technological prototype. Tests showed the effectiveness of this prototype, achieving a 32% reduction in the initial sign language learning time for deaf children.

  11. Child Modifiability as a Predictor of Language Abilities in Deaf Children Who Use American Sign Language.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2015-08-01

    This research explored the use of dynamic assessment (DA) for language-learning abilities in signing deaf children from deaf and hearing families. Thirty-seven deaf children, aged 6 to 11 years, were identified as either stronger (n = 26) or weaker (n = 11) language learners according to teacher or speech-language pathologist report. All children received 2 scripted, mediated learning experience sessions targeting vocabulary knowledge—specifically, the use of semantic categories that were carried out in American Sign Language. Participant responses to learning were measured in terms of an index of child modifiability. This index was determined separately at the end of the 2 individual sessions. It combined ratings reflecting each child's learning abilities and responses to mediation, including social-emotional behavior, cognitive arousal, and cognitive elaboration. Group results showed that modifiability ratings were significantly better for stronger language learners than for weaker language learners. The strongest predictors of language ability were cognitive arousal and cognitive elaboration. Mediator ratings of child modifiability (i.e., combined score of social-emotional factors and cognitive factors) are highly sensitive to language-learning abilities in deaf children who use sign language as their primary mode of communication. This method can be used to design targeted interventions.

  12. We Need to Communicate! Helping Hearing Parents of Deaf Children Learn American Sign Language

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2011-01-01

    Language immersion from birth is crucial to a child's language development. However, language immersion can be particularly challenging for hearing parents of deaf children to provide as they may have to overcome many difficulties while learning American Sign Language (ASL). We are in the process of creating a mobile application to help hearing…

  13. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    ERIC Educational Resources Information Center

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further,…

  14. Mapping language to the world: the role of iconicity in the sign language input.

    PubMed

    Perniss, Pamela; Lu, Jenny C; Morgan, Gary; Vigliocco, Gabriella

    2018-03-01

    Most research on the mechanisms underlying referential mapping has assumed that learning occurs in ostensive contexts, where label and referent co-occur, and that form and meaning are linked by arbitrary convention alone. In the present study, we focus on iconicity in language, that is, resemblance relationships between form and meaning, and on non-ostensive contexts, where label and referent do not co-occur. We approach the question of language learning from the perspective of the language input. Specifically, we look at child-directed language (CDL) in British Sign Language (BSL), a language rich in iconicity due to the affordances of the visual modality. We ask whether child-directed signing exploits iconicity in the language by highlighting the similarity mapping between form and referent. We find that CDL modifications occur more often with iconic signs than with non-iconic signs. Crucially, for iconic signs, modifications are more frequent in non-ostensive contexts than in ostensive contexts. Furthermore, we find that pointing dominates in ostensive contexts, and suggest that caregivers adjust the semiotic resources recruited in CDL to context. These findings offer first evidence for a role of iconicity in the language input and suggest that iconicity may be involved in referential mapping and language learning, particularly in non-ostensive contexts. © 2017 John Wiley & Sons Ltd.

  15. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately…

  16. Sign Language and Hearing Preschoolers.

    ERIC Educational Resources Information Center

    Reynolds, Kate E.

    1995-01-01

    Notes that sign language is the third most used second language in the United States and that early childhood is an ideal language-learning time. Describes the experiences of one preschool where American Sign Language has become an integral part of the curriculum. Includes guiding principles, classroom do's and don'ts, and a resource list of…

  17. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Abstract Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development. PMID:28961874

  18. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-10-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development.

  19. Sign Language and Language Acquisition in Man and Ape. New Dimensions in Comparative Pedolinguistics.

    ERIC Educational Resources Information Center

    Peng, Fred C. C., Ed.

    A collection of research materials on sign language and primatology is presented here. The essays attempt to show that: sign language is a legitimate language that can be learned not only by humans but by nonhuman primates as well, and nonhuman primates have the capability to acquire a human language using a different mode. The following…

  20. The Effectiveness of the Game-Based Learning System for the Improvement of American Sign Language Using Kinect

    ERIC Educational Resources Information Center

    Kamnardsiri, Teerawat; Hongsit, Ler-on; Khuwuthyakorn, Pattaraporn; Wongta, Noppon

    2017-01-01

    This paper investigated students' achievement for learning American Sign Language (ASL), using two different methods. There were two groups of samples. The first experimental group (Group A) was the game-based learning for ASL, using Kinect. The second control learning group (Group B) was the traditional face-to-face learning method, generally…

  1. Why American Sign Language Gloss Must Matter

    ERIC Educational Resources Information Center

    Supalla, Samuel J.; Cripps, Jody H.; Byrne, Andrew P. J.

    2017-01-01

    Responding to an article by Grushkin (EJ1174123) on how deaf children best learn to read, published, along with the present article, in an "American Annals of the Deaf" special issue, the authors review American Sign Language gloss. Topics include how ASL gloss enables deaf children to learn to read in their own language and…

  2. Exploring the use of dynamic language assessment with deaf children, who use American Sign Language: Two case studies.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2014-01-01

    We describe a model for assessment of lexical-semantic organization skills in American Sign Language (ASL) within the framework of dynamic vocabulary assessment and discuss the applicability and validity of the use of mediated learning experiences (MLE) with deaf signing children. Two elementary students (ages 7;6 and 8;4) completed a set of four vocabulary tasks and received two 30-minute mediations in ASL. Each session consisted of several scripted activities focusing on the use of categorization. Both had experienced difficulties in providing categorically related responses in one of the vocabulary tasks used previously. Results showed that the two students exhibited notable differences with regard to their learning pace, information uptake, and the effort required of the mediator. Furthermore, we observed signs of a shift in strategic behavior by the lower performing student during the second mediation. Results suggest that the use of dynamic assessment procedures in a vocabulary context was helpful in understanding children's strategies as related to learning potential. These results are discussed in terms of deaf children's cognitive modifiability, with implications for planning instruction and for how MLE can be used with a population that uses ASL. The reader will (1) recognize the challenges in appropriate language assessment of deaf signing children; (2) recall the three areas explored to investigate whether a dynamic assessment approach is sensitive to differences in deaf signing children's language learning profiles; and (3) discuss how dynamic assessment procedures can make deaf signing children's individual language learning differences visible.

  3. Cultural transmission through infant signs: Objects and actions in U.S. and Taiwan.

    PubMed

    Wang, Wen; Vallotton, Claire

    2016-08-01

    Infant signs are intentionally taught/learned symbolic gestures which can be used to represent objects, actions, requests, and mental states. Through infant signs, parents and infants begin to communicate specific concepts earlier than children's first spoken language. This study examines whether cultural differences in language are reflected in children's and parents' use of infant signs. Parents speaking East Asian languages with their children utilize verbs more often than do English-speaking mothers; and compared to their English-learning peers, Chinese children are more likely to learn verbs as they first acquire spoken words. By comparing parents' and infants' use of infant signs in the U.S. and Taiwan, we investigate cultural differences in noun/object versus verb/action bias before children's first language. Parents reported their own and their children's use of first infant signs retrospectively. Results show that cultural differences in parents' and children's infant sign use were consistent with research on early words, reflecting cultural differences in communication functions (referential versus regulatory) and child-rearing goals (independent versus interdependent). The current study provides evidence that intergenerational transmission of culture through symbols begins prior to oral language.

  4. Comic Books: A Learning Tool for Meaningful Acquisition of Written Sign Language

    ERIC Educational Resources Information Center

    Guimarães, Cayley; Oliveira Machado, Milton César; Fernandes, Sueli F.

    2018-01-01

    Deaf people use Sign Language (SL) for intellectual development, communication, and other human activities that are mediated by language, such as the expression of complex and abstract thoughts and feelings, and for literature, culture, and knowledge. Brazilian Sign Language (Libras) is a complete linguistic system of a visual-spatial nature,…

  5. Designing an American Sign Language Avatar for Learning Computer Science Concepts for Deaf or Hard-of-Hearing Students and Deaf Interpreters

    ERIC Educational Resources Information Center

    Andrei, Stefan; Osborne, Lawrence; Smith, Zanthia

    2013-01-01

    The current learning process of Deaf or Hard of Hearing (D/HH) students taking Science, Technology, Engineering, and Mathematics (STEM) courses needs, in general, a sign interpreter for the translation of English text into American Sign Language (ASL) signs. This method is at best impractical due to the lack of availability of a specialized sign…

  6. Learning an Embodied Visual Language: Four Imitation Strategies Available to Sign Learners

    PubMed Central

    Shield, Aaron; Meier, Richard P.

    2018-01-01

    The parts of the body that are used to produce and perceive signed languages (the hands, face, and visual system) differ from those used to produce and perceive spoken languages (the vocal tract and auditory system). In this paper we address two factors that have important consequences for sign language acquisition. First, there are three types of lexical signs: one-handed, two-handed symmetrical, and two-handed asymmetrical. Natural variation in hand dominance in the population leads to varied input to children learning sign. Children must learn that signs are not specified for the right or left hand but for dominant and non-dominant. Second, we posit that children have at least four imitation strategies available for imitating signs: anatomical (Activate the same muscles as the sign model), which could lead learners to inappropriately use their non-dominant hand; mirroring (Produce a mirror image of the modeled sign), which could lead learners to produce lateral movement reversal errors or to use the non-dominant hand; visual matching (Reproduce what you see from your perspective), which could lead learners to produce inward–outward movement and palm orientation reversals; and reversing (Reproduce what the sign model would see from his/her perspective). This last strategy is the only one that always yields correct phonological forms in signed languages. To test our hypotheses, we turn to evidence from typical and atypical hearing and deaf children as well as from typical adults; the data come from studies of both sign acquisition and gesture imitation. Specifically, we posit that all children initially use a visual matching strategy but typical children switch to a mirroring strategy sometime in the second year of life; typical adults tend to use a mirroring strategy in learning signs and imitating gestures. By contrast, children and adults with autism spectrum disorder (ASD) appear to use the visual matching strategy well into childhood or even adulthood. 
Finally, we present evidence that sign language exposure changes how adults imitate gestures, switching from a mirroring strategy to the correct reversal strategy. These four strategies for imitation do not exist in speech and as such constitute a unique problem for research in language acquisition. PMID:29899716

  7. Mobile Sign Language Learning Outside the Classroom

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2012-01-01

    The majority of deaf children in the United States are born to hearing parents with limited prior exposure to American Sign Language (ASL). Our research involves creating and validating a mobile language tool called SMARTSign. The goal is to help hearing parents learn ASL in a way that fits seamlessly into their daily routine. (Contains 3 figures.)

  8. ME . . . ME . . . WASHOE: An Appreciation

    ERIC Educational Resources Information Center

    King, Barbara J.

    2008-01-01

    Washoe, the chimpanzee pioneer who learned aspects of American Sign Language, died in October 2007. In reviewing her life and accomplishments, this article focuses on Washoe's status as an ape and a person, and on the role of emotion in language learning and language use. It argues that Washoe's legacy stems not from the number of ASL signs she…

  9. The road to language learning is iconic: evidence from British Sign Language.

    PubMed

    Thompson, Robin L; Vinson, David P; Woll, Bencie; Vigliocco, Gabriella

    2012-12-01

    An arbitrary link between linguistic form and meaning is generally considered a universal feature of language. However, iconic (i.e., nonarbitrary) mappings between properties of meaning and features of linguistic form are also widely present across languages, especially signed languages. Although recent research has shown a role for sign iconicity in language processing, research on the role of iconicity in sign-language development has been mixed. In this article, we present clear evidence that iconicity plays a role in sign-language acquisition for both the comprehension and production of signs. Signed languages were taken as a starting point because they tend to encode a higher degree of iconic form-meaning mappings in their lexicons than spoken languages do, but our findings are more broadly applicable: Specifically, we hypothesize that iconicity is fundamental to all languages (signed and spoken) and that it serves to bridge the gap between linguistic form and human experience.

  10. Signs as Pictures and Signs as Words: Effect of Language Knowledge on Memory for New Vocabulary.

    ERIC Educational Resources Information Center

    Siple, Patricia; And Others

    1982-01-01

    The role of sensory attributes in a vocabulary learning task was investigated for a non-oral language using deaf and hearing individuals, more or less skilled in the use of sign language. Skilled signers encoded invented signs in terms of linguistic structure rather than as visual-pictorial events. (Author/RD)

  11. Benefits of Sign Language Interpreting and Text Alternatives for Deaf Students' Classroom Learning

    ERIC Educational Resources Information Center

    Marschark, Marc; Leigh, Greg; Sapere, Patricia; Burnham, Denis; Convertino, Carol; Stinson, Michael; Knoors, Harry; Vervloed, Mathijs P. J.; Noble, William

    2006-01-01

    Four experiments examined the utility of real-time text in supporting deaf students' learning from lectures in postsecondary (Experiments 1 and 2) and secondary classrooms (Experiments 3 and 4). Experiment 1 compared the effects on learning of sign language interpreting, real-time text (C-Print), and both. Real-time text alone led to significantly…

  12. The Multimedia Dictionary of American Sign Language: Learning Lessons About Language, Technology, and Business.

    ERIC Educational Resources Information Center

    Wilcox, Sherman

    2003-01-01

    Reports on the Multimedia Dictionary of American Sign Language, which was conceived in the late 1980s as a melding of the pioneering work in American Sign Language lexicography that had been carried out decades earlier and the newly emerging computer technologies that were integrating use of graphical user-interface designs, rapidly…

  13. HAPPEN CAN'T HEAR: An Analysis of Code-Blends in Hearing, Native Signers of American Sign Language

    ERIC Educational Resources Information Center

    Bishop, Michele

    2011-01-01

    Hearing native signers often learn sign language as their first language and acquire features that are characteristic of sign languages but are not present in equivalent ways in English (e.g., grammatical facial expressions and the structured use of space for setting up tokens and surrogates). Previous research has indicated that bimodal…

  14. American Sign Language Curricula: A Review

    ERIC Educational Resources Information Center

    Rosen, Russell S.

    2010-01-01

    There has been exponential growth in the number of schools that offer American Sign Language (ASL) for foreign language credit and in the number of ASL curricula published. This study analyzes different curricula in their assumptions regarding language, learning, and teaching of second languages. It is found that curricula vary in their…

  15. Using Sign Language in Your Classroom.

    ERIC Educational Resources Information Center

    Lawrence, Constance D.

    This paper reviews the research on use of American Sign Language in elementary classes that do not include children with hearing impairment and also reports on the use of the manual sign language alphabet in a primary class learning the phonetic sounds of the alphabet. The research reported is overwhelmingly positive in support of using sign…

  16. Using the Linguistic Landscape to Bridge Languages

    ERIC Educational Resources Information Center

    Mari, Vanessa

    2018-01-01

    In this article Vanessa Mari describes how she uses the linguistic landscape to bridge two or more languages with students learning English. The linguistic landscape is defined by Landry and Bourhis (1997, 25) as "the language of public road signs, advertising billboards, street names, place names, commercial shop signs, and public signs on…

  17. Signs of Resistance: Peer Learning of Sign Languages within "Oral" Schools for the Deaf

    ERIC Educational Resources Information Center

    Anglin-Jaffe, Hannah

    2013-01-01

    This article explores the role of the Deaf child as peer educator. In schools where sign languages were banned, Deaf children became the educators of their Deaf peers in a number of contexts worldwide. This paper analyses how this peer education of sign language worked in context by drawing on two examples from boarding schools for the deaf in…

  18. Are deaf students' reading challenges really about reading?

    PubMed

    Marschark, Marc; Sapere, Patricia; Convertino, Carol M; Mayer, Connie; Wauters, Loes; Sarchet, Thomastine

    2009-01-01

    Reading achievement among deaf students typically lags significantly behind that of hearing peers, a situation that has changed little despite decades of research. This lack of progress, together with recent findings indicating that deaf students face many of the same challenges in comprehending sign language as they do in comprehending text, suggests that difficulties frequently observed in their learning from text may involve more than just reading. Two experiments examined college students' learning of material from science texts. Passages were presented to deaf (signing) students in print or American Sign Language and to hearing students in print or auditorially. Several measures of learning indicated that the deaf students learned as much or more from print as they did from sign language, but less than hearing students in both cases. These and other results suggest that challenges to deaf students' reading comprehension may be more complex than is generally assumed.

  19. Iconicity and Sign Lexical Acquisition: A Review

    PubMed Central

    Ortega, Gerardo

    2017-01-01

    The study of iconicity, defined as the direct relationship between a linguistic form and its referent, has gained momentum in recent years across a wide range of disciplines. In the spoken modality, there is abundant evidence showing that iconicity is a key factor that facilitates language acquisition. However, when we look at sign languages, which excel in the prevalence of iconic structures, there is a more mixed picture, with some studies showing a positive effect and others showing a null or negative effect. In an attempt to reconcile the existing evidence, the present review presents a critical overview of the literature on the acquisition of a sign language as a first (L1) and second (L2) language and points at some factors that may be the source of disagreement. Regarding sign L1 acquisition, the contradictory findings may relate to iconicity being defined in a very broad sense when a more fine-grained operationalisation might reveal an effect in sign learning. Regarding sign L2 acquisition, evidence shows that there is a clear dissociation in the effect of iconicity in that it facilitates conceptual-semantic aspects of sign learning but hinders the acquisition of the exact phonological form of signs. It will be argued that when we consider the gradient nature of iconicity and that signs consist of a phonological form attached to a meaning, we can discern how iconicity impacts sign learning in positive and negative ways. PMID:28824480

  20. Simultaneous Communication Supports Learning in Noise by Cochlear Implant Users

    PubMed Central

    Blom, Helen C.; Marschark, Marc; Machmer, Elizabeth

    2017-01-01

    Objectives: This study sought to evaluate the potential of using spoken language and signing together (simultaneous communication, SimCom, sign-supported speech) as a means of improving speech recognition, comprehension, and learning by cochlear implant users in noisy contexts. Methods: Forty-eight college students who were active cochlear implant users watched videos of three short presentations, the text versions of which were standardized at the 8th-grade reading level. One passage was presented in spoken language only, one was presented in spoken language with multi-talker babble background noise, and one was presented via simultaneous communication with the same background noise. Following each passage, participants responded to 10 (standardized) open-ended questions designed to assess comprehension. Indicators of participants' spoken language and sign language skills were obtained via self-reports and objective assessments. Results: When spoken materials were accompanied by signs, scores were significantly higher than when materials were spoken in noise without signs. Participants' receptive spoken language skills significantly predicted scores in all three conditions; neither their receptive sign skills nor age of implantation predicted performance. Discussion: Students who are cochlear implant users typically rely solely on spoken language in the classroom. The present results, however, suggest that there are potential benefits of simultaneous communication for such learners in noisy settings. For those cochlear implant users who know sign language, the redundancy of speech and signs potentially can offset the reduced fidelity of spoken language in noise. Conclusion: Accompanying spoken language with signs can benefit learners who are cochlear implant users in noisy situations such as classroom settings.
Factors associated with such benefits, such as receptive skills in signed and spoken modalities, classroom acoustics, and material difficulty need to be empirically examined. PMID:28010675

  1. Simultaneous communication supports learning in noise by cochlear implant users.

    PubMed

    Blom, Helen; Marschark, Marc; Machmer, Elizabeth

    2017-01-01

    This study sought to evaluate the potential of using spoken language and signing together (simultaneous communication, SimCom, sign-supported speech) as a means of improving speech recognition, comprehension, and learning by cochlear implant (CI) users in noisy contexts. Forty-eight college students who were active CI users watched videos of three short presentations, the text versions of which were standardized at the 8th-grade reading level. One passage was presented in spoken language only, one was presented in spoken language with multi-talker babble background noise, and one was presented via simultaneous communication with the same background noise. Following each passage, participants responded to 10 (standardized) open-ended questions designed to assess comprehension. Indicators of participants' spoken language and sign language skills were obtained via self-reports and objective assessments. When spoken materials were accompanied by signs, scores were significantly higher than when materials were spoken in noise without signs. Participants' receptive spoken language skills significantly predicted scores in all three conditions; neither their receptive sign skills nor age of implantation predicted performance. Students who are CI users typically rely solely on spoken language in the classroom. The present results, however, suggest that there are potential benefits of simultaneous communication for such learners in noisy settings. For those CI users who know sign language, the redundancy of speech and signs potentially can offset the reduced fidelity of spoken language in noise. Accompanying spoken language with signs can benefit learners who are CI users in noisy situations such as classroom settings. Factors associated with such benefits, such as receptive skills in signed and spoken modalities, classroom acoustics, and material difficulty need to be empirically examined.

  2. Case Studies of Multilingual/Multicultural Asian Deaf Adults: Strategies for Success.

    PubMed

    Wang, Qiuying; Andrews, Jean; Liu, Hsiu Tan; Liu, Chun Jung

    2016-01-01

    Case studies of adult d/Deaf or Hard of Hearing Multilingual Learners (DMLs) are few, especially studies of DMLs who learn more than one sign language and read logographic and alphabetic scripts. To reduce this paucity, two descriptive case studies are presented. Written questionnaires, face-to-face interviews, and self-appraisals of language-use rubrics were used to explore (a) the language and literacy histories of two adult Asian DMLs who had learned multiple languages: Chinese (spoken/written), English (written), Chinese Sign Language, and American Sign Language; and (b) how each language was used in different cultural communities with diverse conversational partners. Home literacy environment, family support, visual access to languages, peer and sibling support, role models, encouragement, perseverance, and Deaf identity all played vital roles in the participants' academic success. The findings provide insights into the acquisition of multiple languages and bi-literacy through social communication and academic content.

  3. Children creating language: how Nicaraguan sign language acquired a spatial grammar.

    PubMed

    Senghas, A; Coppola, M

    2001-07-01

    It has long been postulated that language is not purely learned, but arises from an interaction between environmental exposure and innate abilities. The innate component becomes more evident in rare situations in which the environment is markedly impoverished. The present study investigated the language production of a generation of deaf Nicaraguans who had not been exposed to a developed language. We examined the changing use of early linguistic structures (specifically, spatial modulations) in a sign language that has emerged since the Nicaraguan group first came together: In under two decades, sequential cohorts of learners systematized the grammar of this new sign language. We examined whether the systematicity being added to the language stems from children or adults: our results indicate that such changes originate in children aged 10 and younger. Thus, sequential cohorts of interacting young children collectively possess the capacity not only to learn, but also to create, language.

  4. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon

    PubMed Central

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2014-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf individuals who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sub-lexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  5. Sign Language Use and the Appreciation of Diversity in Hearing Classrooms

    ERIC Educational Resources Information Center

    Brereton, Amy

    2008-01-01

    This article is the result of a year-long study into the effects of sign language use on participation in one mainstream preschool setting. Observations and interviews were the primary data-collection tools used during this investigation. This article focuses on how the use of sign language in the classroom affected the learning community's…

  6. Learning To See: American Sign Language as a Second Language. Language in Education: Theory and Practice 76.

    ERIC Educational Resources Information Center

    Wilcox, Sherman; Wilcox, Phyllis

    During the last decade, the study of American Sign Language (ASL) as a second language has become enormously popular. More and more schools and universities recognize the important role that ASL can play in foreign language education. This monograph provides a comprehensive introduction to the history and structure of ASL, to the Deaf community…

  7. Bimodal bilingualism as multisensory training?: Evidence for improved audiovisual speech perception after sign language exposure.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-15

    The aim of the present study was to characterize the effects of learning a sign language on the processing of a spoken language. Specifically, audiovisual phoneme comprehension was assessed before and after 13 weeks of sign language exposure. Second-language (L2) learners of American Sign Language (ASL) performed this task in the fMRI scanner. Results indicated that the L2 ASL learners' behavioral classification of the speech sounds improved with time compared to hearing nonsigners. Results also indicated increased activation in the supramarginal gyrus (SMG) after sign language exposure, which suggests concomitant increased phonological processing of speech. A multiple regression analysis indicated that learners' ratings of co-sign speech use and lipreading ability were correlated with SMG activation. This pattern of results indicates that the increased use of mouthing and possibly lipreading during sign language acquisition may concurrently improve audiovisual speech processing in budding hearing bimodal bilinguals. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Unsilencing Voices: A Study of Zoo Signs and Their Language of Authority

    ERIC Educational Resources Information Center

    Fogelberg, Katherine

    2014-01-01

    Zoo signs are important for informal learning, but their effect on visitor perception of animals has been sparsely studied. Other studies have established the importance of informal learning in American society; this study discusses zoo signs in the context of such learning. Through the lens of Critical Theory framed by informal learning, and by…

  9. Discriminative exemplar coding for sign language recognition with Kinect.

    PubMed

    Sun, Chao; Zhang, Tianzhu; Bao, Bing-Kun; Xu, Changsheng; Mei, Tao

    2013-10-01

    Sign language recognition is a growing research area in the field of computer vision. A challenge within it is to model various signs, which vary in time resolution, visual manual appearance, and so on. In this paper, we propose a discriminative exemplar coding (DEC) approach, utilizing a Kinect sensor, to model various signs. The proposed DEC method can be summarized in three steps. First, a set of class-specific candidate exemplars is learned from sign language videos in each sign category by considering their discrimination. Then, each sign video is described as a set of similarities between its frames and the candidate exemplars. Instead of simply using a heuristic distance measure, the similarities are decided by a set of exemplar-based classifiers through multiple instance learning, in which a positive (or negative) video is treated as a positive (or negative) bag and those frames similar to the given exemplar in Euclidean space as instances. Finally, we formulate the selection of the most discriminative exemplars into a unified framework and simultaneously produce a sign video classifier to recognize signs. To evaluate our method, we collect an American Sign Language dataset that includes approximately 2000 phrases, each phrase captured by a Kinect sensor with color, depth, and skeleton information. Experimental results on our dataset demonstrate the feasibility and effectiveness of the proposed approach for sign language recognition.
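
    The exemplar-coding step of the pipeline described above (representing each variable-length video as a fixed-length vector of similarities to candidate exemplars) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper derives similarities from exemplar-based classifiers trained via multiple instance learning, whereas this sketch substitutes a plain Gaussian (RBF) similarity over per-frame feature vectors; the `gamma` parameter and the toy features are assumptions for illustration.

```python
import numpy as np

def exemplar_similarity_features(video, exemplars, gamma=1.0):
    """Encode a video as a fixed-length vector: one entry per candidate
    exemplar, holding the best RBF similarity between that exemplar and
    any frame of the video.

    video     : (n_frames, d) array of per-frame feature vectors
                (e.g., skeleton joint coordinates from a Kinect sensor)
    exemplars : list of (d,) arrays, one candidate exemplar frame each
    """
    feats = []
    for ex in exemplars:
        # Squared Euclidean distance from this exemplar to every frame
        sq_dists = np.sum((video - ex) ** 2, axis=1)
        # Keep the similarity of the best-matching frame
        feats.append(np.exp(-gamma * sq_dists).max())
    return np.array(feats)

# Toy demo: a 2-frame "video" and two candidate exemplars.
video = np.array([[0.0, 0.0],
                  [1.0, 1.0]])
exemplars = [np.array([0.0, 0.0]),   # matches frame 0 exactly
             np.array([5.0, 5.0])]   # far from every frame
features = exemplar_similarity_features(video, exemplars)
```

    The resulting fixed-length vectors could then feed any standard classifier, mirroring the paper's final step of jointly selecting discriminative exemplars and training a sign video classifier.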

  10. Co-Creation Learning Procedures: Comparing Interactive Language Lessons for Deaf and Hearing Students.

    PubMed

    Hosono, Naotsune; Inoue, Hiromitsu; Tomita, Yutaka

    2017-01-01

    This paper discusses co-creation learning procedures in second language lessons for deaf students and in sign language lessons given by a deaf lecturer. The analyses focus on the learning procedure and the resulting assessment, taking the disability into consideration. Questionnaire results indicate that ICT-based co-creative learning technologies are effective and efficient and promote spontaneous learning motivation.

  11. Social Interaction Affects Neural Outcomes of Sign Language Learning As a Foreign Language in Adults.

    PubMed

    Yusa, Noriaki; Kim, Jungho; Koizumi, Masatoshi; Sugiura, Motoaki; Kawashima, Ryuta

    2017-01-01

    Children naturally acquire a language in social contexts where they interact with their caregivers. Indeed, research shows that social interaction facilitates lexical and phonological development at the early stages of child language acquisition. It is not clear, however, whether the relationship between social interaction and learning applies to adult second language acquisition of syntactic rules. Does learning second language syntactic rules through social interactions with a native speaker or without such interactions impact behavior and the brain? The current study aims to answer this question. Adult Japanese participants learned a new foreign language, Japanese sign language (JSL), either through a native deaf signer or via DVDs. Neural correlates of acquiring new linguistic knowledge were investigated using functional magnetic resonance imaging (fMRI). The participants in each group were indistinguishable in terms of their behavioral data after the instruction. The fMRI data, however, revealed significant differences in the neural activities between two groups. Significant activations in the left inferior frontal gyrus (IFG) were found for the participants who learned JSL through interactions with the native signer. In contrast, no cortical activation change in the left IFG was found for the group who experienced the same visual input for the same duration via the DVD presentation. Given that the left IFG is involved in the syntactic processing of language, spoken or signed, learning through social interactions resulted in an fMRI signature typical of native speakers: activation of the left IFG. Thus, broadly speaking, availability of communicative interaction is necessary for second language acquisition and this results in observed changes in the brain.

  12. Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network.

    PubMed

    Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi

    2017-01-01

    Sign language is an essential medium for everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in those people should heavily rely on the verbal short-term memory (STM) via sign language. Most previous studies compared neural activations during signed language processing in deaf signers and those during spoken language processing in hearing speakers. For sign language users, it thus remains unclear how visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study investigated neural activation while bilinguals of spoken and signed language were engaged in a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence presented either as written letters or as fingerspelling (4-7 syllables in length) and then held the syllable sequence for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information regardless of the type of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus and inferior parietal lobule, for both letter and fingerspelling conditions. Interestingly, while most participants reported that they relied on phonological memory during maintenance, direct comparisons between letters and fingers revealed strikingly different patterns of neural activation during the same period. Namely, the effortful maintenance of fingerspelling inputs relative to letter inputs activated the left superior parietal lobule and dorsal premotor area, i.e., brain regions known to play a role in visuomotor analysis of hand/arm movements. 
These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the classical left-hemisphere language network.

  13. Infants Learn Baby Signs from Video

    ERIC Educational Resources Information Center

    Dayanim, Shoshana; Namy, Laura L.

    2015-01-01

    There is little evidence that infants learn from infant-oriented educational videos and television programming. This 4-week longitudinal experiment investigated 15-month-olds' (N = 92) ability to learn American Sign Language signs (e.g., patting head for hat) from at-home viewing of instructional video, either with or without parent support,…

  14. How deaf American Sign Language/English bilingual children become proficient readers: an emic perspective.

    PubMed

    Mounty, Judith L; Pucci, Concetta T; Harmon, Kristen C

    2014-07-01

    A primary tenet underlying American Sign Language/English bilingual education for deaf students is that early access to a visual language, developed in conjunction with language planning principles, provides a foundation for literacy in English. The goal of this study is to obtain an emic perspective on bilingual deaf readers transitioning from learning to read to reading to learn. Analysis of 12 interactive, semi-structured interviews identified informal and formal teaching and learning practices in ASL/English bilingual homes and classrooms. These practices value, reinforce, and support the bidirectional acquisition of both languages and provide a strong foundation for literacy. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Using the Hands to Represent Objects in Space: Gesture as a Substrate for Signed Language Acquisition.

    PubMed

    Janke, Vikki; Marshall, Chloë R

    2017-01-01

    An ongoing issue of interest in second language research concerns what transfers from a speaker's first language to their second. For learners of a sign language, gesture is a potential substrate for transfer. Our study provides a novel test of gestural production by eliciting silent gesture from novices in a controlled environment. We focus on spatial relationships, which in sign languages are represented in a very iconic way using the hands, and which one might therefore predict to be easy for adult learners to acquire. However, a previous study by Marshall and Morgan (2015) revealed that this was only partly the case: in a task that required them to express the relative locations of objects, hearing adult learners of British Sign Language (BSL) could represent objects' locations and orientations correctly, but had difficulty selecting the correct handshapes to represent the objects themselves. If hearing adults are indeed drawing upon their gestural resources when learning sign languages, then their difficulties may have stemmed from their having in manual gesture only a limited repertoire of handshapes to draw upon, or, alternatively, from having too broad a repertoire. If the first hypothesis is correct, the challenge for learners is to extend their handshape repertoire, but if the second is correct, the challenge is instead to narrow down to the handshapes appropriate for that particular sign language. Thirty sign-naïve hearing adults were tested on Marshall and Morgan's task. All used some handshapes that were different from those used by native BSL signers and learners, and the set of handshapes used by the group as a whole was larger than that employed by native signers and learners.
Our findings suggest that a key challenge when learning to express locative relations might be reducing from a very large set of gestural resources, rather than supplementing a restricted one, in order to converge on the conventionalized classifier system that forms part of the grammar of the language being learned.

  16. Language as a multimodal phenomenon: implications for language learning, processing and evolution

    PubMed Central

    Vigliocco, Gabriella; Perniss, Pamela; Vinson, David

    2014-01-01

    Our understanding of the cognitive and neural underpinnings of language has traditionally been firmly based on spoken Indo-European languages and on language studied as speech or text. However, in face-to-face communication, language is multimodal: speech signals are invariably accompanied by visual information on the face and in manual gestures, and sign languages deploy multiple channels (hands, face and body) in utterance construction. Moreover, the narrow focus on spoken Indo-European languages has entrenched the assumption that language is composed wholly of an arbitrary system of symbols and rules. However, iconicity (i.e. resemblance between aspects of communicative form and meaning) is also present: speakers use iconic gestures when they speak; many non-Indo-European spoken languages exhibit a substantial amount of iconicity in word forms and, finally, iconicity is the norm, rather than the exception in sign languages. This introduction provides the motivation for taking a multimodal approach to the study of language learning, processing and evolution, and discusses the broad implications of shifting our current dominant approaches and assumptions to encompass multimodal expression in both signed and spoken languages. PMID:25092660

  17. The Beneficial Role of L1 Spoken Language Skills on Initial L2 Sign Language Learning: Cognitive and Linguistic Predictors of M2L2 Acquisition

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Darcy, Isabelle; Newman, Sharlene D.

    2017-01-01

    Understanding how language modality (i.e., signed vs. spoken) affects second language outcomes in hearing adults is important both theoretically and pedagogically, as it can determine the specificity of second language (L2) theory and inform how best to teach a language that uses a new modality. The present study investigated which…

  18. Categorical Coding of Manual & English Alphabet Characters by Beginning Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Koenig, Teresa J.

    1990-01-01

    Analysis of the performance of beginning American Sign Language students, who had only recently learned the manual alphabet, on a task in which proactive interference would build up rapidly on successive trials, supported the view that different languages have separate memory stores. (Author/CB)

  19. A Preliminary Study on Interpreting for Emergent Signers

    ERIC Educational Resources Information Center

    Smith, Caitlin; Dicus, Danica

    2015-01-01

    Sign language interpreters work with a variety of consumer populations throughout their careers. One such population, referred to as "emergent signers," consists of consumers who are in the process of learning American Sign Language, and who rely on interpreters during their language acquisition period. A gap in the research is revealed…

  20. Early Sign Language Exposure and Cochlear Implantation Benefits.

    PubMed

    Geers, Ann E; Mitchell, Christine M; Warner-Czyz, Andrea; Wang, Nae-Yuh; Eisenberg, Laurie S

    2017-07-01

    Most children with hearing loss who receive cochlear implants (CI) learn spoken language, and parents must choose early on whether to use sign language to accompany speech at home. We address whether parents' use of sign language before and after CI positively influences auditory-only speech recognition, speech intelligibility, spoken language, and reading outcomes. Three groups of children with CIs from a nationwide database who differed in the duration of early sign language exposure provided in their homes were compared in their progress through elementary grades. The groups did not differ in demographic, auditory, or linguistic characteristics before implantation. Children without early sign language exposure achieved better speech recognition skills over the first 3 years postimplant and exhibited a statistically significant advantage in spoken language and reading near the end of elementary grades over children exposed to sign language. Over 70% of children without sign language exposure achieved age-appropriate spoken language compared with only 39% of those exposed for 3 or more years. Early speech perception predicted speech intelligibility in middle elementary grades. Children without sign language exposure produced speech that was more intelligible (mean = 70%) than those exposed to sign language (mean = 51%). This study provides the most compelling support yet available in CI literature for the benefits of spoken language input for promoting verbal development in children implanted by 3 years of age. Contrary to earlier published assertions, there was no advantage to parents' use of sign language either before or after CI. Copyright © 2017 by the American Academy of Pediatrics.

  1. Visual sign phonology: insights into human reading and language from a natural soundless phonology.

    PubMed

    Petitto, L A; Langdon, C; Stone, A; Andriola, D; Kartheiser, G; Cochran, C

    2016-11-01

    Among the most prevailing assumptions in science and society about the human reading process is that sound and sound-based phonology are critical to young readers. The child's sound-to-letter decoding is viewed as universal and vital to deriving meaning from print. We offer a different view. The crucial link for early reading success is not between segmental sounds and print. Instead, the human brain's capacity to segment, categorize, and discern linguistic patterning makes it possible to segment all languages. This biological process includes the segmentation of languages on the hands in signed languages. Exposure to natural sign language in early life equally affords the child's discovery of silent segmental units in visual sign phonology (VSP) that can also facilitate segmental decoding of print. We consider powerful biological evidence about the brain, how it builds sound and sign phonology, and why sound and sign phonology are equally important in language learning and reading. We offer a testable theoretical account, reading model, and predictions about how VSP can facilitate segmentation and mapping between print and meaning. We explain how VSP can be a powerful facilitator of all children's reading success (deaf and hearing)-an account with profound transformative impact on learning to read in deaf children with different language backgrounds. The existence of VSP has important implications for understanding core properties of all human language and reading, challenges assumptions about language and reading as being tied to sound, and provides novel insight into a remarkable biological equivalence in signed and spoken languages. WIREs Cogn Sci 2016, 7:366-381. doi: 10.1002/wcs.1404 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  2. The Relationship among Beginning and Advanced American Sign Language Students and Credentialed Interpreters across Two Domains of Visual Imagery: Vividness and Manipulation

    ERIC Educational Resources Information Center

    Stauffer, Linda K.

    2010-01-01

Given the visual-gestural nature of ASL, it is reasonable to assume that visualization abilities may be one predictor of aptitude for learning ASL. This study tested a hypothesis that visualization abilities are a foundational aptitude for learning a signed language and that measurements of these skills will increase as students progress from…

  3. Widening the lens: what the manual modality reveals about language, learning and cognition.

    PubMed

    Goldin-Meadow, Susan

    2014-09-19

The goal of this paper is to widen the lens on language to include the manual modality. We look first at hearing children who are acquiring language from a spoken language model and find that even before they use speech to communicate, they use gesture. Moreover, those gestures precede, and predict, the acquisition of structures in speech. We look next at deaf children whose hearing losses prevent them from using the oral modality, and whose hearing parents have not presented them with a language model in the manual modality. These children fall back on the manual modality to communicate and use gestures, which take on many of the forms and functions of natural language. These homemade gesture systems constitute the first step in the emergence of manual sign systems that are shared within deaf communities and are full-fledged languages. We end by widening the lens on sign language to include gesture and find that signers not only gesture, but they also use gesture in learning contexts just as speakers do. These findings suggest that what is key in gesture's ability to predict learning is its ability to add a second representational format to communication, rather than a second modality. Gesture can thus be language, assuming linguistic forms and functions, when other vehicles are not available; but when speech or sign is possible, gesture works along with language, providing an additional representational format that can promote learning.

  4. Sign language aphasia from a neurodegenerative disease.

    PubMed

    Falchook, Adam D; Mayberry, Rachel I; Poizner, Howard; Burtis, David Brandon; Doty, Leilani; Heilman, Kenneth M

    2013-01-01

While Alois Alzheimer recognized the effects of the disease he described on speech and language in his original 1907 description, the effects of Alzheimer's disease (AD) on language in deaf signers have not previously been reported. We evaluated a 55-year-old right-handed congenitally deaf woman with a 2-year history of progressive memory loss and a deterioration of her ability to communicate in American Sign Language, which she learned at the age of eight. Examination revealed that she had impaired episodic memory as well as marked impairments in the production and comprehension of fingerspelling and grammatically complex sentences. She also had signs of anomia as well as an ideomotor apraxia and visual-spatial dysfunction. This report illustrates the challenges of evaluating a patient for degenerative dementia when the person is deaf from birth, uses sign language, and has a late age of primary language acquisition. Although our patient could neither speak nor hear, in many respects her cognitive disorders mirror those of patients with AD who learned to speak normally.

  5. ERP correlates of German Sign Language processing in deaf native signers.

    PubMed

    Hänel-Faulhaber, Barbara; Skotara, Nils; Kügow, Monique; Salden, Uta; Bottari, Davide; Röder, Brigitte

    2014-05-10

The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language.

  6. ERP correlates of German Sign Language processing in deaf native signers

    PubMed Central

    2014-01-01

Background The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Results Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. Conclusions ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language. PMID:24884527

  7. Language as a multimodal phenomenon: implications for language learning, processing and evolution.

    PubMed

    Vigliocco, Gabriella; Perniss, Pamela; Vinson, David

    2014-09-19

Our understanding of the cognitive and neural underpinnings of language has traditionally been firmly based on spoken Indo-European languages and on language studied as speech or text. However, in face-to-face communication, language is multimodal: speech signals are invariably accompanied by visual information on the face and in manual gestures, and sign languages deploy multiple channels (hands, face and body) in utterance construction. Moreover, the narrow focus on spoken Indo-European languages has entrenched the assumption that language is composed wholly of an arbitrary system of symbols and rules. However, iconicity (i.e. resemblance between aspects of communicative form and meaning) is also present: speakers use iconic gestures when they speak; many non-Indo-European spoken languages exhibit a substantial amount of iconicity in word forms; and, finally, iconicity is the norm, rather than the exception, in sign languages. This introduction provides the motivation for taking a multimodal approach to the study of language learning, processing and evolution, and discusses the broad implications of shifting our current dominant approaches and assumptions to encompass multimodal expression in both signed and spoken languages.

  8. Mastering the Pressures of Variation: A Cognitive Linguistic Examination of Advanced Hearing ASL L2 Signers

    ERIC Educational Resources Information Center

    Nadolske, Marie Anne

    2009-01-01

    Despite the fact that American Sign Language (ASL) courses at the college-level have been increasing in frequency, little is understood about the capabilities of hearing individuals learning a sign language as a second language. This study aims to begin assessing the language skills of advanced L2 learners of ASL by comparing L2 signer productions…

  9. Effects of Real-Time Captioning and Sign Language Interpreting on the Learning of College Students Who Are Deaf or Hard of Hearing

    ERIC Educational Resources Information Center

    Smith-Pethybridge, Valorie

    2009-01-01

    College personnel are required to provide accommodations for students who are deaf and hard of hearing (D/HoH), but few empirical studies have been conducted on D/HoH students as they learn under the various accommodation conditions (sign language interpreting, SLI, real-time captioning, RTC, and both). Guided by the experiences of students who…

  10. School Based Factors Affecting Learning of Kenyan Sign Language in Primary Schools for Hearing Impaired in Embu and Isiolo Counties, Kenya

    ERIC Educational Resources Information Center

    Rwaimba, Samuel Muthomi

    2016-01-01

    This was a descriptive survey study design which sought to establish the school based factors that affect the learning of Kenyan Sign Language in primary schools for learners with hearing impairment in Embu and Isiolo counties in Kenya. The target population was all teachers teaching in primary schools for learners with hearing impairment in the…

  11. Bridge of Signs: Can Sign Language Empower Non-Deaf Children to Triumph over Their Communication Disabilities?

    ERIC Educational Resources Information Center

    Toth, Anne

    2009-01-01

    This pilot research project examined the use of sign language as a communication bridge for non-Deaf children between the ages of 0-6 years who had been diagnosed with, or whose communication difficulties suggested, the presence of such disorders as Autism, Down Syndrome, Fetal Alcohol Spectrum Disorder (FASD), and/or learning disabilities.…

  12. The link between form and meaning in American Sign Language: lexical processing effects.

    PubMed

    Thompson, Robin L; Vinson, David P; Vigliocco, Gabriella

    2009-03-01

Signed languages exploit iconicity (the transparent relationship between meaning and form) to a greater extent than spoken languages, where it is largely limited to onomatopoeia. In a picture-sign matching experiment measuring reaction times, the authors examined the potential advantage of iconicity for both 1st- and 2nd-language learners of American Sign Language (ASL). The results show that native ASL signers are faster to respond when a specific property iconically represented in a sign is made salient in the corresponding picture, thus providing evidence that a closer mapping between meaning and form can aid in lexical retrieval. While late 2nd-language learners appear to use iconicity as an aid to learning sign (R. Campbell, P. Martin, & T. White, 1992), they did not show the same facilitation effect as native ASL signers, suggesting that the task tapped into more automatic language processes. Overall, the findings suggest that completely arbitrary mappings between meaning and form may not be more advantageous in language and that, rather, arbitrariness may simply be an accident of modality.

  13. Using Educational Games for Sign Language Learning--A SignWriting Learning Game: Case Study

    ERIC Educational Resources Information Center

    Bouzid, Yosra; Khenissi, Mohamed Ali; Essalmi, Fathi; Jemni, Mohamed

    2016-01-01

    Apart from being used as a means of entertainment, computer games have been adopted for a long time as a valuable tool for learning. Computer games can offer many learning benefits to students since they can consume their attention and increase their motivation and engagement which can then lead to stimulate learning. However, most of the research…

  14. Visual cortex entrains to sign language.

    PubMed

    Brookshire, Geoffrey; Lu, Jenny; Nusbaum, Howard C; Goldin-Meadow, Susan; Casasanto, Daniel

    2017-06-13

Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language below 5 Hz, peaking at ~1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
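The abstract says the authors developed a metric for visual change over time but does not define it. One common way to operationalize such a metric is simple frame differencing, sketched below in NumPy on a synthetic clip; this is an illustrative assumption, not the paper's actual definition, and all names are hypothetical.

```python
import numpy as np

def visual_change(frames):
    """Instantaneous visual change: mean absolute intensity difference
    between consecutive grayscale frames (one value per transition)."""
    frames = frames.astype(float)
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

# Toy clip: 100 frames of 48x64 pixels with random intensities
rng = np.random.default_rng(0)
clip = rng.integers(0, 256, size=(100, 48, 64))
signal = visual_change(clip)  # a 1-D time series, length 99
```

A time series like `signal` could then be band-pass filtered and compared against EEG channels (e.g. via spectral coherence) to test for entrainment, as the study describes.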

  15. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who became deaf before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is the language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Because they were not originally designed with computer animation in mind, recognizing and reproducing signs in these systems is generally easy only for those who know the system deeply. The aim of this work is to develop a transcription system for providing signed content in virtual environments. To animate a virtual avatar, a transcription system must encode sufficiently explicit information, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation approximates real signing. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store, and play signed content in virtual environments offers a multidisciplinary study and research tool, one that may help linguistic studies of sign language structure and grammar.
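The information such a transcription system must encode (movement speed, sign concatenation, hold-and-movement sequences, facial expressions) can be pictured as a small data structure that an animation engine would consume. The sketch below is a hypothetical illustration under those assumptions; the field names and example sign are invented here, not the authors' notation.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Segment:
    """One hold or movement unit of a sign."""
    handshape: str
    location: str
    movement: Optional[str] = None        # None marks a hold
    speed: float = 1.0                    # relative playback speed
    facial_expression: Optional[str] = None

@dataclass
class Sign:
    gloss: str
    segments: List[Segment] = field(default_factory=list)

def concatenate(signs: List[Sign]) -> List[Segment]:
    """Flatten a sequence of signs into a single animation timeline."""
    return [seg for sign in signs for seg in sign.segments]

# Hypothetical one-segment entry, repeated to form a short utterance
hello = Sign("HELLO", [Segment("B", "forehead", "arc-outward", 1.2, "neutral")])
timeline = concatenate([hello, hello])
```

An avatar renderer would then interpolate joint positions segment by segment, using `speed` to scale durations and `facial_expression` to drive a face rig.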

  16. The Semiotics of Learning New Words

    ERIC Educational Resources Information Center

    Nöth, Winfried

    2014-01-01

    In several of his papers, Charles S. Peirce illustrates processes of interpreting and understanding signs by examples from second language vocabulary teaching and learning. The insights conveyed by means of these little pedagogical scenarios are not meant as contributions to the psychology of second language learning, but they aim at elucidating…

  17. Teachers' Attitudes to Signing for Children with Severe Learning Disabilities in Indonesia

    ERIC Educational Resources Information Center

    Sheehy, Kieron; Budiyanto

    2014-01-01

    The Indonesian education system is striving for an inclusive approach and techniques are needed which can support children with severe learning disabilities and their peers in this context. Manually signed language has proved useful both in supporting the development and empowerment of children with severe learning disabilities and supporting…

  18. Exploring the Efficacy of Online American Sign Language Instruction

    ERIC Educational Resources Information Center

    Radford, Curt L.

    2012-01-01

    Advances in technology have significantly influenced educational delivery options, particularly in the area of American Sign Language (ASL) instruction. As a result, ASL online courses are currently being explored in higher education. The review of literature remains relatively unexplored regarding the effectiveness of learning ASL online. In…

  19. Real-time lexical comprehension in young children learning American Sign Language.

    PubMed

    MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A; Fernald, Anne

    2018-04-16

    When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality. © 2018 John Wiley & Sons Ltd.

  20. Teaching a Foreign Language to Deaf People via Vodcasting & Web 2.0 Tools

    NASA Astrophysics Data System (ADS)

    Drigas, Athanasios; Vrettaros, John; Tagoulis, Alexandors; Kouremenos, Dimitris

This paper presents the design and development of an e-learning course that teaches a foreign language to deaf people whose first language is sign language. The course is based on e-material, vodcasting, and web 2.0 tools such as social networking and blogs. It has been designed especially for deaf people and explores the possibilities that e-learning material, vodcasting, and web 2.0 tools offer to enhance the learning process and achieve more effective learning results.

  1. Neural Language Processing in Adolescent First-Language Learners

    PubMed Central

    Ferjan Ramirez, Naja; Leonard, Matthew K.; Torres, Christina; Hatrak, Marla; Halgren, Eric; Mayberry, Rachel I.

    2014-01-01

    The relation between the timing of language input and development of neural organization for language processing in adulthood has been difficult to tease apart because language is ubiquitous in the environment of nearly all infants. However, within the congenitally deaf population are individuals who do not experience language until after early childhood. Here, we investigated the neural underpinnings of American Sign Language (ASL) in 2 adolescents who had no sustained language input until they were approximately 14 years old. Using anatomically constrained magnetoencephalography, we found that recently learned signed words mainly activated right superior parietal, anterior occipital, and dorsolateral prefrontal areas in these 2 individuals. This spatiotemporal activity pattern was significantly different from the left fronto-temporal pattern observed in young deaf adults who acquired ASL from birth, and from that of hearing young adults learning ASL as a second language for a similar length of time as the cases. These results provide direct evidence that the timing of language experience over human development affects the organization of neural language processing. PMID:23696277

  2. Why American Sign Language Gloss Must Matter.

    PubMed

    Supalla, Samuel J; Cripps, Jody H; Byrne, Andrew P

    2017-01-01

    Responding to an article by Grushkin on how deaf children best learn to read, published, along with the present article, in an American Annals of the Deaf special issue, the authors review American Sign Language gloss. Topics include how ASL gloss enables deaf children to learn to read in their own language and simultaneously experience a transition to written English, and what gloss looks like and how it underlines deaf children's learning and mastery of English literacy through ASL. Rebuttal of Grushkin's argument includes data describing a deaf child's engagement in reading aloud (entirely in ASL) with a gloss text, which occurred without the breakdown implied by Grushkin. The authors characterize Grushkin's argument that deaf children need to learn to read through a conventional ASL writing system as limiting, asserting that ASL gloss contributes more by providing a path for learning and mastering English literacy.

  3. Gesture, sign, and language: The coming of age of sign language and gesture studies.

    PubMed

    Goldin-Meadow, Susan; Brentari, Diane

    2017-01-01

    How does sign language compare with gesture, on the one hand, and spoken language on the other? Sign was once viewed as nothing more than a system of pictorial gestures without linguistic structure. More recently, researchers have argued that sign is no different from spoken language, with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the past 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We conclude that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because at present it is difficult to tell where sign stops and gesture begins, we suggest that sign should not be compared with speech alone but should be compared with speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that distinguishing between sign (or speech) and gesture is essential to predict certain types of learning and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture.

  4. Governmental Partnerships for Language Learning: A Commercial Language Platform for Young Workers in Colombia

    ERIC Educational Resources Information Center

    García Botero, Gustavo; García Botero, Jacqueline; Questier, Frederik

    2017-01-01

    In June 2015, the Colombian government via the Labor Ministry announced a project for young workers called "40.000 Primeros Empleos". In the framework of this project, the Ministry of Labor signed an alliance with the language platform Duolingo as a strategy to provide participants with English learning opportunities and a free language…

  5. Brain Activations Associated with Sign Production Using Word and Picture Inputs in Deaf Signers

    ERIC Educational Resources Information Center

    Hu, Zhiguo; Wang, Wenjing; Liu, Hongyan; Peng, Danling; Yang, Yanhui; Li, Kuncheng; Zhang, John X.; Ding, Guosheng

    2011-01-01

    Effective literacy education in deaf students calls for psycholinguistic research revealing the cognitive and neural mechanisms underlying their written language processing. When learning a written language, deaf students are often instructed to sign out printed text. The present fMRI study was intended to reveal the neural substrates associated…

  6. A Prototype Greek Text to Greek Sign Language Conversion System

    ERIC Educational Resources Information Center

    Kouremenos, Dimitris; Fotinea, Stavroula-Evita; Efthimiou, Eleni; Ntalianis, Klimis

    2010-01-01

    In this article, a prototype Greek text to Greek Sign Language (GSL) conversion system is presented. The system is integrated into an educational platform that addresses the needs of teaching GSL grammar and was developed within the SYNENNOESE project (Efthimiou "et al." 2004a. Developing an e-learning platform for the Greek sign…

  7. Continuous Chinese sign language recognition with CNN-LSTM

    NASA Astrophysics Data System (ADS)

    Yang, Su; Zhu, Qing

    2017-07-01

The goal of sign language recognition (SLR) is to translate sign language into text and thereby provide a convenient communication tool between deaf and hearing people. In this paper, we formulate a model based on a convolutional neural network (CNN) combined with a Long Short-Term Memory (LSTM) network in order to accomplish continuous recognition. Leveraging the representational power of CNNs, the information in frames captured from Chinese sign language (CSL) videos is learned and transformed into feature vectors. Since a video can be regarded as an ordered sequence of frames, an LSTM is connected to the fully-connected layer of the CNN. As a recurrent neural network (RNN), the LSTM is suited to sequence learning tasks, recognizing patterns defined by temporal distance, and it stores and accesses information better than a traditional RNN. We evaluate this method on our self-built dataset of 40 daily vocabulary items. The experimental results show that the CNN-LSTM method achieves a high recognition rate with small training sets, which will meet the needs of a real-time SLR system.
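The pipeline the abstract describes (per-frame CNN features fed into an LSTM, then a classifier) can be sketched in miniature. Below, a single linear projection stands in for the CNN, the LSTM cell is written out explicitly, and all sizes and weights are arbitrary toy values; this is a structural illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def cnn_features(frame, W):
    """Stand-in for the CNN: flatten the frame and project it to a
    feature vector (a real system would use stacked conv layers)."""
    return np.tanh(frame.ravel() @ W)

def lstm_step(x, h, c, P):
    """One LSTM time step: gated update of cell state c and hidden state h."""
    z = P["W"] @ np.concatenate([x, h]) + P["b"]
    i, f, o, g = np.split(z, 4)               # input, forget, output, candidate
    sig = lambda v: 1.0 / (1.0 + np.exp(-v))
    c = sig(f) * c + sig(i) * np.tanh(g)
    h = sig(o) * np.tanh(c)
    return h, c

# Toy sizes: 16-dim features, 32-dim hidden state, 5 sign classes
D, H, K = 16, 32, 5
W_cnn = rng.normal(size=(64, D))              # for 8x8 input frames
P = {"W": rng.normal(size=(4 * H, D + H)) * 0.1, "b": np.zeros(4 * H)}
W_out = rng.normal(size=(H, K))

clip = rng.normal(size=(10, 8, 8))            # a 10-frame "video"
h, c = np.zeros(H), np.zeros(H)
for frame in clip:                            # run the sequence through the LSTM
    h, c = lstm_step(cnn_features(frame, W_cnn), h, c, P)
logits = h @ W_out                            # classify from the final hidden state
pred = int(np.argmax(logits))
```

For continuous recognition over longer videos, the per-step hidden states (rather than only the final one) would typically feed a sequence-level decoder.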

  8. Signed language and human action processing: evidence for functional constraints on the human mirror-neuron system.

    PubMed

    Corina, David P; Knapp, Heather Patterson

    2008-12-01

    In the quest to further understand the neural underpinning of human communication, researchers have turned to studies of naturally occurring signed languages used in Deaf communities. The comparison of the commonalities and differences between spoken and signed languages provides an opportunity to determine core neural systems responsible for linguistic communication independent of the modality in which a language is expressed. The present article examines such studies, and in addition asks what we can learn about human languages by contrasting formal visual-gestural linguistic systems (signed languages) with more general human action perception. To understand visual language perception, it is important to distinguish the demands of general human motion processing from the highly task-dependent demands associated with extracting linguistic meaning from arbitrary, conventionalized gestures. This endeavor is particularly important because theorists have suggested close homologies between perception and production of actions and functions of human language and social communication. We review recent behavioral, functional imaging, and neuropsychological studies that explore dissociations between the processing of human actions and signed languages. These data suggest incomplete overlap between the mirror-neuron systems proposed to mediate human action and language.

  9. "I use it when I see it": The role of development and experience in Deaf and hearing children's understanding of iconic gesture.

    PubMed

    Magid, Rachel W; Pyers, Jennie E

    2017-05-01

Iconicity is prevalent in gesture and in sign languages, yet the degree to which children recognize and leverage iconicity for early language learning is unclear. In Experiment 1 of the current study, we presented sign-naïve 3-, 4- and 5-year-olds (n=87) with iconic shape gestures and no additional scaffolding to ask whether children can spontaneously map iconic gestures to their referents. Four- and five-year-olds, but not three-year-olds, recognized the referents of iconic shape gestures above chance. Experiment 2 asked whether preschoolers (n=93) show an advantage in fast-mapping iconic gestures compared to arbitrary ones. We found that iconicity played a significant role in supporting 4- and 5-year-olds' ability to learn new gestures presented in an explicit pedagogical context, and a lesser role in 3-year-olds' learning. Using similar tasks in Experiment 3, we found that Deaf preschoolers (n=41) exposed to American Sign Language showed a similar pattern of recognition and learning but starting at an earlier age, suggesting that learning a language with rich iconicity may lead to earlier use of iconicity. These results suggest that sensitivity to iconicity is shaped by experience, and while not fundamental to the earliest stages of language development, is a useful tool once children unlock these form-meaning relationships.

  10. Codeswitching techniques: evidence-based instructional practices for the ASL/English bilingual classroom.

    PubMed

    Andrews, Jean F; Rusher, Melissa

    2010-01-01

    The authors present a perspective on emerging bilingual deaf students who are exposed to, learning, and developing two languages--American Sign Language (ASL) and English (spoken English, manually coded English, and English reading and writing). The authors suggest that though deaf children may lack proficiency or fluency in either language during early language-learning development, they still engage in codeswitching activities, in which they go back and forth between signing and English to communicate. The authors then provide a second meaning of codeswitching--as a purpose-driven instructional technique in which the teacher strategically changes from ASL to English print for purposes of vocabulary and reading comprehension. The results of four studies are examined that suggest that certain codeswitching strategies support English vocabulary learning and reading comprehension. These instructional strategies are couched in a five-pronged approach to furthering the development of bilingual education for deaf students.

  11. Language Promotes False-Belief Understanding

    PubMed Central

    Pyers, Jennie E.; Senghas, Ann

    2010-01-01

    Developmental studies have identified a strong correlation in the timing of language development and false-belief understanding. However, the nature of this relationship remains unresolved. Does language promote false-belief understanding, or does it merely facilitate development that could occur independently, albeit on a delayed timescale? We examined language development and false-belief understanding in deaf learners of an emerging sign language in Nicaragua. The use of mental-state vocabulary and performance on a low-verbal false-belief task were assessed, over 2 years, in adult and adolescent users of Nicaraguan Sign Language. Results show that those adults who acquired a nascent form of the language during childhood produce few mental-state signs and fail to exhibit false-belief understanding. Furthermore, those whose language developed over the period of the study correspondingly developed in false-belief understanding. Thus, language learning, over and above social experience, drives the development of a mature theory of mind. PMID:19515119

  12. Gesture, sign and language: The coming of age of sign language and gesture studies

    PubMed Central

    Goldin-Meadow, Susan; Brentari, Diane

    2016-01-01

    How does sign language compare to gesture, on the one hand, and to spoken language on the other? At one time, sign was viewed as nothing more than a system of pictorial gestures with no linguistic structure. More recently, researchers have argued that sign is no different from spoken language with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the last 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We come to the conclusion that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because, at the moment, it is difficult to tell where sign stops and where gesture begins, we suggest that sign should not be compared to speech alone, but should be compared to speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that making a distinction between sign (or speech) and gesture is essential to predict certain types of learning, and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture. PMID:26434499

  13. The Development of Sensitivity to Grammatical Violations in American Sign Language: Native versus Nonnative Signers

    ERIC Educational Resources Information Center

    Novogrodsky, Rama; Henner, Jon; Caldwell-Harris, Catherine; Hoffmeister, Robert

    2017-01-01

    Factors influencing native and nonnative signers' syntactic judgment ability in American Sign Language (ASL) were explored for 421 deaf students aged 7;6-18;5. Predictors for syntactic knowledge were chronological age, age of entering a school for the deaf, gender, and additional learning disabilities. Mixed-effects linear modeling analysis…

  14. Morphological Innovation in the Acquisition of American Sign Language.

    ERIC Educational Resources Information Center

    van Hoek, Karen; And Others

    A study examined aspects of the acquisition of spatialized morphology and syntax in American Sign Language (ASL) learned natively by deaf children of deaf parents. Children aged 2 to 8 were shown story books to elicit narratives, and the verbs in the resulting narratives contained morphological forms not appearing in adult grammar. Analysis of the creative…

  15. Conventions for sign and speech transcription of child bimodal bilingual corpora in ELAN.

    PubMed

    Chen Pichler, Deborah; Hochgesang, Julie A; Lillo-Martin, Diane; de Quadros, Ronice Müller

    2010-01-01

    This article extends current methodologies for the linguistic analysis of sign language acquisition to cases of bimodal bilingual acquisition. Using ELAN, we are transcribing longitudinal spontaneous production data from hearing children of Deaf parents who are learning either American Sign Language (ASL) and American English (AE), or Brazilian Sign Language (Libras, also referred to as Língua de Sinais Brasileira/LSB in some texts) and Brazilian Portuguese (BP). Our goal is to construct corpora that can be mined for a wide range of investigations on various topics in acquisition. Thus, it is important that we maintain consistency in transcription for both signed and spoken languages. This article documents our transcription conventions, including the principles behind our approach. Using this document, other researchers can choose to follow similar conventions or develop new ones using our suggestions as a starting point.
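Tier-based ELAN transcripts like those described here are stored as XML (.eaf) files and are typically mined programmatically. A minimal sketch of pulling per-tier annotation values from an EAF document, using an invented two-tier example (the tier names "ASL" and "English" and the glosses are illustrative assumptions, not the authors' actual conventions):

```python
# Sketch: extract annotation values per tier from an ELAN .eaf (XML) file.
# The toy document and tier names below are invented for illustration.
import xml.etree.ElementTree as ET

EAF = """<?xml version="1.0" encoding="UTF-8"?>
<ANNOTATION_DOCUMENT>
  <TIME_ORDER>
    <TIME_SLOT TIME_SLOT_ID="ts1" TIME_VALUE="0"/>
    <TIME_SLOT TIME_SLOT_ID="ts2" TIME_VALUE="850"/>
  </TIME_ORDER>
  <TIER TIER_ID="ASL">
    <ANNOTATION>
      <ALIGNABLE_ANNOTATION TIME_SLOT_REF1="ts1" TIME_SLOT_REF2="ts2">
        <ANNOTATION_VALUE>IX(mother) WANT BOOK</ANNOTATION_VALUE>
      </ALIGNABLE_ANNOTATION>
    </ANNOTATION>
  </TIER>
  <TIER TIER_ID="English">
    <ANNOTATION>
      <ALIGNABLE_ANNOTATION TIME_SLOT_REF1="ts1" TIME_SLOT_REF2="ts2">
        <ANNOTATION_VALUE>Mommy wants the book</ANNOTATION_VALUE>
      </ALIGNABLE_ANNOTATION>
    </ANNOTATION>
  </TIER>
</ANNOTATION_DOCUMENT>
"""

def annotations_by_tier(eaf_xml):
    """Map each TIER_ID to the list of annotation value strings on that tier."""
    root = ET.fromstring(eaf_xml)
    return {
        tier.get("TIER_ID"): [val.text for val in tier.iter("ANNOTATION_VALUE")]
        for tier in root.iter("TIER")
    }

tiers = annotations_by_tier(EAF)
```

Keeping the sign and speech tiers structurally parallel, as the conventions require, is what makes this kind of cross-tier mining straightforward.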

  17. V2S: Voice to Sign Language Translation System for Malaysian Deaf People

    NASA Astrophysics Data System (ADS)

    Mean Foong, Oi; Low, Tang Jung; La, Wai Wan

    The process of learning and understanding sign language may be cumbersome to some; therefore, this paper proposes a solution to this problem by providing a voice (English language) to sign language translation system using speech and image processing techniques. Speech processing, which includes speech recognition, is the study of recognizing the words being spoken, regardless of who the speaker is. This project uses template-based recognition as the main approach, in which the V2S system first needs to be trained with speech patterns based on some generic spectral parameter set. These spectral parameter sets are then stored as templates in a database. The system performs the recognition process by matching the parameter set of the input speech against the stored templates, finally displaying the sign language in video format. Empirical results show that the system has an 80.3% recognition rate.
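The template-matching step described here can be sketched as a nearest-template classifier: each word's stored spectral parameter set is a template, and an input is assigned to the word whose template it is closest to. The word names and the toy three-dimensional feature vectors below are invented for illustration; they are not the actual V2S feature pipeline:

```python
# Sketch of template-based recognition in the spirit of the V2S system.
# Templates and feature dimensions are illustrative assumptions.
import math

# "Training": one stored template (generic spectral parameter set) per word.
templates = {
    "hello": [0.9, 0.1, 0.3],
    "thank": [0.2, 0.8, 0.5],
    "you":   [0.4, 0.4, 0.9],
}

def euclidean(a, b):
    """Distance between an input parameter set and a stored template."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(features, template_db):
    """Return the word whose stored template best matches the input features."""
    return min(template_db, key=lambda word: euclidean(features, template_db[word]))

# Recognition: the matched word would then index the sign video to display.
word = recognize([0.85, 0.15, 0.25], templates)
```

A real system would use richer features (e.g. cepstral coefficients over time) and a time-warping distance rather than plain Euclidean distance, but the match-against-stored-templates structure is the same.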

  18. When does Iconicity in Sign Language Matter?

    PubMed Central

    Baus, Cristina; Carreiras, Manuel; Emmorey, Karen

    2012-01-01

    We examined whether iconicity in American Sign Language (ASL) enhances translation performance for new learners and proficient signers. Fifteen hearing nonsigners and 15 proficient ASL-English bilinguals performed a translation recognition task and a production translation task. Nonsigners were taught 28 ASL verbs (14 iconic; 14 non-iconic) prior to performing these tasks. Only new learners benefited from sign iconicity, recognizing iconic translations faster and more accurately and exhibiting faster forward (English-ASL) and backward (ASL-English) translation times for iconic signs. In contrast, proficient ASL-English bilinguals exhibited slower recognition and translation times for iconic signs. We suggest iconicity aids memorization in the early stages of adult sign language learning, but for fluent L2 signers, iconicity interacts with other variables that slow translation (specifically, the iconic signs had more translation equivalents than the non-iconic signs). Iconicity may also have slowed translation performance by forcing conceptual mediation for iconic signs, which is slower than translating via direct lexical links. PMID:23543899

  19. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    PubMed Central

    Williams, Joshua T.; Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately matched, especially when the sign contained a marked handshape. In Experiment 2, learners produced these familiar signs in addition to novel signs, which differed based on sonority and markedness. Results from a key-release reaction time reproduction task showed that learners tended to produce high sonority signs much more quickly than low sonority signs, especially when the sign contained an unmarked handshape. This effect was only present in familiar signs. Sign production accuracy rates revealed that high sonority signs were more accurate than low sonority signs. Similarly, signs with unmarked handshapes were produced more accurately than those with marked handshapes. Together, results from Experiments 1 and 2 suggested that signs that contain high sonority movements are more easily processed, both perceptually and productively, and handshape markedness plays a differential role in perception and production. PMID:26644551

  20. Auditory Technology and Its Impact on Bilingual Deaf Education

    ERIC Educational Resources Information Center

    Mertes, Jennifer

    2015-01-01

    Brain imaging studies suggest that children can simultaneously develop, learn, and use two languages. A visual language, such as American Sign Language (ASL), facilitates development at the earliest possible moments in a child's life. Spoken language development can be delayed due to diagnostic evaluations, device fittings, and auditory skill…

  1. Learning to Look for Language: Development of Joint Attention in Young Deaf Children

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Hatrak, Marla; Mayberry, Rachel I.

    2014-01-01

    Joint attention between hearing children and their caregivers is typically achieved when the adult provides spoken, auditory linguistic input that relates to the child's current visual focus of attention. Deaf children interacting through sign language must learn to continually switch visual attention between people and objects in order to achieve…

  2. Psycholinguistic Descriptions and Their Relevance to Education

    ERIC Educational Resources Information Center

    Vejleskov, Hans

    1976-01-01

    Article deals with the Osgood model of sign learning...and the Chomskian approach. Both...approaches are discussed in terms of their appropriateness with respect to teachers' questions about language development and language stimulation. (Author)

  3. Sounds of Science

    ERIC Educational Resources Information Center

    Lott, Kimberly; Lott, Alan; Ence, Hannah

    2018-01-01

    Inquiry-based active learning in science is helpful to all students but especially to those who have a hearing loss. For many deaf or hard of hearing students, the English language may be their second language, with American Sign Language (ASL) being their primary language. Therefore, many of the accommodations for the deaf are similar to those…

  4. From gesture to sign language: conventionalization of classifier constructions by adult hearing learners of British Sign Language.

    PubMed

    Marshall, Chloë R; Morgan, Gary

    2015-01-01

    There has long been interest in why languages are shaped the way they are, and in the relationship between sign language and gesture. In sign languages, entity classifiers are handshapes that encode how objects move, how they are located relative to one another, and how multiple objects of the same type are distributed in space. Previous studies have shown that hearing adults who are asked to use only manual gestures to describe how objects move in space will use gestures that bear some similarities to classifiers. We investigated how accurately hearing adults, who had been learning British Sign Language (BSL) for 1-3 years, produce and comprehend classifiers in (static) locative and distributive constructions. In a production task, learners of BSL knew that they could use their hands to represent objects, but they had difficulty choosing the same, conventionalized, handshapes as native signers. They were, however, highly accurate at encoding location and orientation information. Learners therefore show the same pattern found in sign-naïve gesturers. In contrast, handshape, orientation, and location were comprehended with equal (high) accuracy, and testing a group of sign-naïve adults showed that they too were able to understand classifiers with higher than chance accuracy. We conclude that adult learners of BSL bring their visuo-spatial knowledge and gestural abilities to the tasks of understanding and producing constructions that contain entity classifiers. We speculate that investigating the time course of adult sign language acquisition might shed light on how gesture became (and, indeed, becomes) conventionalized during the genesis of sign languages.

  5. The question of sign-language and the utility of signs in the instruction of the deaf: two papers by Alexander Graham Bell (1898).

    PubMed

    Bell, Alexander Graham

    2005-01-01

    Alexander Graham Bell is often portrayed as either hero or villain of deaf individuals and the Deaf community. His writings, however, indicate that he was neither, and was not as clearly definite in his beliefs about language as is often supposed. The following two articles, reprinted from The Educator (1898), Vol. V, pp. 3-4 and pp. 38-44, capture Bell's thinking about sign language and its use in the classroom. Contrary to frequent claims, Bell does not demand "oral" training for all deaf children--even if he thinks it is the superior alternative--but does advocate for it for "the semi-deaf" and "the semi-mute." "In regard to the others," he writes, "I am not so sure." Although he clearly voices his support for oral methods and fingerspelling (the Rochester method) over sign language, Bell acknowledges the use and utility of signing in a carefully-crafted discussion that includes both linguistics and educational philosophy. In separating the language used at home from that in school and on the playground, Bell reveals a far more complex view of language learning by deaf children than he is often granted. (M. Marschark).

  6. Teaching Science to a Profoundly Deaf Child in a Mainstream Classroom

    ERIC Educational Resources Information Center

    Spicer, Sally

    2016-01-01

    From her experience of teaching a profoundly deaf child learning science with British Sign Language (BSL) as the child's first language, Sally Spicer learned methods that could be good practice for all learners. In this article, Sally Spicer shares how providing an opportunity for first-hand experience to develop knowledge and understanding of…

  8. Absence of Sublexical Representations in Late-Learning Signers? A Statistical Critique of Lieberman et al. (2015)

    ERIC Educational Resources Information Center

    Salverda, Anne Pier

    2016-01-01

    Lieberman, Borovsky, Hatrak, and Mayberry (2015) used a modified version of the visual-world paradigm to examine the real-time processing of signs in American Sign Language. They examined the activation of phonological and semantic competitors in native signers and late-learning signers and concluded that their results provide evidence that the…

  9. Evaluating the Ability of and Enabling a Blind Adult with Learning Disability to Sign a Tenancy Agreement

    ERIC Educational Resources Information Center

    Waight, Mary Philomena; Oldreive, Warren James

    2012-01-01

    This paper aims to describe the process undertaken by Speech and Language Therapy and Occupational Therapy to assess a gentleman with learning disabilities and visual impairment with regard to his capacity to sign a tenancy agreement. It describes the method used to assess the gentleman's mental capacity before exploring the system used to provide…

  10. Benefits of Sign Language Interpreting and Text Alternatives for Deaf Students' Classroom Learning

    PubMed Central

    Marschark, Marc; Leigh, Greg; Sapere, Patricia; Burnham, Denis; Convertino, Carol; Stinson, Michael; Knoors, Harry; Vervloed, Mathijs P. J.; Noble, William

    2006-01-01

    Four experiments examined the utility of real-time text in supporting deaf students' learning from lectures in postsecondary (Experiments 1 and 2) and secondary classrooms (Experiments 3 and 4). Experiment 1 compared the effects on learning of sign language interpreting, real-time text (C-Print), and both. Real-time text alone led to significantly higher performance by deaf students than the other two conditions, but performance by deaf students in all conditions was significantly below that of hearing peers who saw lectures without any support services. Experiment 2 compared interpreting and two forms of real-time text, C-Print and Communication Access Real-Time Translation, at immediate testing and after a 1-week delay (with study notes). No significant differences among support services were obtained at either testing. Experiment 3 also failed to reveal significant effects at immediate or delayed testing in a comparison of real-time text, direct (signed) instruction, and both. Experiment 4 found no significant differences between interpreting and interpreting plus real-time text on the learning of either new words or the content of television programs. Alternative accounts of the observed pattern of results are considered, but it is concluded that neither sign language interpreting nor real-time text have any inherent, generalized advantage over the other in supporting deaf students in secondary or postsecondary settings. Providing deaf students with both services simultaneously does not appear to provide any generalized benefit, at least for the kinds of materials utilized here. PMID:16928778

  12. Which Fragments of a Sign Enable Its Recognition?

    ERIC Educational Resources Information Center

    ten Holt, G. A.; Van Doorn, A. J.; de Ridder, H.; Reinders, M. J. T.; Hendriks, E. A.

    2009-01-01

    In sign language studies, it is generally assumed that a sign can be divided into several phases in time (preparation, stroke, and retraction) and that the stroke contains all of the necessary information. However, this has not been tested empirically. In order to learn where the information truly resides, we present an experiment that…

  13. Design for Thinking, a First Book in Semantics.

    ERIC Educational Resources Information Center

    Upton, Albert

    This book about the functions of language in human life emphasizes learning how to classify, define, and analyze. Following an explanation of the physiological and psychological roots of language, chapters on analysis, meaning, signs, ambiguity, semantic growth, and metaphor lead to a description of the communicative function of language,…

  14. The Effect of Computer Game-Based Learning on FL Vocabulary Transferability

    ERIC Educational Resources Information Center

    Franciosi, Stephan J.

    2017-01-01

    In theory, computer game-based learning can support several vocabulary learning affordances that have been identified in the foreign language learning research. In the observable evidence, learning with computer games has been shown to improve performance on vocabulary recall tests. However, while simple recall can be a sign of learning,…

  15. The Effects of Captions on Deaf Students' Content Comprehension, Cognitive Load, and Motivation in Online Learning

    ERIC Educational Resources Information Center

    Yoon, Joong-O.; Kim, Minjeong

    2011-01-01

    The authors examined the effects of captions on deaf students' content comprehension, cognitive load, and motivation in online learning. The participants in the study were 62 deaf adult students who had limited reading comprehension skills and used sign language as a first language. Participants were randomly assigned to either the control group…

  16. Sign-Supported English: Is It Effective at Teaching Vocabulary to Young Children with English as an Additional Language?

    ERIC Educational Resources Information Center

    Marshall, Chloë R.; Hobsbaum, Angela

    2015-01-01

    Background: Children who are learning English as an Additional Language (EAL) may start school with smaller vocabularies than their monolingual peers. Given the links between vocabulary and academic achievement, it is important to evaluate interventions that are designed to support vocabulary learning in this group of children. Aims: To evaluate…

  17. Young children make their gestural communication systems more language-like: segmentation and linearization of semantic elements in motion events.

    PubMed

    Clay, Zanna; Pople, Sally; Hood, Bruce; Kita, Sotaro

    2014-08-01

    Research on Nicaraguan Sign Language, created by deaf children, has suggested that young children use gestures to segment the semantic elements of events and linearize them in ways similar to those used in signed and spoken languages. However, it is unclear whether this is due to children's learning processes or to a more general effect of iterative learning. We investigated whether typically developing children, without iterative learning, segment and linearize information. Gestures produced in the absence of speech to express a motion event were examined in 4-year-olds, 12-year-olds, and adults (all native English speakers). We compared the proportions of gestural expressions that segmented semantic elements into linear sequences and that encoded them simultaneously. Compared with adolescents and adults, children reshaped the holistic stimuli by segmenting and recombining their semantic features into linearized sequences. A control task on recognition memory ruled out the possibility that this was due to different event perception or memory. Young children spontaneously bring fundamental properties of language into their communication system.

  18. Sign Language: An Effective Strategy to Reduce the Gap between English Language Learners Native Language and English

    ERIC Educational Resources Information Center

    Nicholson, Sheryl; Graves, Emily

    2010-01-01

    Linguistic diversity provides even greater challenges for our educational system. English Language Learners (ELLs) are a diverse population of students who are learning English in school. They come from numerous cultural and economic backgrounds, and live throughout the country. The task of the classroom teacher is to find a way to reach these…

  20. Learning with Pictures, Signs and Symbols (A Language Arts and Consumer Mathematics Curriculum for the 0-4 Level ABE Student). Final Report.

    ERIC Educational Resources Information Center

    Robinson, Nancy; Selkirk, Betty

    The ARIN Adult Learning Center in Indiana, Pennsylvania conducted a project to develop language arts and consumer mathematics curricula for 0-4 level adult basic education (ABE) students. Using the five knowledge areas of the adult performance levels as established by the University of Texas (consumer economics, health, occupational knowledge,…

  1. The impact of time on predicate forms in the manual modality: Signers, homesigners, and silent gesturers

    PubMed Central

    Goldin-Meadow, Susan

    2014-01-01

    It is difficult to create spoken forms that can be understood on the spot. But the manual modality, in large part because of its iconic potential, allows us to construct forms that are immediately understood, thus requiring essentially no time to develop. This paper contrasts manual forms for actions produced over 3 time spans—by silent gesturers who are asked to invent gestures on the spot; by homesigners who have created gesture systems over their lifespans; and by signers who have learned a conventional sign language from other signers—and finds that properties of the predicate differ across these time spans. Silent gesturers use location to establish co-reference in the way established sign languages do, but show little evidence of the segmentation sign languages display in motion forms for manner and path, and little evidence of the finger complexity sign languages display in handshapes in predicates representing events. Homesigners, in contrast, not only use location to establish co-reference, but also display segmentation in their motion forms for manner and path and finger complexity in their object handshapes, although they have not yet decreased finger complexity to the levels found in sign languages in their handling handshapes. The manual modality thus allows us to watch language as it grows, offering insight into factors that may have shaped and may continue to shape human language. PMID:25329421

  2. First language acquisition differs from second language acquisition in prelingually deaf signers: Evidence from sensitivity to grammaticality judgement in British Sign Language

    PubMed Central

    Cormier, Kearsy; Schembri, Adam; Vinson, David; Orfanidou, Eleni

    2012-01-01

    Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that accuracy of grammaticality judgement decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammatical judgement in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life. PMID:22578601

  3. The Political Uses of Sign Language: The Case of the French Revolution

    ERIC Educational Resources Information Center

    Rosenfeld, Sophia

    2005-01-01

    The story of the Abbe de l'Epee's "methodical signs" is best known as a key moment in Deaf history. However, at the time of the French Revolution this story served a larger political function. The example of de l'Epee's deaf students, and their seemingly miraculous command of ideas learned through gestural signs, helped the French…

  4. Electrophysiological Correlates of Error Monitoring and Feedback Processing in Second Language Learning.

    PubMed

    Bultena, Sybrine; Danielmeier, Claudia; Bekkering, Harold; Lemhöfer, Kristin

    2017-01-01

    Humans monitor their behavior to optimize performance, which presumably relies on stable representations of correct responses. During second language (L2) learning, however, stable representations have yet to be formed while knowledge of the first language (L1) can interfere with learning, which in some cases results in persistent errors. In order to examine how correct L2 representations are stabilized, this study examined performance monitoring in the learning process of second language learners for a feature that conflicts with their first language. Using EEG, we investigated if L2 learners in a feedback-guided word gender assignment task showed signs of error detection in the form of an error-related negativity (ERN) before and after receiving feedback, and how feedback is processed. The results indicated that initially, response-locked negativities for correct (CRN) and incorrect (ERN) responses were of similar size, showing a lack of internal error detection when L2 representations are unstable. As behavioral performance improved following feedback, the ERN became larger than the CRN, pointing to the first signs of successful error detection. Additionally, we observed a second negativity following the ERN/CRN components, the amplitude of which followed a similar pattern as the previous negativities. Feedback-locked data indicated robust FRN and P300 effects in response to negative feedback across different rounds, demonstrating that feedback remained important in order to update memory representations during learning. We thus show that initially, L2 representations may often not be stable enough to warrant successful error monitoring, but can be stabilized through repeated feedback, which means that the brain is able to overcome L1 interference, and can learn to detect errors internally after a short training session. The results contribute a different perspective to the discussion on changes in ERN and FRN components in relation to learning, by extending the investigation of these effects to the language learning domain. Furthermore, these findings provide a further characterization of the online learning process of L2 learners.

  5. Mapping Language to the World: The Role of Iconicity in the Sign Language Input

    ERIC Educational Resources Information Center

    Perniss, Pamela; Lu, Jenny C.; Morgan, Gary; Vigliocco, Gabriella

    2018-01-01

    Most research on the mechanisms underlying referential mapping has assumed that learning occurs in ostensive contexts, where label and referent co-occur, and that form and meaning are linked by arbitrary convention alone. In the present study, we focus on "iconicity" in language, that is, resemblance relationships between form and…

  6. Iconicity in English and Spanish and Its Relation to Lexical Category and Age of Acquisition

    PubMed Central

    Lupyan, Gary

    2015-01-01

    Signed languages exhibit iconicity (resemblance between form and meaning) across their vocabulary, and many non-Indo-European spoken languages feature sizable classes of iconic words known as ideophones. In comparison, Indo-European languages like English and Spanish are believed to be arbitrary outside of a small number of onomatopoeic words. In three experiments with English and two with Spanish, we asked native speakers to rate the iconicity of ~600 words from the English and Spanish MacArthur-Bates Communicative Developmental Inventories. We found that iconicity in the words of both languages varied in a theoretically meaningful way with lexical category. In both languages, adjectives were rated as more iconic than nouns and function words, and corresponding to typological differences between English and Spanish in verb semantics, English verbs were rated as relatively iconic compared to Spanish verbs. We also found that both languages exhibited a negative relationship between iconicity ratings and age of acquisition. Words learned earlier tended to be more iconic, suggesting that iconicity in early vocabulary may aid word learning. Altogether these findings show that iconicity is a graded quality that pervades vocabularies of even the most “arbitrary” spoken languages. The findings provide compelling evidence that iconicity is an important property of all languages, signed and spoken, including Indo-European languages. PMID:26340349

  7. Social construction of American sign language--English interpreters.

    PubMed

    McDermid, Campbell

    2009-01-01

    Instructors in 5 American Sign Language--English Interpreter Programs and 4 Deaf Studies Programs in Canada were interviewed and asked to discuss their experiences as educators. Within a qualitative research paradigm, their comments were grouped into a number of categories tied to the social construction of American Sign Language--English interpreters, such as learners' age and education and the characteristics of good citizens within the Deaf community. According to the participants, younger students were adept at language acquisition, whereas older learners more readily understood the purpose of lessons. Children of deaf adults were seen as more culturally aware. The participants' beliefs echoed the theories of P. Freire (1970) that educators should consider each student's reality and praxis, and are responsible for facilitating student self-awareness. Important characteristics in the social construction of students included independence, an appropriate attitude, an understanding of Deaf culture, ethical behavior, community involvement, and a willingness to pursue lifelong learning.

  8. Learning with a Missing Sense: What Can We Learn from the Interaction of a Deaf Child with a Turtle?

    ERIC Educational Resources Information Center

    Miller, Paul

    2009-01-01

    This case study reports on the progress of Navon, a 13-year-old boy with prelingual deafness, over a 3-month period following exposure to Logo, a computer programming language that visualizes specific programming commands by means of a virtual drawing tool called the Turtle. Despite an almost complete lack of skills in spoken and sign language,…

  9. Learning Language through Total Physical Response.

    ERIC Educational Resources Information Center

    Marlatt, Edward A.

    1995-01-01

    The Total Physical Response (TPR) method of language instruction is introduced, and guidelines for designing and implementing TPR lessons for students with hearing impairments are provided. In TPR instruction, students develop understanding before speech or signing, understanding is demonstrated through actions, and new vocabulary is developed…

  10. Modeling the Emergence of Lexicons in Homesign Systems

    PubMed Central

    Richie, Russell; Yang, Charles; Coppola, Marie

    2014-01-01

    It is largely acknowledged that natural languages emerge not just from human brains, but also from rich communities of interacting human brains (Senghas, 2005). Yet the precise role of such communities and such interaction in the emergence of core properties of language has largely gone uninvestigated in naturally emerging systems, leaving the few existing computational investigations of this issue in artificial settings. Here we take a step towards investigating the precise role of community structure in the emergence of linguistic conventions with both naturalistic empirical data and computational modeling. We first show conventionalization of lexicons in two different classes of naturally emerging signed systems: (1) protolinguistic “homesigns” invented by linguistically isolated Deaf individuals, and (2) a natural sign language emerging in a recently formed rich Deaf community. We find that the latter conventionalized faster than the former. Second, we model conventionalization as a population of interacting individuals who adjust their probability of sign use in response to other individuals' actual sign use, following an independently motivated model of language learning (Yang, 2002, 2004). Simulations suggest that a richer social network, like that of natural (signed) languages, conventionalizes faster than a sparser social network, like that of homesign systems. We discuss our behavioral and computational results in light of other work on language emergence and other work on behavior in complex networks. PMID:24482343
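
    The conventionalization dynamic this abstract describes can be illustrated with a toy simulation. The sketch below is a deliberately simplified assumption, not the authors' implementation: each agent holds a probability of producing one of two sign variants and, on each interaction, nudges that probability toward whatever a randomly chosen partner just produced (a linear reward update loosely in the spirit of Yang's variational learner). Comparing a fully connected community to a sparse ring typically shows the richer network reaching a shared convention in fewer rounds.

```python
import random

def simulate(neighbors, threshold=0.95, max_rounds=5000, seed=0):
    """Rounds until a community settles on one of two sign variants.

    Toy model (not the paper's exact model): each agent holds
    p = P(produce variant A); every round it observes one random
    neighbor's production and moves p toward what it observed.
    """
    rng = random.Random(seed)
    n = len(neighbors)
    p = [0.5] * n   # everyone starts undecided between the two variants
    gamma = 0.2     # learning rate of the linear reward update
    for rounds in range(1, max_rounds + 1):
        for i in range(n):
            j = rng.choice(neighbors[i])
            produced_a = rng.random() < p[j]
            p[i] += gamma * ((1.0 if produced_a else 0.0) - p[i])
        mean = sum(p) / n
        if mean > threshold or mean < 1.0 - threshold:
            return rounds  # conventionalized on variant A or B
    return max_rounds

n = 20
dense = {i: [j for j in range(n) if j != i] for i in range(n)}  # rich community
ring = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}        # sparse contact

seeds = range(20)
dense_avg = sum(simulate(dense, seed=s) for s in seeds) / 20
ring_avg = sum(simulate(ring, seed=s) for s in seeds) / 20
print(f"dense: {dense_avg:.0f} rounds on average; ring: {ring_avg:.0f}")
```

    Averaged over several random seeds, the fully connected network conventionalizes faster than the ring, mirroring the qualitative contrast the abstract draws between a rich Deaf community and isolated homesigners.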

  11. Using American sign language interpreters to facilitate research among deaf adults: lessons learned.

    PubMed

    Sheppard, Kate

    2011-04-01

    Health care providers commonly discuss depressive symptoms with clients, enabling earlier intervention. Such discussions rarely occur between providers and Deaf clients. Most culturally Deaf adults experience early-onset hearing loss, self-identify as part of a unique culture, and communicate in the visual language of American Sign Language (ASL). Communication barriers abound, and depression screening instruments may be unreliable. The purpose of this study was to train and use ASL interpreters for a qualitative study describing depressive symptoms among Deaf adults. Training addressed the differences between research and community interpreting. During data collection, interpreters translated to and from voiced English and ASL. The training eliminated potential problems during data collection. Unexpected issues included participants asking for "my interpreter" and worrying about confidentiality or friendship in a small community. Lessons learned included the value of carefully training interpreters before initiating data collection, including resolving possible role conflicts and ensuring conceptual equivalence in real-time interpreting.

  12. Unsilencing voices: a study of zoo signs and their language of authority

    NASA Astrophysics Data System (ADS)

    Fogelberg, Katherine

    2014-12-01

    Zoo signs are important for informal learning, but their effect on visitor perception of animals has been sparsely studied. Other studies have established the importance of informal learning in American society; this study discusses zoo signs in the context of such learning. Through the lens of Critical Theory framed by informal learning, and by applying critical discourse analysis, I discovered subtle institutional power on zoo signs. This may influence visitors through dominant ideological discursive formations and emergent discourse objects, adding to the paradox of "saving" wild animals while simultaneously oppressing them. Signs covering a variety of species from two different United States-accredited zoos were analyzed. Critical Theory looks to emancipate oppressed human populations; here I apply it to zoo animals. As physical emancipation is not practical, I define emancipation in the sociological sense—in this case, freedom from silence. Through this research, perhaps we can find a way to represent animals as living beings who have their own lives and voices, by presenting them honestly, with care and compassion.

  13. Three Years Old, Going on Four.

    ERIC Educational Resources Information Center

    Luetke-Stahlman, Barbara

    1990-01-01

    The adoptive mother of a hearing-impaired preschool girl describes ways the family has integrated language practice into every facet of the child's life. The paper focuses on practicing speech, learning language, getting ready for reading, using computers, family involvement in signing, socialization and independence, child care, preschool team…

  14. First language acquisition differs from second language acquisition in prelingually deaf signers: evidence from sensitivity to grammaticality judgement in British Sign Language.

    PubMed

    Cormier, Kearsy; Schembri, Adam; Vinson, David; Orfanidou, Eleni

    2012-07-01

    Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that accuracy of grammaticality judgment decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammaticality judgment in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Nicaraguan Sign Language and Theory of Mind: The Issue of Critical Periods and Abilities

    ERIC Educational Resources Information Center

    Morgan, Gary; Kegl, Judy

    2006-01-01

    Background: Previous studies in the literature report that deaf individuals who experience late access to language perform poorly on false belief tests of Theory of Mind (ToM) compared with age-matched deaf and hearing controls exposed to language early. Methods: A group of 22 deaf Nicaraguans (aged 7 to 39 years) who learned Nicaraguan Sign…

  16. The Arbitrariness of the Sign: Learning Advantages from the Structure of the Vocabulary

    ERIC Educational Resources Information Center

    Monaghan, Padraic; Christiansen, Morten H.; Fitneva, Stanka A.

    2011-01-01

    Recent research has demonstrated that systematic mappings between phonological word forms and their meanings can facilitate language learning (e.g., in the form of sound symbolism or cues to grammatical categories). Yet, paradoxically from a learning viewpoint, most words have an arbitrary form-meaning mapping. We hypothesized that this paradox…

  17. Recognition & Response: Response to Intervention for PreK

    ERIC Educational Resources Information Center

    Buysse, Virginia; Peisner-Feinberg, Ellen

    2010-01-01

    Some young children show signs that they may not be learning in an expected manner, even during the prekindergarten (PreK) years. These children may exhibit learning challenges in areas such as developing language, counting objects, hearing differences in letter sounds, paying attention during story time, or learning how to write. Teachers,…

  18. Signing Earth Science: Accommodations for Students Who Are Deaf or Hard of Hearing and Whose First Language Is Sign

    NASA Astrophysics Data System (ADS)

    Vesel, J.; Hurdich, J.

    2014-12-01

    TERC and Vcom3D used the SigningAvatar® accessibility software to research and develop a Signing Earth Science Dictionary (SESD) of approximately 750 standards-based Earth science terms for high school students who are deaf and hard of hearing and whose first language is sign. The partners also evaluated the extent to which use of the SESD furthers understanding of Earth science content, command of the language of Earth science, and the ability to study Earth science independently. Disseminated as a Web-based version and App, the SESD is intended to serve the ~36,000 grade 9-12 students who are deaf or hard of hearing and whose first language is sign, the majority of whom leave high school reading at the fifth-grade level or below. It is also intended for teachers and interpreters who interact with members of this population and professionals working with Earth science education programs during field trips, internships, etc. The signed SESD terms have been incorporated into a Mobile Communication App (MCA). This App for Androids is intended to facilitate communication between English speakers and persons who communicate in American Sign Language (ASL) or Signed English. It can translate words, phrases, or whole sentences from written or spoken English to animated signing. It can also fingerspell proper names and other words for which there are no signs. For our presentation, we will demonstrate the interactive features of the SigningAvatar® accessibility software that support the three principles of Universal Design for Learning (UDL) and have been incorporated into the SESD and MCA. Results from national field tests will provide insight into the SESD's and MCA's potential applicability beyond grade 12 as accommodations for accessing the vocabulary deaf and hard of hearing students need for study of the geosciences and for facilitating communication about content. This work was funded in part by grants from NSF and the U.S. Department of Education.

  19. Sign(al)s: Living and Learning as Semiotic Engagement

    ERIC Educational Resources Information Center

    Stables, Andrew

    2006-01-01

    Cartesian mind-body dualism, while often explicitly denied, has left a legacy of conceptions that remain highly influential in education. I argue that trends in both analytic and continental philosophy of language point towards a post-Cartesian settlement in which the distinction between "signs" and "signals" is collapsed, and which thus construes…

  20. Engaging the Deaf American sign language community: lessons from a community-based participatory research center.

    PubMed

    McKee, Michael; Thew, Denise; Starr, Matthew; Kushalnagar, Poorna; Reid, John T; Graybill, Patrick; Velasquez, Julia; Pearson, Thomas

    2012-01-01

    Numerous publications demonstrate the importance of community-based participatory research (CBPR) in community health research, but few target the Deaf community. The Deaf community is understudied and underrepresented in health research despite suspected health disparities and communication barriers. The goal of this paper is to share the lessons learned from the implementation of CBPR in an understudied community of Deaf American Sign Language (ASL) users in the greater Rochester, New York, area. We review the process of CBPR in a Deaf ASL community and identify the lessons learned. Key CBPR lessons include the importance of engaging and educating the community about research, ensuring that research benefits the community, using peer-based recruitment strategies, and sustaining community partnerships. These lessons informed subsequent research activities. This report focuses on the use of CBPR principles in a Deaf ASL population; lessons learned can be applied to research with other challenging-to-reach populations.

  1. Origin of symbol-using systems: speech, but not sign, without the semantic urge

    PubMed Central

    Sereno, Martin I.

    2014-01-01

    Natural language—spoken and signed—is a multichannel phenomenon, involving facial and body expression, and voice and visual intonation that is often used in the service of a social urge to communicate meaning. Given that iconicity seems easier and less abstract than making arbitrary connections between sound and meaning, iconicity and gesture have often been invoked in the origin of language alongside the urge to convey meaning. To get a fresh perspective, we critically distinguish the origin of a system capable of evolution from the subsequent evolution that system becomes capable of. Human language arose on a substrate of a system already capable of Darwinian evolution; the genetically supported uniquely human ability to learn a language reflects a key contact point between Darwinian evolution and language. Though implemented in brains generated by DNA symbols coding for protein meaning, the second higher-level symbol-using system of language now operates in a world mostly decoupled from Darwinian evolutionary constraints. Examination of Darwinian evolution of vocal learning in other animals suggests that the initial fixation of a key prerequisite to language into the human genome may actually have required initially side-stepping not only iconicity, but the urge to mean itself. If sign languages came later, they would not have faced this constraint. PMID:25092671

  2. Theory of Mind and Reading Comprehension in Deaf and Hard-of-Hearing Signing Children

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Theory of Mind (ToM) is related to reading comprehension in hearing children. In the present study, we investigated progression in ToM in Swedish deaf and hard-of-hearing (DHH) signing children who were learning to read, as well as the association of ToM with reading comprehension. Thirteen children at Swedish state primary schools for DHH children performed a Swedish Sign Language (SSL) version of the Wellman and Liu (2004) ToM scale, along with tests of reading comprehension, SSL comprehension, and working memory. Results indicated that ToM progression did not differ from that reported in previous studies, although ToM development was delayed despite age-appropriate sign language skills. Correlation analysis revealed that ToM was associated with reading comprehension and working memory, but not sign language comprehension. We propose that some factor not investigated in the present study, possibly represented by inference making constrained by working memory capacity, supports both ToM and reading comprehension and may thus explain the results observed in the present study. PMID:27375532

  3. Theory of Mind and Reading Comprehension in Deaf and Hard-of-Hearing Signing Children.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Theory of Mind (ToM) is related to reading comprehension in hearing children. In the present study, we investigated progression in ToM in Swedish deaf and hard-of-hearing (DHH) signing children who were learning to read, as well as the association of ToM with reading comprehension. Thirteen children at Swedish state primary schools for DHH children performed a Swedish Sign Language (SSL) version of the Wellman and Liu (2004) ToM scale, along with tests of reading comprehension, SSL comprehension, and working memory. Results indicated that ToM progression did not differ from that reported in previous studies, although ToM development was delayed despite age-appropriate sign language skills. Correlation analysis revealed that ToM was associated with reading comprehension and working memory, but not sign language comprehension. We propose that some factor not investigated in the present study, possibly represented by inference making constrained by working memory capacity, supports both ToM and reading comprehension and may thus explain the results observed in the present study.

  4. Language Development in Nonverbal Autistic Children Using a Simultaneous Communication System.

    ERIC Educational Resources Information Center

    Creedon, Margaret Procyk

    Twenty-one nonverbal autistic children, 4 to 9 years old, with language ages of 4 to 24 months, participated in the communication learning program for 1 to 3 years. Simultaneous verbal and manual signs were chosen as the communication mode. The children initially displayed infrequent, unrecognizable vocalizations (screeches, or vocal…

  5. Jabberwocky: The Complexities of Mathematical English

    ERIC Educational Resources Information Center

    Carter, Merilyn; Quinnell, Lorna

    2012-01-01

    Students find it hard to interpret mathematical problem texts. Mathematics is a unique language with its own symbols (grapho-phonics), vocabulary (lexicon), grammar (syntax), semantics and literature. As in any other language, to make meaning of the text, the student must learn: (1) signs and symbols (for example: ÷, ×, ≠);…

  6. Technology and Multiple Disabilities: Learning What Works for Cree

    ERIC Educational Resources Information Center

    Valcourt-Pearce, Catherine C.

    2015-01-01

    Thanks to the author's generous sharing, in this article we learn about the progress and difficulties her family experienced, and learned to cope with, in dealing with their son Cree (the second of four boys). The author explains how this hearing-impaired family has used American Sign Language (ASL) as their primary mode of communication at…

  7. Needs Analysis for Graphic Design Learning Module Based on Technology & Learning Styles of Deaf Students

    ERIC Educational Resources Information Center

    Ibrahim, Zainuddin; Alias, Norlidah; Nordin, Abu Bakar

    2016-01-01

    The field of Information Communication Technology has offered a promising future for deaf students. Web design, animation, and multimedia application design are a branch of graphic design area, which aim to aid their learning visually. However, most of the technical terms cannot be interpreted in Malaysian sign language. Moreover, the development…

  8. Automated Finger Spelling by Highly Realistic 3D Animation

    ERIC Educational Resources Information Center

    Adamo-Villani, Nicoletta; Beni, Gerardo

    2004-01-01

    We present the design of a new 3D animation tool for self-teaching (signing and reading) finger spelling, the first basic component in learning any sign language. We have designed a highly realistic hand with natural animation of the finger motions. Smoothness of motion (in real time) is achieved via programmable blending of animation segments. The…

  9. Infant Signs as Intervention? Promoting Symbolic Gestures for Preverbal Children in Low-Income Families Supports Responsive Parent-Child Relationships

    ERIC Educational Resources Information Center

    Vallotton, Claire D.

    2012-01-01

    Gestures are a natural form of communication between preverbal children and parents which support children's social and language development; however, low-income parents gesture less frequently, disadvantaging their children. In addition to pointing and waving, children are capable of learning many symbolic gestures, known as "infant signs," if…

  10. Development and Evaluation of an E-Learning Course for Deaf and Hard of Hearing Based on the Advanced Adapted Pedagogical Index Method

    ERIC Educational Resources Information Center

    Debevc, Matjaž; Stjepanovic, Zoran; Holzinger, Andreas

    2014-01-01

    Web-based and adapted e-learning materials provide alternative methods of learning to those used in a traditional classroom. Within the study described in this article, deaf and hard of hearing people used an adaptive e-learning environment to improve their computer literacy. This environment included streaming video with sign language interpreter…

  11. Enhancing Deaf Students' Learning from Sign Language and Text: Metacognition, Modality, and the Effectiveness of Content Scaffolding

    ERIC Educational Resources Information Center

    Borgna, Georgianna; Convertino, Carol; Marschark, Marc; Morrison, Carolyn; Rizzolo, Kathleen

    2011-01-01

    Four experiments, each building on the results of the previous ones, explored the effects of several manipulations on learning and the accuracy of metacognitive judgments among deaf and hard-of-hearing (DHH) students. Experiment 1 examined learning and metacognitive accuracy from classroom lectures with or without prior "scaffolding" in the form…

  12. Case Studies of Multilingual/Multicultural Asian Deaf Adults: Strategies for Success

    ERIC Educational Resources Information Center

    Wang, Qiuying; Andrews, Jean; Liu, Hsiu Tan; Liu, Chun Jung

    2016-01-01

    Case studies of adult d/Deaf or Hard of Hearing Multilingual Learners (DMLs) are few, especially studies of DMLs who learn more than one sign language and read logographic and alphabetic scripts. To reduce this paucity, two descriptive case studies are presented. Written questionnaires, face-to-face interviews, and self-appraisals of language-use…

  13. Parametric Representation of the Speaker's Lips for Multimodal Sign Language and Speech Recognition

    NASA Astrophysics Data System (ADS)

    Ryumin, D.; Karpov, A. A.

    2017-05-01

    In this article, we propose a new method for the parametric representation of the human lip region. The functional diagram of the method is described, and implementation details are given, with an explanation of its key stages and features. The results of automatic detection of the regions of interest are illustrated. The method's processing speed on several computers of differing performance is reported. This universal method allows the parametric representation of a speaker's lips to be applied to tasks in biometrics, computer vision, machine learning, and automatic recognition of faces, sign language elements, and audio-visual speech, including lip-reading.

  14. The Use of Makaton for Supporting Talk, through Play, for Pupils Who have English as an Additional Language (EAL) in the Foundation Stage

    ERIC Educational Resources Information Center

    Mistry, Malini; Barnes, Danielle

    2013-01-01

    This study examines the use of Makaton®, a language programme based on the use of signing, symbols and speech, as a pedagogic tool to support the development of talk for pupils learning English as an Additional Language (EAL). The research setting was a Reception class with a high percentage of pupils who have EAL in the initial stages of…

  15. Enlarged Vestibular Aqueducts and Childhood Hearing Loss

    MedlinePlus

    ... EVA. However, this is a rare event in commercial aircraft with pressurized cabins. If you have EVA, ... EVA, will benefit from learning other forms of communication, such as sign language or cued speech, or ...

  16. Engaging the Deaf American Sign Language Community: Lessons From a Community-Based Participatory Research Center

    PubMed Central

    McKee, Michael; Thew, Denise; Starr, Matthew; Kushalnagar, Poorna; Reid, John T.; Graybill, Patrick; Velasquez, Julia; Pearson, Thomas

    2013-01-01

    Background: Numerous publications demonstrate the importance of community-based participatory research (CBPR) in community health research, but few target the Deaf community. The Deaf community is understudied and underrepresented in health research despite suspected health disparities and communication barriers. Objectives: The goal of this paper is to share the lessons learned from the implementation of CBPR in an understudied community of Deaf American Sign Language (ASL) users in the greater Rochester, New York, area. Methods: We review the process of CBPR in a Deaf ASL community and identify the lessons learned. Results: Key CBPR lessons include the importance of engaging and educating the community about research, ensuring that research benefits the community, using peer-based recruitment strategies, and sustaining community partnerships. These lessons informed subsequent research activities. Conclusions: This report focuses on the use of CBPR principles in a Deaf ASL population; lessons learned can be applied to research with other challenging-to-reach populations. PMID:22982845

  17. [Neuropsychological profiles associated with the children's oral language disorders].

    PubMed

    Conde-Guzón, P A; Conde-Guzón, M J; Bartolomé-Albistegui, M T; Quirós-Expósito, P

    Oral language disorders constitute a group of syndromes with a high prevalence among the childhood population. They form a heterogeneous group that ranges from simple problems in articulating a phoneme (dyslalias) to severe disorders affecting communication, such as children's dysarthrias and aphasias. In this paper, our objective is to review the neuropsychological profiles of children who manifest different oral language disorders. Because of the wide range of clinical pictures and causes covered by children's oral language disorders, very few systematic reviews have been conducted to obtain an overall view of the neuropsychological profiles of these children. Although the linguistic signs and symptoms of these disorders are well understood, the associated neuropsychological signs and symptoms have not been studied. In some cases, these neuropsychological signs cause greater learning problems in children than the actual language problems themselves. Childhood language disorders are associated with different neuropsychological problems. The most commonly associated neuropsychological deficits are problems involving memory, attention, executive functions, motor dysfunctions, temporal perception, tactile recognition, body scheme, spatial orientation and difficulties in visual discrimination. Mnemonic disorders (essentially in short-term and working auditory memory) are usually a common denominator in the different clinical pictures that make up language disorders. The mnemonic impairment associated with dyslalias deserves special attention, as this disorder is sometimes similar to that seen in language problems deriving from clinical pictures with important neurological alterations.

  18. Learning with a missing sense: what can we learn from the interaction of a deaf child with a turtle?

    PubMed

    Miller, Paul

    2009-01-01

    This case study reports on the progress of Navon, a 13-year-old boy with prelingual deafness, over a 3-month period following exposure to Logo, a computer programming language that visualizes specific programming commands by means of a virtual drawing tool called the Turtle. Despite an almost complete lack of skills in spoken and sign language, Navon made impressive progress in his programming skills, including acquisition of a notable active written vocabulary, which he learned to apply in a purposeful, rule-based manner. His achievements are discussed with reference to commonly held assumptions about the relationship between language and thought, in general, and the prerequisite of proper spoken language skills for the acquisition of reading and writing, in particular. Highlighted are the central principles responsible for Navon's unexpected cognitive and linguistic development, including the way it affected his social relations with peers and teachers.

  19. The bridge of iconicity: from a world of experience to the experience of language.

    PubMed

    Perniss, Pamela; Vigliocco, Gabriella

    2014-09-19

    Iconicity, a resemblance between properties of linguistic form (both in spoken and signed languages) and meaning, has traditionally been considered to be a marginal, irrelevant phenomenon for our understanding of language processing, development and evolution. Rather, the arbitrary and symbolic nature of language has long been taken as a design feature of the human linguistic system. In this paper, we propose an alternative framework in which iconicity in face-to-face communication (spoken and signed) is a powerful vehicle for bridging between language and human sensori-motor experience, and, as such, iconicity provides a key to understanding language evolution, development and processing. In language evolution, iconicity might have played a key role in establishing displacement (the ability of language to refer beyond what is immediately present), which is core to what language does; in ontogenesis, iconicity might play a critical role in supporting referentiality (learning to map linguistic labels to objects, events, etc., in the world), which is core to vocabulary development. Finally, in language processing, iconicity could provide a mechanism to account for how language comes to be embodied (grounded in our sensory and motor systems), which is core to meaningful communication.

  1. The Path to Symbolism. Practice Perspectives--Highlighting Information on Deaf-Blindness. Number 3

    ERIC Educational Resources Information Center

    Malloy, Peggy

    2008-01-01

    Language involves the use of symbols in the form of words or signs that allow people to communicate their thoughts, ideas, and needs. Even without formal language, many children who are deaf-blind learn to communicate with gestures and object or picture symbols. Symbolic expression makes it possible to express thoughts and feelings about the…

  2. An Evaluation of an Intervention Using Sign Language and Multi-Sensory Coding to Support Word Learning and Reading Comprehension of Deaf Signing Children

    ERIC Educational Resources Information Center

    van Staden, Annalene

    2013-01-01

    The reading skills of many deaf children lag several years behind those of hearing children, and there is a need for identifying reading difficulties and implementing effective reading support strategies in this population. This study embraces a balanced reading approach, and investigates the efficacy of applying multi-sensory coding strategies…

  3. Meaning matters: a clinician's/student's guide to general sign theory and its applicability in clinical settings.

    PubMed

    Oller, Stephen D

    2005-01-01

    The pragmatic mapping process and its variants have proven effective in second language learning and teaching. The goal of this paper is to show that the same process applies in teaching and intervention with disordered populations. A secondary goal, ultimately more important, is to give clinicians, teachers, and other educators a tool-kit, or a framework, from which they can evaluate and implement interventions. What is offered is an introduction to a general theory of signs and some examples of how it can be applied in treating communication disorders. (1) Readers will be able to relate the three theoretical consistency requirements to language teaching and intervention. (2) Readers will be introduced to a general theory of signs that provides a basis for evaluating and implementing interventions.

  4. Deaf children's non-verbal working memory is impacted by their language experience

    PubMed Central

    Marshall, Chloë; Jones, Anna; Denmark, Tanya; Mason, Kathryn; Atkinson, Joanna; Botting, Nicola; Morgan, Gary

    2015-01-01

    Several recent studies have suggested that deaf children perform more poorly on working memory tasks compared to hearing children, but these studies have not been able to determine whether this poorer performance arises directly from deafness itself or from deaf children's reduced language exposure. The issue remains unresolved because findings come mostly from (1) tasks that are verbal as opposed to non-verbal, and (2) involve deaf children who use spoken communication and therefore may have experienced impoverished input and delayed language acquisition. This is in contrast to deaf children who have been exposed to a sign language since birth from Deaf parents (and who therefore have native language-learning opportunities within a normal developmental timeframe for language acquisition). A more direct, and therefore stronger, test of the hypothesis that the type and quality of language exposure impact working memory is to use measures of non-verbal working memory (NVWM) and to compare hearing children with two groups of deaf signing children: those who have had native exposure to a sign language, and those who have experienced delayed acquisition and reduced quality of language input compared to their native-signing peers. In this study we investigated the relationship between NVWM and language in three groups aged 6–11 years: hearing children (n = 28), deaf children who were native users of British Sign Language (BSL; n = 8), and deaf children who used BSL but who were not native signers (n = 19). We administered a battery of non-verbal reasoning, NVWM, and language tasks. We examined whether the groups differed on NVWM scores, and whether scores on language tasks predicted scores on NVWM tasks. For the two executive-loaded NVWM tasks included in our battery, the non-native signers performed less accurately than the native signer and hearing groups (who did not differ from one another). Multiple regression analysis revealed that scores on the vocabulary measure predicted scores on those two executive-loaded NVWM tasks (with age and non-verbal reasoning partialled out). Our results suggest that whatever the language modality—spoken or signed—rich language experience from birth, and the good language skills that result from this early age of acquisition, play a critical role in the development of NVWM and in performance on NVWM tasks. PMID:25999875

  5. A reversed-typicality effect in pictures but not in written words in deaf and hard of hearing adolescents.

    PubMed

    Li, Degao; Gao, Kejuan; Wu, Xueyun; Xong, Ying; Chen, Xiaojun; He, Weiwei; Li, Ling; Huang, Jingjia

    2015-01-01

    Two experiments investigated Chinese deaf and hard of hearing (DHH) adolescents' recognition of category names in an innovative semantic categorization task. In each trial, the category-name target appeared briefly at the screen center, followed by two words or two pictures for two basic-level exemplars of high or middle typicality, which appeared briefly approximately where the target had appeared. Participants' reaction times when deciding whether the target referred to living or nonliving things consistently revealed a typicality effect for word-presented exemplars but a reversed-typicality effect for picture-presented exemplars. It was found that in automatically processing a category name, DHH adolescents with natural sign language as their first language evidently activate two sets of exemplar representations: those for middle-typicality exemplars, which they develop in interactions with the physical world and in sign language use, and those developed through written-language learning.

  6. Using hypnosis to help deaf children help themselves: report of two cases.

    PubMed

    Kohen, D P; Mann-Rinehart, P; Schmitz, D; Wills, L M

    1998-04-01

    This is a report of deaf children who demonstrated the ability to quickly learn hypnotic skills and apply them effectively to the management of their problems. The children were taught hypnosis through American Sign Language, their preferred mode of communication. As with hypnosis with hearing children, we focused upon induction with fantasy and imaginative involvement, creation in imagination of a metaphor for, or imagery of, the desired outcome, and associated sense of pride (ego-strengthening), positive expectation, and teaching self-hypnosis to emphasize the importance of repeated, daily practice. Case examples presented are an 11-year-old deaf girl who used hypnosis to eliminate multiple warts, and a 9-year-old deaf boy with mild developmental disability whose self-hypnosis skills were applied to the management of myoclonus. In the former, the clinician is also the sign language communicator and in the latter, a professional sign language interpreter and parent are both intimately involved in the communication and hypnosis process.

  7. The ALPHA Interactive Microcomputer System for Teaching Reading, Writing, and Communication Skills to Hearing-Impaired Children.

    ERIC Educational Resources Information Center

    Prinz, Philip M; And Others

    1985-01-01

    A total of 79 hearing-impaired children (3-14 years old) participated in the project involving microcomputer assisted instruction and exploratory learning. Ss learned to write text sentences that were highly accurate interpretations of either animated pictured action sequences or sign language sentence animations. Moreover, Ss made significant…

  8. Enabling Pedagogy and Andragogy for 21st-Century Sign Language Users and Learners

    ERIC Educational Resources Information Center

    Hermann-Shores, Patricia

    2017-01-01

    Enabling pedagogy and andragogy is discussed as a form of lifelong learning in which learners attain competences and skills as children (pedagogy) and as adults (andragogy) that enable them to engage in independent learning in the 21st century. Throughout the article the author avoids as much as possible the labels "deaf" and…

  9. Methodological Proposal for Elaboration of Learning Materials in Sign Language in University Teaching

    ERIC Educational Resources Information Center

    Viera-Santana, J. Guillermo; Rodríguez-Esparragón, Dionisio; Hernández-Haddad, Juan C.; Castillo-Ortiz, Jesús

    2015-01-01

    Hearing impairment may constitute a barrier to accessing information and communication in public places. Since oral communication forms the basis of the learning process, this problem becomes particularly relevant at schools and universities. To cope with this situation, it is not enough to provide a textual translation for people with…

  10. Deaf Children's Science Content Learning in Direct Instruction Versus Interpreted Instruction

    ERIC Educational Resources Information Center

    Kurz, Kim B.; Schick, Brenda; Hauser, Peter C.

    2015-01-01

    This research study compared learning of 6-9th grade deaf students under two modes of educational delivery--interpreted vs. direct instruction using science lessons. Nineteen deaf students participated in the study in which they were taught six science lessons in American Sign Language. In one condition, the lessons were taught by a hearing…

  11. Monitoring Different Phonological Parameters of Sign Language Engages the Same Cortical Language Network but Distinctive Perceptual Ones.

    PubMed

    Cardin, Velia; Orfanidou, Eleni; Kästner, Lena; Rönnberg, Jerker; Woll, Bencie; Capek, Cheryl M; Rudner, Mary

    2016-01-01

    The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

  12. Sign Language and the Learning of Swedish by Deaf Children (Project TSD).

    ERIC Educational Resources Information Center

    Jansson, Karin, Ed.

    1982-01-01

    A project in Sweden focuses on the early linguistic development of preschool deaf children in families where the parents are also deaf. The School for the Deaf in Sweden is involved with describing the Swedish language as it appears to a deaf learner, a description to be used as a basis for teacher training and inservice in the teaching of the…

  13. Writing Signed Languages: What For? What Form?

    PubMed

    Grushkin, Donald A

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed languages, and presents the potential advantages of writing signed languages. Following a history of the development of writing in spoken as well as signed language populations, the effects of orthographic types upon literacy and biliteracy are explored. Attempts at writing signed languages have followed two primary paths: "alphabetic" and "iconographic." It is argued that for greatest congruency and ease in developing biliteracy strategies in societies where an alphabetic script is used for the spoken language, signed language communities within these societies are best served by adoption of an alphabetic script for writing their signed language.

  14. Towards a Sign Language Synthesizer: a Bridge to Communication Gap of the Hearing/Speech Impaired Community

    NASA Astrophysics Data System (ADS)

    Maarif, H. A.; Akmeliawati, R.; Gunawan, T. S.; Shafie, A. A.

    2013-12-01

    A sign language synthesizer is a method of visualizing sign language movement from spoken language. Sign language (SL) is one of the means used by the hearing/speech impaired (HSI) to communicate with hearing people. Unfortunately, the number of people, including the HSI themselves, who are familiar with sign language is very limited, which causes difficulties in communication between hearing and HSI people. Sign language involves not only hand movement but also facial expression, and the two elements complement each other: the hand movement conveys the meaning of each sign, while the facial expression conveys the signer's emotion. Generally, a sign language synthesizer recognizes the spoken language using speech recognition, performs grammatical processing using a context-free grammar, and renders the output with a 3D synthesizer driven by a recorded avatar. This paper analyzes and compares existing techniques for developing a sign language synthesizer, leading to the IIUM Sign Language Synthesizer.

  15. Exploring the Ancestral Roots of American Sign Language: Lexical Borrowing from Cistercian Sign Language and French Sign Language

    ERIC Educational Resources Information Center

    Cagle, Keith Martin

    2010-01-01

    American Sign Language (ASL) is the natural and preferred language of the Deaf community in both the United States and Canada. Woodward (1978) estimated that approximately 60% of the ASL lexicon is derived from early 19th century French Sign Language, which is known as "langue des signes francaise" (LSF). The lexicon of LSF and ASL may…

  16. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    PubMed

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  17. A New Kind of Heterogeneity: What We Can Learn from D/Deaf and Hard of Hearing Multilingual Learners

    ERIC Educational Resources Information Center

    Cannon, Joanna E.; Guardino, Caroline; Gallimore, Erin

    2016-01-01

    The present article introduces a special issue of the "American Annals of the Deaf." Students who are d/Deaf or hard of hearing and come from homes where a language other than English or American Sign Language is used constitute 19.4%-35.0% of the U.S. d/Dhh population (Gallaudet Research Institute, 2013). The authors propose moving…

  18. Signed Language Working Memory Capacity of Signed Language Interpreters and Deaf Signers

    ERIC Educational Resources Information Center

    Wang, Jihong; Napier, Jemina

    2013-01-01

    This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian sign language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an…

  19. Theory-of-mind development in oral deaf children with cochlear implants or conventional hearing aids.

    PubMed

    Peterson, Candida C

    2004-09-01

    In the context of the established finding that theory-of-mind (ToM) growth is seriously delayed in late-signing deaf children, and some evidence of equivalent delays in those learning speech with conventional hearing aids, this study's novel contribution was to explore ToM development in deaf children with cochlear implants. Implants can substantially boost auditory acuity and rates of language growth. Despite the implant, there are often problems socialising with hearing peers and some language difficulties, lending special theoretical interest to the present comparative design. A total of 52 children aged 4 to 12 years took a battery of false belief tests of ToM. There were 26 oral deaf children, half with implants and half with hearing aids, evenly divided between oral-only versus sign-plus-oral schools. Comparison groups of age-matched high-functioning children with autism and younger hearing children were also included. No significant ToM differences emerged between deaf children with implants and those with hearing aids, nor between those in oral-only versus sign-plus-oral schools. Nor did the deaf children perform any better on the ToM tasks than their age peers with autism. Hearing preschoolers scored significantly higher than all other groups. For the deaf and the autistic children, as well as the preschoolers, rate of language development and verbal maturity significantly predicted variability in ToM, over and above chronological age. The finding that deaf children with cochlear implants are as delayed in ToM development as children with autism and their deaf peers with hearing aids or late sign language highlights the likely significance of peer interaction and early fluent communication with peers and family, whether in sign or in speech, in order to optimally facilitate the growth of social cognition and language.

  20. GESTURE'S ROLE IN CREATING AND LEARNING LANGUAGE.

    PubMed

    Goldin-Meadow, Susan

    2010-09-22

    Imagine a child who has never seen or heard language. Would such a child be able to invent a language? Despite what one might guess, the answer is "yes". This chapter describes children who are congenitally deaf and cannot learn the spoken language that surrounds them. In addition, the children have not been exposed to sign language, either by their hearing parents or their oral schools. Nevertheless, the children use their hands to communicate--they gesture--and those gestures take on many of the forms and functions of language (Goldin-Meadow 2003a). The properties of language that we find in these gestures are just those properties that do not need to be handed down from generation to generation, but can be reinvented by a child de novo. They are the resilient properties of language, properties that all children, deaf or hearing, come to language-learning ready to develop. In contrast to these deaf children who are inventing language with their hands, hearing children are learning language from a linguistic model. But they too produce gestures, as do all hearing speakers (Feyereisen and de Lannoy 1991; Goldin-Meadow 2003b; Kendon 1980; McNeill 1992). Indeed, young hearing children often use gesture to communicate before they use words. Interestingly, changes in a child's gestures not only predate but also predict changes in the child's early language, suggesting that gesture may be playing a role in the language-learning process. This chapter begins with a description of the gestures the deaf child produces without speech. These gestures assume the full burden of communication and take on a language-like form--they are language. This phenomenon stands in contrast to the gestures hearing speakers produce with speech. These gestures share the burden of communication with speech and do not take on a language-like form--they are part of language.

  1. What You Don't Know Can Hurt You: The Risk of Language Deprivation by Impairing Sign Language Development in Deaf Children.

    PubMed

    Hall, Wyatte C

    2017-05-01

    A long-standing belief is that sign language interferes with spoken language development in deaf children, despite a chronic lack of evidence supporting this belief. This deserves discussion as poor life outcomes continue to be seen in the deaf population. This commentary synthesizes research outcomes with signing and non-signing children and highlights fully accessible language as a protective factor for healthy development. Brain changes associated with language deprivation may be misrepresented as sign language interfering with spoken language outcomes of cochlear implants. This may lead to professionals and organizations advocating for preventing sign language exposure before implantation and spreading misinformation. The existence of a time-sensitive language acquisition window means a strong possibility of permanent brain changes when spoken language is not fully accessible to the deaf child and sign language exposure is delayed, as is often standard practice. There is no empirical evidence for the harm of sign language exposure, but there is some evidence for its benefits, and there is growing evidence that lack of language access has negative implications. These include cognitive delays, mental health difficulties, lower quality of life, higher trauma, and limited health literacy. Claims of cochlear implant- and spoken language-only approaches being more effective than sign language-inclusive approaches are not empirically supported. Cochlear implants are an unreliable standalone first-language intervention for deaf children. Priorities of deaf child development should focus on healthy growth of all developmental domains through a fully accessible first-language foundation such as sign language, rather than on auditory deprivation and speech skills.

  2. Usability of American Sign Language Videos for Presenting Mathematics Assessment Content.

    PubMed

    Hansen, Eric G; Loew, Ruth C; Laitusis, Cara C; Kushalnagar, Poorna; Pagliaro, Claudia M; Kurz, Christopher

    2018-04-12

    There is considerable interest in determining whether high-quality American Sign Language videos can be used as an accommodation in tests of mathematics at both K-12 and postsecondary levels; and in learning more about the usability (e.g., comprehensibility) of ASL videos with two different types of signers - avatar (animated figure) and human. The researchers describe the results of administering each of nine pre-college mathematics items in both avatar and human versions to each of 31 Deaf participants with high school and post-high school backgrounds. This study differed from earlier studies by obliging the participants to rely on the ASL videos to answer the items. While participants preferred the human version over the avatar version (apparently due largely to the better expressiveness and fluency of the human), there was no discernible relationship between mathematics performance and signed version.

  3. How to Create Shared Symbols.

    PubMed

    Fay, Nicolas; Walker, Bradley; Swoboda, Nik; Garrod, Simon

    2018-05-01

    Human cognition and behavior are dominated by symbol use. This paper examines the social learning strategies that give rise to symbolic communication. Experiment 1 contrasts an individual-level account, based on observational learning and cognitive bias, with an inter-individual account, based on social coordinative learning. Participants played a referential communication game in which they tried to communicate a range of recurring meanings to a partner by drawing, but without using their conventional language. Individual-level learning, via observation and cognitive bias, was sufficient to produce signs that became increasingly effective, efficient, and shared over games. However, breaking a referential precedent eliminated these benefits. The most effective, most efficient, and most shared signs arose when participants could directly interact with their partner, indicating that social coordinative learning is important to the creation of shared symbols. Experiment 2 investigated the contribution of two distinct aspects of social interaction: behavior alignment and concurrent partner feedback. Each played a complementary role in the creation of shared symbols: Behavior alignment primarily drove communication effectiveness, and partner feedback primarily drove the efficiency of the evolved signs. In conclusion, inter-individual social coordinative learning is important to the evolution of effective, efficient, and shared symbols. Copyright © 2018 Cognitive Science Society, Inc.

  4. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language.

    PubMed

    Williams, Joshua T; Newman, Sharlene D

    2017-02-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however, relatively few studies have generalized these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel language activation in M2L2 learners of sign language and to characterize the influence of spoken language and sign language neighborhood density on the activation of ASL signs. A priming paradigm was used in which the neighbors of the sign target were activated with a spoken English word, and activation of targets in sparse and dense neighborhoods was compared. Neighborhood density effects in the auditory-primed lexical decision task were then compared to previous reports of deaf native signers who were processing sign language only. Results indicated reversed neighborhood density effects in M2L2 learners relative to those in deaf signers, such that there were inhibitory effects of handshape density and facilitatory effects of location density. Additionally, increased inhibition for signs in dense handshape neighborhoods was greater for high-proficiency L2 learners. These findings support recent models of the hearing bimodal bilingual lexicon, which posit lateral links between spoken language and sign language lexical representations.

  5. Language Policy and Planning: The Case of Italian Sign Language

    ERIC Educational Resources Information Center

    Geraci, Carlo

    2012-01-01

    Italian Sign Language (LIS) is the name of the language used by the Italian Deaf community. The acronym LIS derives from Lingua italiana dei segni ("Italian language of signs"), although nowadays Italians refer to LIS as Lingua dei segni italiana, reflecting the more appropriate phrasing "Italian sign language." Historically,…

  6. Signs of Change: Contemporary Attitudes to Australian Sign Language

    ERIC Educational Resources Information Center

    Slegers, Claudia

    2010-01-01

    This study explores contemporary attitudes to Australian Sign Language (Auslan). Since at least the 1960s, sign languages have been accepted by linguists as natural languages with all of the key ingredients common to spoken languages. However, these visual-spatial languages have historically been subject to ignorance and myth in Australia and…

  7. Sign language ability in young deaf signers predicts comprehension of written sentences in English.

    PubMed

    Andrew, Kathy N; Hoshooley, Jennifer; Joanisse, Marc F

    2014-01-01

    We investigated the robust correlation between American Sign Language (ASL) and English reading ability in 51 young deaf signers ages 7;3 to 19;0. Signers were divided into 'skilled' and 'less-skilled' signer groups based on their performance on three measures of ASL. We next assessed reading comprehension of four English sentence structures (actives, passives, pronouns, reflexive pronouns) using a sentence-to-picture-matching task. Of interest was the extent to which ASL proficiency provided a foundation for lexical and syntactic processes of English. Skilled signers outperformed less-skilled signers overall. Error analyses further indicated greater single-word recognition difficulties in less-skilled signers marked by a higher rate of errors reflecting an inability to identify the actors and actions described in the sentence. Our findings provide evidence that increased ASL ability supports English sentence comprehension both at the levels of individual words and syntax. This is consistent with the theory that first language learning promotes second language through transference of linguistic elements irrespective of the transparency of mapping of grammatical structures between the two languages.

  9. Representational momentum for the human body: awkwardness matters, experience does not.

    PubMed

    Wilson, Margaret; Lancaster, Jessy; Emmorey, Karen

    2010-08-01

    Perception of the human body appears to involve predictive simulations that project forward to track unfolding body-motion events. Here we use representational momentum (RM) to investigate whether implicit knowledge of a learned arbitrary system of body movement such as sign language influences this prediction process, and how this compares to implicit knowledge of biomechanics. Experiment 1 showed greater RM for sign language stimuli in the correct direction of the sign than in the reverse direction, but unexpectedly this held true for non-signers as well as signers. Experiment 2 supported two biomechanical explanations for this result (an effect of downward movement, and an effect of the direction that the movement had actually been performed by the model), and Experiments 3 and 4 found no residual enhancement of RM in signers when these factors were controlled. In fact, surprisingly, the opposite was found: signers showed reduced RM for signs. Experiment 5 verified the effect of biomechanical knowledge by testing arm movements that are easy to perform in one direction but awkward in the reverse direction, and found greater RM for the easy direction. We conclude that while perceptual prediction is shaped by implicit knowledge of biomechanics (the awkwardness effect), it is surprisingly insensitive to expectations derived from learned movement patterns. Results are discussed in terms of recent findings on the mirror system.

  10. Adapting Tests of Sign Language Assessment for Other Sign Languages--A Review of Linguistic, Cultural, and Psychometric Problems

    ERIC Educational Resources Information Center

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from…

  11. Legal and Ethical Imperatives for Using Certified Sign Language Interpreters in Health Care Settings: How to "Do No Harm" When "It's (All) Greek" (Sign Language) to You.

    PubMed

    Nonaka, Angela M

    2016-09-01

    Communication obstacles in health care settings adversely impact patient-practitioner interactions by impeding service efficiency, reducing mutual trust and satisfaction, or even endangering health outcomes. When interlocutors are separated by language, interpreters are required. The efficacy of interpreting, however, is constrained not just by interpreters' competence but also by health care providers' facility working with interpreters. Deaf individuals whose preferred form of communication is a signed language often encounter communicative barriers in health care settings. In those environments, signing Deaf people are entitled to equal communicative access via sign language interpreting services according to the Americans with Disabilities Act and Executive Order 13166, the Limited English Proficiency Initiative. Yet, litigation in states across the United States suggests that individual and institutional providers remain uncertain about their legal obligations to provide equal communicative access. This article discusses the legal and ethical imperatives for using professionally certified (vs. ad hoc) sign language interpreters in health care settings. First outlining the legal terrain governing provision of sign language interpreting services, the article then describes different types of "sign language" (e.g., American Sign Language vs. manually coded English) and different forms of "sign language interpreting" (e.g., interpretation vs. transliteration vs. translation; simultaneous vs. consecutive interpreting; individual vs. team interpreting). This is followed by reviews of the formal credentialing process and of specialized forms of sign language interpreting-that is, certified deaf interpreting, trilingual interpreting, and court interpreting. 
    After discussing practical steps for contracting professional sign language interpreters and addressing ethical issues of confidentiality, this article concludes by offering suggestions for working more effectively with Deaf clients via professional sign language interpreters.

  12. Move...to Music Learning

    ERIC Educational Resources Information Center

    Heimann, Hope

    1977-01-01

    Through simple and pleasurable movement experiences a very young child can be led to an understanding of the intricate craft of music and the signs and symbols of the music "language", while developing at the same time an increasing appreciation and perception of music as an artistic expression. (Author)

  13. Sentence Repetition in Deaf Children with Specific Language Impairment in British Sign Language

    ERIC Educational Resources Information Center

    Marshall, Chloë; Mason, Kathryn; Rowley, Katherine; Herman, Rosalind; Atkinson, Joanna; Woll, Bencie; Morgan, Gary

    2015-01-01

    Children with specific language impairment (SLI) perform poorly on sentence repetition tasks across different spoken languages, but until now, this methodology has not been investigated in children who have SLI in a signed language. Users of a natural sign language encode different sentence meanings through their choice of signs and by altering…

  14. Tactile Signing with One-Handed Perception

    ERIC Educational Resources Information Center

    Mesch, Johanna

    2013-01-01

    Tactile signing among persons with deaf-blindness is not homogenous; rather, like other forms of language, it exhibits variation, especially in turn taking. Early analyses of tactile Swedish Sign Language, tactile Norwegian Sign Language, and tactile French Sign Language focused on tactile communication with four hands, in which partially blind or…

  15. Numeral Incorporation in Japanese Sign Language

    ERIC Educational Resources Information Center

    Ktejik, Mish

    2013-01-01

    This article explores the morphological process of numeral incorporation in Japanese Sign Language. Numeral incorporation is defined and the available research on numeral incorporation in signed language is discussed. The numeral signs in Japanese Sign Language are then introduced and followed by an explanation of the numeral morphemes which are…

  16. The Legal Recognition of Sign Languages

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2015-01-01

    This article provides an analytical overview of the different types of explicit legal recognition of sign languages. Five categories are distinguished: constitutional recognition, recognition by means of general language legislation, recognition by means of a sign language law or act, recognition by means of a sign language law or act including…

  17. Differences in praxis performance and receptive language during fingerspelling between deaf children with and without autism spectrum disorder.

    PubMed

    Bhat, Anjana N; Srinivasan, Sudha M; Woxholdt, Colleen; Shield, Aaron

    2018-04-01

    Children with autism spectrum disorder present with a variety of social communication deficits such as atypicalities in social gaze and verbal and non-verbal communication delays as well as perceptuo-motor deficits like motor incoordination and dyspraxia. In this study, we had the unique opportunity to study praxis performance in deaf children with and without autism spectrum disorder in a fingerspelling context using American Sign Language. A total of 11 deaf children with autism spectrum disorder and 11 typically developing deaf children aged between 5 and 14 years completed a fingerspelling task. Children were asked to fingerspell 15 different words shown on an iPad. We coded various praxis errors and fingerspelling time. The deaf children with autism spectrum disorder had greater errors in pace, sequence precision, accuracy, and body part use and also took longer to fingerspell each word. Additionally, the deaf children with autism spectrum disorder had poor receptive language skills and this strongly correlated with their praxis performance and autism severity. These findings extend the evidence for dyspraxia in hearing children with autism spectrum disorder to deaf children with autism spectrum disorder. Poor sign language production in children with autism spectrum disorder may contribute to their poor gestural learning/comprehension and vice versa. Our findings have therapeutic implications for children with autism spectrum disorder when teaching sign language.

  18. Development of Web-based Distributed Cooperative Development Environment of Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

    A web-based distributed cooperative development environment for a sign-language animation system has been developed. The system extends a previous animation system that was constructed as a three-tiered system consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, have been added to the previous system. The system supports a humanoid-model avatar for interoperability and can use stored sign-language animation data shared through the database. Evaluation of the system showed that the web client's inverse kinematics function improves sign-language animation making.

  19. On the Conventionalization of Mouth Actions in Australian Sign Language.

    PubMed

    Johnston, Trevor; van Roekel, Jane; Schembri, Adam

    2016-03-01

    This study investigates the conventionalization of mouth actions in Australian Sign Language. Signed languages were once thought of as simply manual languages because the hands produce the signs which individually and in groups are the symbolic units most easily equated with the words, phrases and clauses of spoken languages. However, it has long been acknowledged that non-manual activity, such as movements of the body, head and the face play a very important role. In this context, mouth actions that occur while communicating in signed languages have posed a number of questions for linguists: are the silent mouthings of spoken language words simply borrowings from the respective majority community spoken language(s)? Are those mouth actions that are not silent mouthings of spoken words conventionalized linguistic units proper to each signed language, culturally linked semi-conventional gestural units shared by signers with members of the majority speaking community, or even gestures and expressions common to all humans? We use a corpus-based approach to gather evidence of the extent of the use of mouth actions in naturalistic Australian Sign Language-making comparisons with other signed languages where data is available--and the form/meaning pairings that these mouth actions instantiate.

  20. How Do Young Deaf Children Learn to Read? A Proposed Model of Deaf Children's Emergent Reading Behaviors. Technical Report No. 329.

    ERIC Educational Resources Information Center

    Andrews, Jean F.; Mason, Jana M.

    Evidence from a nine-month longitudinal study of deaf children's early attempts at learning to read provides the construct for an instructional model that stresses that even though the children may have, at the least, a meager expressive sign language vocabulary, they can be led successfully through the holophrastic or one-word stage of reading…

  1. Sharing Shakespeare: Integrating Literature, Technology, and American Sign Language.

    ERIC Educational Resources Information Center

    Cambridge, Theresa; Abdulezer, Susan

    1998-01-01

    The Sharing Shakespeare project at New York City's Public School for the Deaf developed a unique, elective, process-oriented literature class that combined teenage profoundly deaf students and culturally diverse, limited-English-proficient hearing students. Aided by multimedia technologies, these students collaborated in learning, reading, and…

  2. Comparing Action Gestures and Classifier Verbs of Motion: Evidence from Australian Sign Language, Taiwan Sign Language, and Nonsigners' Gestures without Speech

    ERIC Educational Resources Information Center

    Schembri, Adam; Jones, Caroline; Burnham, Denis

    2005-01-01

    Recent research into signed languages indicates that signs may share some properties with gesture, especially in the use of space in classifier constructions. A prediction of this proposal is that there will be similarities in the representation of motion events by sign-naive gesturers and by native signers of unrelated signed languages. This…

  3. Planning Sign Languages: Promoting Hearing Hegemony? Conceptualizing Sign Language Standardization

    ERIC Educational Resources Information Center

    Eichmann, Hanna

    2009-01-01

    In light of the absence of a codified standard variety in British Sign Language and German Sign Language ("Deutsche Gebärdensprache") there have been repeated calls for the standardization of both languages primarily from outside the Deaf community. The paper is based on a recent grounded theory study which explored perspectives on sign…

  4. THE PARADOX OF SIGN LANGUAGE MORPHOLOGY

    PubMed Central

    Aronoff, Mark; Meir, Irit; Sandler, Wendy

    2011-01-01

    Sign languages have two strikingly different kinds of morphological structure: sequential and simultaneous. The simultaneous morphology of two unrelated sign languages, American and Israeli Sign Language, is very similar and is largely inflectional, while what little sequential morphology we have found differs significantly and is derivational. We show that at least two pervasive types of inflectional morphology, verb agreement and classifier constructions, are iconically grounded in spatiotemporal cognition, while the sequential patterns can be traced to normal historical development. We attribute the paucity of sequential morphology in sign languages to their youth. This research both brings sign languages much closer to spoken languages in their morphological structure and shows how the medium of communication contributes to the structure of languages. PMID:22223926

  5. Restoring the voids of voices by signs and gestures, in dentistry: A cross-sectional study.

    PubMed

    Jain, Suyog; Duggi, Vijay; Avinash, Alok; Dubey, Alok; Fouzdar, Sambodhi; Sagar, Mylavarapu Krishna

    2017-01-01

    The aim was to help dentists communicate with hearing impaired patients, reach an accurate diagnosis, and explain the treatment plan by learning some signs and gestures used in nonverbal communication (NVC) and by devising some new signs and gestures related to dentistry that are easy to learn and understand by both the hearing impaired patients and the dentists. The study was carried out on 100 hearing impaired students in the age group of 10-14 years in two special schools for hearing impaired children located in two different states of India, where different spoken languages and different sign languages are used. One dentist (expert dentist) was trained in NVC while the other dentist (nonexpert dentist) had no knowledge of this type of communication; both communicated the same sets of statements related to dentistry to the hearing impaired children. One translator was assigned to judge their interactions. Students were asked to tell the interpreter at the end of each signed interaction what they understood from the statement conveyed to them by both the dentists. All data collected were subjected to statistical analysis using the Chi-square test and odds ratio test. In the special school of the first state, the nonexpert dentist conveyed only 36.3% of the information correctly to the students, whereas the expert dentist conveyed 83% of the information correctly. In the special school of the second state, the nonexpert dentist conveyed only 37.5% of the information correctly to the students, whereas the expert dentist conveyed 80.3% of the information correctly. Dentists should be made aware of NVC, and signs and gestures related to dentistry should be taught to the hearing impaired students as well as the dental students.
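    The abstract names a Chi-square test and odds ratio but reports only percentages. A minimal sketch of how such a 2x2 comparison works, assuming hypothetical counts of 100 statements per dentist reconstructed from the first school's reported percentages (83% vs. roughly 36%):

    ```python
    def chi_square_2x2(a, b, c, d):
        """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
        n = a + b + c + d
        numerator = n * (a * d - b * c) ** 2
        denominator = (a + b) * (c + d) * (a + c) * (b + d)
        return numerator / denominator

    def odds_ratio(a, b, c, d):
        """Odds ratio for the same 2x2 table."""
        return (a * d) / (b * c)

    # Hypothetical counts (assumed, not from the paper):
    # expert dentist: 83 of 100 statements conveyed correctly;
    # nonexpert dentist: 36 of 100.
    expert_correct, expert_wrong = 83, 17
    nonexp_correct, nonexp_wrong = 36, 64

    chi2 = chi_square_2x2(expert_correct, expert_wrong,
                          nonexp_correct, nonexp_wrong)  # about 45.8
    oratio = odds_ratio(expert_correct, expert_wrong,
                        nonexp_correct, nonexp_wrong)    # about 8.7
    ```

    With 1 degree of freedom, a statistic above the 3.84 critical value corresponds to p < 0.05, consistent with the large expert/nonexpert gap the study reports.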

  6. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013) pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. 
    For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account. These results demonstrate that experience of sign language enhances the ability to imitate manual gestures once representations have been established, and suggest that the inherent motor patterns of lexical manual gestures are better suited for representation than those of non-signs. This set of findings prompts a developmental version of the ELU model, D-ELU.

  7. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013) pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. 
    For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account. These results demonstrate that experience of sign language enhances the ability to imitate manual gestures once representations have been established, and suggest that the inherent motor patterns of lexical manual gestures are better suited for representation than those of non-signs. This set of findings prompts a developmental version of the ELU model, D-ELU. PMID:26909050

  8. Direction Asymmetries in Spoken and Signed Language Interpreting

    ERIC Educational Resources Information Center

    Nicodemus, Brenda; Emmorey, Karen

    2013-01-01

    Spoken language (unimodal) interpreters often prefer to interpret from their non-dominant language (L2) into their native language (L1). Anecdotally, signed language (bimodal) interpreters express the opposite bias, preferring to interpret from L1 (spoken language) into L2 (signed language). We conducted a large survey study ("N" =…

  9. Grammar, Gesture, and Meaning in American Sign Language.

    ERIC Educational Resources Information Center

    Liddell, Scott K.

    In sign languages of the Deaf, now recognized as fully legitimate human languages, some signs can meaningfully point toward things or can be meaningfully placed in the space ahead of the signer. Such spatial uses of sign are an obligatory part of fluent grammatical signing. There is no parallel for this in vocally produced languages. This book…

  10. A Stronger Reason for the Right to Sign Languages

    ERIC Educational Resources Information Center

    Trovato, Sara

    2013-01-01

    Is the right to sign language only the right to a minority language? Holding a capability (not a disability) approach, and building on the psycholinguistic literature on sign language acquisition, I make the point that this right is of a stronger nature, since only sign languages can guarantee that each deaf child will properly develop the…

  11. How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language

    PubMed Central

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Grabowski, Thomas J.

    2014-01-01

    To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O-PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Directly contrasting speech and sign production revealed greater activation in bilateral parietal cortex for signing, while speaking resulted in greater activation in bilateral superior temporal cortex (STC) and right frontal cortex, likely reflecting auditory feedback control. Surprisingly, the language production contrast revealed a relative increase in activation in bilateral occipital cortex for speaking. We speculate that greater activation in visual cortex for speaking may actually reflect cortical attenuation when signing, which functions to distinguish self-produced from externally generated visual input. Directly contrasting speech and sign comprehension revealed greater activation in bilateral STC for speech and greater activation in bilateral occipital-temporal cortex for sign. Sign comprehension, like sign production, engaged bilateral parietal cortex to a greater extent than spoken language. We hypothesize that posterior parietal activation in part reflects processing related to spatial classifier constructions in ASL and that anterior parietal activation may reflect covert imitation that functions as a predictive model during sign comprehension. The conjunction analysis for comprehension revealed that both speech and sign bilaterally engaged the inferior frontal gyrus (with more extensive activation on the left) and the superior temporal sulcus, suggesting an invariant bilateral perisylvian language system.
    We conclude that surface level differences between sign and spoken languages should not be dismissed and are critical for understanding the neurobiology of language. PMID:24904497

  12. Anatomical Substrates of Visual and Auditory Miniature Second-language Learning

    PubMed Central

    Newman-Norlund, Roger D.; Frey, Scott H.; Petitto, Laura-Ann; Grafton, Scott T.

    2007-01-01

    Longitudinal changes in brain activity during second language (L2) acquisition of a miniature finite-state grammar, named Wernickese, were identified with functional magnetic resonance imaging (fMRI). Participants learned either a visual sign language form or an auditory-verbal form to equivalent proficiency levels. Brain activity during sentence comprehension while hearing/viewing stimuli was assessed at low, medium, and high levels of proficiency in three separate fMRI sessions. Activation in the left inferior frontal gyrus (Broca’s area) correlated positively with improving L2 proficiency, whereas activity in the right-hemisphere (RH) homologue was negatively correlated for both auditory and visual forms of the language. Activity in sequence learning areas including the premotor cortex and putamen also correlated with L2 proficiency. Modality-specific differences in the blood oxygenation level-dependent signal accompanying L2 acquisition were localized to the planum temporale (PT). Participants learning the auditory form exhibited decreasing reliance on bilateral PT sites across sessions. In the visual form, bilateral PT sites increased in activity between Session 1 and Session 2, then decreased in left PT activity from Session 2 to Session 3. Comparison of L2 laterality (as compared to L1 laterality) in auditory and visual groups failed to demonstrate greater RH lateralization for the visual versus auditory L2. These data establish a common role for Broca’s area in language acquisition irrespective of the perceptual form of the language and suggest that L2s are processed similar to first languages even when learned after the "critical period." The right frontal cortex was not preferentially recruited by visual language after accounting for phonetic/structural complexity and performance. PMID:17129186

  13. Deaf Adolescents' Learning of Cardiovascular Health Information: Sources and Access Challenges

    ERIC Educational Resources Information Center

    Smith, Scott R.; Kushalnagar, Poorna; Hauser, Peter C.

    2015-01-01

    Deaf individuals have more cardiovascular risks than the general population that are believed to be related to their cardiovascular health knowledge disparities. This phenomenological study describes where 20 deaf sign language-using adolescents from Rochester, New York, many of whom possess positive characteristics to support their health…

  14. Spatial and Facial Processing in the Signed Discourse of Two Groups of Deaf Signers with Clinical Language Impairment

    ERIC Educational Resources Information Center

    Penn, Claire; Commerford, Ann; Ogilvy, Dale

    2007-01-01

    The linguistic and cognitive profiles of five deaf adults with a sign language disorder were compared with those of matched deaf controls. The test involved a battery of sign language tests, a signed narrative discourse task and a neuropsychological test protocol administered in sign language. Spatial syntax and facial processing were examined in…

  15. A biometric authentication model using hand gesture images.

    PubMed

    Fong, Simon; Zhuang, Yan; Fister, Iztok; Fister, Iztok

    2013-10-30

    A novel hand biometric authentication method based on measurements of the user's stationary hand gestures in hand sign language is proposed. The hand gestures can be acquired sequentially by a low-cost video camera, and an additional level of contextual information associated with these hand signs can be used in biometric authentication. As an analogue, instead of typing a password 'iloveu' as text, which is relatively vulnerable over a communication network, a signer can encode a biometric password using a sequence of hand signs: 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and recognized by a classification model that verifies whether the signer is who he claims to be by examining his hand shape and the postures used in making those signs. It is believed that everybody has slight but unique behavioral characteristics in sign language, as well as distinct hand shape compositions. Simple and efficient image processing algorithms are used in hand sign recognition, including intensity profiling, color histograms and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulation investigating the efficacy of this novel biometric authentication model shows up to 93.75% recognition accuracy.
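    The abstract mentions color histograms and machine learning classifiers without further detail. A minimal illustrative sketch of that general pipeline, assuming a quantized RGB color histogram as the feature and simple nearest-neighbor matching (the paper's actual features, classifiers, and thresholds are not specified here):

    ```python
    from collections import Counter
    import math

    def color_histogram(image, bins=4):
        """Normalized joint RGB histogram: each channel quantized into `bins`
        levels. `image` is a list of (r, g, b) pixels with values in 0..255."""
        step = 256 // bins
        counts = Counter((r // step, g // step, b // step) for r, g, b in image)
        total = len(image)
        return [counts[(i, j, k)] / total
                for i in range(bins) for j in range(bins) for k in range(bins)]

    def euclidean(h1, h2):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

    def authenticate(probe, enrolled, threshold=0.5):
        """1-NN match of a probe histogram against enrolled (user, histogram)
        pairs; reject the claim when the nearest match is too distant."""
        user, dist = min(((u, euclidean(probe, h)) for u, h in enrolled),
                         key=lambda t: t[1])
        return user if dist <= threshold else None

    # Tiny synthetic example: enroll two users from uniform "hand images".
    img_a = [(200, 150, 120)] * 100   # lighter skin-tone pixels
    img_b = [(20, 30, 40)] * 100      # darker pixels
    enrolled = [("alice", color_histogram(img_a)),
                ("bob", color_histogram(img_b))]
    probe = color_histogram([(205, 145, 125)] * 100)
    ```

    A real system would compute such features per frame of the sign sequence and feed them to a trained classifier; the histogram-plus-threshold matcher above only illustrates the shape of the approach.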

  16. One grammar or two? Sign Languages and the Nature of Human Language

    PubMed Central

    Lillo-Martin, Diane C; Gajewski, Jon

    2014-01-01

    Linguistic research has identified abstract properties that seem to be shared by all languages—such properties may be considered defining characteristics. In recent decades, the recognition that human language is found not only in the spoken modality but also in the form of sign languages has led to a reconsideration of some of these potential linguistic universals. In large part, the linguistic analysis of sign languages has led to the conclusion that universal characteristics of language can be stated at an abstract enough level to include languages in both spoken and signed modalities. For example, languages in both modalities display hierarchical structure at sub-lexical and phrasal level, and recursive rule application. However, this does not mean that modality-based differences between signed and spoken languages are trivial. In this article, we consider several candidate domains for modality effects, in light of the overarching question: are signed and spoken languages subject to the same abstract grammatical constraints, or is a substantially different conception of grammar needed for the sign language case? We look at differences between language types based on the use of space, iconicity, and the possibility for simultaneity in linguistic expression. The inclusion of sign languages does support some broadening of the conception of human language—in ways that are applicable for spoken languages as well. Still, the overall conclusion is that one grammar applies for human language, no matter the modality of expression. PMID:25013534

  17. Sign Language Planning: Pragmatism, Pessimism and Principles

    ERIC Educational Resources Information Center

    Turner, Graham H.

    2009-01-01

    This article introduces the present collection of sign language planning studies. Contextualising the analyses against the backdrop of core issues in the theory of language planning and the evolution of applied sign linguistics, it is argued that--while the sociolinguistic circumstances of signed languages worldwide can, in many respects, be…

  18. Problems for a Sign Language Planning Agency

    ERIC Educational Resources Information Center

    Covington, Virginia

    1977-01-01

    American Sign Language is chiefly untaught and nonstandardized. The Communicative Skills Program of the National Association of the Deaf aims to provide sign language classes for hearing personnel and to increase interpreting services. Programs, funding and aims of the Program are outlined. A government sign language planning agency is proposed.…

  19. Deaf, Hard-of-Hearing, and Hearing Signing Undergraduates’ Attitudes toward Science in Inquiry-Based Biology Laboratory Classes

    PubMed Central

    Gormally, Cara

    2017-01-01

    For science learning to be successful, students must develop attitudes that support future engagement with challenging social issues related to science. This is especially important for increasing participation of students from underrepresented populations. This study investigated how participation in inquiry-based biology laboratory classes affected students’ attitudes toward science, focusing on deaf, hard-of-hearing, and hearing signing students in bilingual learning environments (i.e., taught in American Sign Language and English). Analysis of reflection assignments and interviews revealed that the majority of students developed positive attitudes toward science and scientific attitudes after participating in inquiry-based biology laboratory classes. Attitudinal growth appears to be driven by student value of laboratory activities, repeated direct engagement with scientific inquiry, and peer collaboration. Students perceived that hands-on experimentation involving peer collaboration and a positive, welcoming learning environment were key features of inquiry-based laboratories, affording attitudinal growth. Students who did not perceive biology as useful for their majors, careers, or lives did not develop positive attitudes. Students highlighted the importance of the climate of the learning environment for encouraging student contribution and noted both the benefits and pitfalls of teamwork. Informed by students’ characterizations of their learning experiences, recommendations are made for inquiry-based learning in college biology. PMID:28188279

  20. John Tracy Clinic: Programa de Ensenanza por Correspondencia para Los Padres de Ninos Sordo-Ciegos de Edad Preescolar (John Tracy Clinic Correspondence Learning Program for Parents of Preschool Deaf-Blind Children).

    ERIC Educational Resources Information Center

    Thielman, Virginia B.; And Others

    Written in Spanish, the document contains a correspondence learning program for parents of deaf blind preschoolers. An introductory section gives preliminary instructions, an introduction to sign language, and a list of resources for deaf blind children. Twelve lessons follow with information on: the parent's role in teaching the child, visual…

  1. A Kinect based sign language recognition system using spatio-temporal features

    NASA Astrophysics Data System (ADS)

    Memiş, Abbas; Albayrak, Songül

    2013-12-01

    This paper presents a sign language recognition system that uses spatio-temporal features on RGB video images and depth maps for dynamic gestures of Turkish Sign Language. The proposed system uses motion differences and an accumulation approach for temporal gesture analysis. The motion accumulation method, an effective method for temporal domain analysis of gestures, produces an accumulated motion image by combining the differences of successive video frames. A 2D Discrete Cosine Transform (DCT) is then applied to the accumulated motion images, transforming the temporal domain features into the spatial domain. These processes are performed on RGB images and depth maps separately. The DCT coefficients that represent sign gestures are picked up via zigzag scanning and feature vectors are generated. To recognize sign gestures, a K-Nearest Neighbor classifier with Manhattan distance is applied. Performance of the proposed sign language recognition system is evaluated on a sign database containing 1002 isolated dynamic signs belonging to 111 words of Turkish Sign Language (TSL) in three different categories. The proposed system achieves promising success rates.
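    Read as pseudocode, the processing chain in this abstract (frame differencing and accumulation, 2D DCT, zigzag coefficient selection, then K-Nearest Neighbor classification with Manhattan distance) might be sketched like this. The toy 8x8 gestures, the number of coefficients kept, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def accumulate_motion(frames):
    # Accumulated motion image: sum of absolute differences of successive frames.
    frames = np.asarray(frames, dtype=float)
    return np.abs(np.diff(frames, axis=0)).sum(axis=0)

def dct2(block):
    # Naive orthonormal 2D DCT-II for small square blocks (illustrative;
    # a real system would use an optimised FFT-based transform).
    n = block.shape[0]
    idx = np.arange(n)
    basis = np.cos(np.pi * (2 * idx[None, :] + 1) * idx[:, None] / (2 * n))
    scale = np.full(n, np.sqrt(2.0 / n))
    scale[0] = np.sqrt(1.0 / n)
    D = scale[:, None] * basis
    return D @ block @ D.T

def zigzag(mat, n_coeffs):
    # Keep the lowest-frequency DCT coefficients in zigzag (anti-diagonal) order.
    order = sorted(((i, j) for i in range(mat.shape[0]) for j in range(mat.shape[1])),
                   key=lambda ij: (ij[0] + ij[1],
                                   ij[1] if (ij[0] + ij[1]) % 2 else ij[0]))
    return np.array([mat[i, j] for i, j in order[:n_coeffs]])

def knn_manhattan(query, train_feats, train_labels, k=1):
    # K-Nearest Neighbor with Manhattan (L1) distance; majority vote among
    # the k nearest (k=1 reduces to the single nearest training sign).
    dists = np.abs(train_feats - query).sum(axis=1)
    nearest = train_labels[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

def gesture(vertical):
    # Toy dynamic sign: a small bright block sweeping down (or right) over 4 frames.
    frames = np.zeros((4, 8, 8))
    for t in range(4):
        if vertical:
            frames[t, t:t + 2, 3:5] = 1.0
        else:
            frames[t, 3:5, t:t + 2] = 1.0
    return frames

feat = lambda fr: zigzag(dct2(accumulate_motion(fr)), 10)
train_feats = np.stack([feat(gesture(False)), feat(gesture(True))])
train_labels = np.array(["rightward", "downward"])
noisy = gesture(True) + np.random.default_rng(1).normal(0.0, 0.02, (4, 8, 8))
print(knn_manhattan(feat(noisy), train_feats, train_labels))
```

    The accumulation step collapses the whole gesture into one image, which is why a purely spatial transform such as the DCT can then capture the temporal pattern; in the paper this is done separately for the RGB and depth streams.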

  2. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions.

    PubMed

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers' comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media.

  3. Operationalization of Sign Language Phonological Similarity and Its Effects on Lexical Access

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Stone, Adam; Newman, Sharlene D.

    2017-01-01

    Cognitive mechanisms for sign language lexical access are fairly unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data in order to…

  4. Dictionaries of African Sign Languages: An Overview

    ERIC Educational Resources Information Center

    Schmaling, Constanze H.

    2012-01-01

    This article gives an overview of dictionaries of African sign languages that have been published to date most of which have not been widely distributed. After an introduction into the field of sign language lexicography and a discussion of some of the obstacles that authors of sign language dictionaries face in general, I will show problems…

  5. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    ERIC Educational Resources Information Center

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, nonarbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than…

  6. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    PubMed

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.

  7. Hand and mouth: Cortical correlates of lexical processing in British Sign Language and speechreading English

    PubMed Central

    Capek, Cheryl M.; Waters, Dafydd; Woll, Bencie; MacSweeney, Mairéad; Brammer, Michael J.; McGuire, Philip K.; David, Anthony S.; Campbell, Ruth

    2012-01-01

    Spoken languages use one set of articulators – the vocal tract, whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used fMRI to compare the processing of speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common perisylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed reflecting the language form. Speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the parieto-occipito-temporal junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different sorts of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, while signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are not only differentially sensitive to perception of the distinctive articulators for speech and for sign, but also show sensitivity to the different articulators within the (signed) language. PMID:18284353

  8. Acquisition of graphic communication by a young girl without comprehension of spoken language.

    PubMed

    von Tetzchner, S; Øvreeide, K D; Jørgensen, K K; Ormhaug, B M; Oxholm, B; Warme, R

    This study describes a graphic-mode communication intervention involving a girl with intellectual impairment and autism who did not develop comprehension of spoken language. The aim was to teach graphic-mode vocabulary that reflected her interests, preferences, and the activities and routines of her daily life, by providing sufficient cues to the meanings of the graphic representations so that she would not need to comprehend spoken instructions. An individual case study design was selected, including the use of written records, participant observation, and registration of the girl's graphic vocabulary and use of graphic signs and other communicative expressions. While the girl's comprehension (and hence use) of spoken language remained lacking over a 3-year period, she acquired an active use of over 80 photographs and pictograms. The girl was able to cope better with the cognitive and attentional requirements of graphic communication than those of spoken language and manual signs, which had been the focus of earlier interventions. Her achievements demonstrate that it is possible for communication-impaired children to learn to use an augmentative and alternative communication system without speech comprehension, provided the intervention utilizes functional strategies and non-language cues to the meaning of the graphic representations that are taught.

  9. The Influence of Deaf People's Dual Category Status on Sign Language Planning: The British Sign Language (Scotland) Act (2015)

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    Through the British Sign Language (Scotland) Act, British Sign Language (BSL) was given legal status in Scotland. The main motives for the Act were a desire to put BSL on a similar footing with Gaelic and the fact that in Scotland, BSL signers are the only group whose first language is not English who must rely on disability discrimination…

  10. Type of Iconicity Matters in the Vocabulary Development of Signing Children

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Sümer, Beyza; Özyürek, Asli

    2017-01-01

    Recent research on signed as well as spoken language shows that the iconic features of the target language might play a role in language development. Here, we ask further whether different types of iconic depictions modulate children's preferences for certain types of sign-referent links during vocabulary development in sign language. Results from…

  11. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  12. Approaching Sign Language Test Construction: Adaptation of the German Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired…

  13. Similarities & Differences in Two Brazilian Sign Languages.

    ERIC Educational Resources Information Center

    Ferreira-Brito, Lucinda

    1984-01-01

    Comparison of the sign language used by Urubu-Kaapor Indians in the Amazonian jungle (UKSL) and the sign language used by deaf people in Sao Paulo (SPSL). In the former setting, deaf people are more integrated into and accepted by their community than in Sao Paulo, because most hearing individuals are able and willing to use sign language to communicate with…

  14. Motives and Outcomes of New Zealand Sign Language Legislation: A Comparative Study between New Zealand and Finland

    ERIC Educational Resources Information Center

    Reffell, Hayley; McKee, Rachel Locker

    2009-01-01

    The medicalized interpretation of deafness has until recently seen the rights and protections of sign language users embedded in disability law. Yet the rights and protections crucial to sign language users centre predominantly on matters of language access, maintenance and identity. Legislators, motivated by pressure from sign language…

  15. Accessibility of spoken, written, and sign language in Landau-Kleffner syndrome: a linguistic and functional MRI study.

    PubMed

    Sieratzki, J S; Calvert, G A; Brammer, M; David, A; Woll, B

    2001-06-01

    Landau-Kleffner syndrome (LKS) is an acquired aphasia which begins in childhood and is thought to arise from an epileptic disorder within the auditory speech cortex. Although the epilepsy usually subsides at puberty, a severe communication impairment often persists. Here we report on a detailed study of a 26-year-old, left-handed male, with onset of LKS at age 5 years, who is aphasic for English but who learned British Sign Language (BSL) at age 13. We have investigated his skills in different language modalities, recorded EEGs during wakefulness, sleep, and under conditions of auditory stimulation, measured brain stem auditory-evoked potentials (BAEP), and performed functional MRI (fMRI) during a range of linguistic tasks. Our investigation demonstrated severe restrictions in comprehension and production of spoken English as well as lip-reading, while reading was comparatively less impaired. BSL was by far the most efficient mode of communication. All EEG recordings were normal, while BAEP showed minor abnormalities. fMRI revealed: 1) powerful and extensive bilateral (R > L) activation of auditory cortices in response to heard speech, much stronger than when listening to music; 2) very little response to silent lip-reading; 3) strong activation in the temporo-parieto-occipital association cortex, exclusively in the right hemisphere (RH), when viewing BSL signs. Analysis of these findings provides novel insights into the disturbance of the auditory speech cortex which underlies LKS and its diagnostic evaluation by fMRI, and underpins a strategy of restoring communication abilities in LKS through a natural sign language of the deaf (with video).

  16. Sociolinguistic Typology and Sign Languages.

    PubMed

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological 'complexity' and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological 'complexification'), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored.

  17. Visual Information Literacy: Reading a Documentary Photograph

    ERIC Educational Resources Information Center

    Abilock, Debbie

    2008-01-01

    Like a printed text, an architectural blueprint, a mathematical equation, or a musical score, a visual image is its own language. Visual literacy has three components: (1) learning; (2) thinking; and (3) communicating. A "literate" person is able to decipher the basic code and syntax, interpret the signs and symbols, correctly apply terms from an…

  18. Emerging Evidence for Instructional Practice: Repeated Viewings of Sign Language Models

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.; Huston, Sandra G.

    2014-01-01

    Current initiatives in education, such as No Child Left Behind and the National Common Core Standards movement, call for the use of evidence-based practices, or those instructional practices that are supported by documentation of their effectiveness related to student learning outcomes, including students with special needs. While hearing loss is…

  19. A Compelling Desire for Deafness

    ERIC Educational Resources Information Center

    Veale, David

    2006-01-01

    A case is described of a patient who has a compelling and persistent desire to become deaf. She often kept cotton wool moistened with oil in her ears and was learning sign language. Living without sound appeared to be a severe form of avoidance behavior from hyperacusis and misophonia. She had a borderline personality disorder that was associated…

  20. Sign Language for K-8 Mathematics by 3D Interactive Animation

    ERIC Educational Resources Information Center

    Adamo-Villani, Nicoletta; Doublestein, John; Martin, Zachary

    2005-01-01

    We present a new highly interactive computer animation tool to increase the mathematical skills of deaf children. We aim at increasing the effectiveness of (hearing) parents in teaching arithmetic to their deaf children, and the opportunity of deaf children to learn arithmetic via interactive media. Using state-of-the-art computer animation…

  1. Representational Momentum for the Human Body: Awkwardness Matters, Experience Does Not

    ERIC Educational Resources Information Center

    Wilson, Margaret; Lancaster, Jessy; Emmorey, Karen

    2010-01-01

    Perception of the human body appears to involve predictive simulations that project forward to track unfolding body-motion events. Here we use representational momentum (RM) to investigate whether implicit knowledge of a learned arbitrary system of body movement such as sign language influences this prediction process, and how this compares to…

  2. Eye Gaze in Creative Sign Language

    ERIC Educational Resources Information Center

    Kaneko, Michiko; Mesch, Johanna

    2013-01-01

    This article discusses the role of eye gaze in creative sign language. Because eye gaze conveys various types of linguistic and poetic information, it is an intrinsic part of sign language linguistics in general and of creative signing in particular. We discuss various functions of eye gaze in poetic signing and propose a classification of gaze…

  3. Sign Language with Babies: What Difference Does It Make?

    ERIC Educational Resources Information Center

    Barnes, Susan Kubic

    2010-01-01

    Teaching sign language--to deaf or other children with special needs or to hearing children with hard-of-hearing family members--is not new. Teaching sign language to typically developing children has become increasingly popular since the publication of "Baby Signs"[R] (Goodwyn & Acredolo, 1996), now in its third edition. Attention to signing with…

  4. Sign language aphasia due to left occipital lesion in a deaf signer.

    PubMed

    Saito, Kozue; Otsuki, Mika; Ueno, Satoshi

    2007-10-02

    Localization of sign language production and comprehension in deaf people has been described as similar to that of spoken language aphasia. However, sign language employs a visuospatial modality through visual information. We present the first report of a deaf signer who showed substantial sign language aphasia with severe impairment in word production due to a left occipital lesion. This case may indicate the possibility of other localizations of plasticity.

  5. The role of syllables in sign language production.

    PubMed

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Signed Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production.

  6. Sociocultural and Academic Considerations for School-Age d/Deaf and Hard of Hearing Multilingual Learners: A Case Study of a Deaf Latina.

    PubMed

    Baker, Sharon; Scott, Jessica

    2016-01-01

    For decades, research has focused on American Sign Language/English bilingual education for d/Deaf and hard of hearing students whose families used English or ASL. However, a growing population of d/Dhh children come from households where languages other than English (e.g., Spanish, Chinese, Vietnamese) are used. In a longitudinal case study, the authors document the K-12 educational pathway of a deaf Latina student. Anecdotal records, semistructured interviews, assessment data, and document reviews of the participant's school and clinical records are used to develop the case study. The findings provide the basis for recommendations for future research and for critical factors to consider to improve the education of d/Dhh Multilingual Learners (DMLs). These include ensuring appropriate educational placements, addressing early communication and language needs, determining effective instructional techniques and assessments, strengthening the L1 to support L2 learning, and providing students with opportunities to learn their heritage language.

  7. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing.

  8. Sensitivity to visual prosodic cues in signers and nonsigners.

    PubMed

    Brentari, Diane; González, Carolina; Seidl, Amanda; Wilbur, Ronnie

    2011-03-01

    Three studies are presented in this paper that address how nonsigners perceive the visual prosodic cues in a sign language. In Study 1, adult American nonsigners and users of American Sign Language (ASL) were compared on their sensitivity to the visual cues in ASL Intonational Phrases. In Study 2, hearing, nonsigning American infants were tested using the same stimuli used in Study 1 to see whether maturity, exposure to gesture, or exposure to sign language is necessary to demonstrate this type of sensitivity. Study 3 addresses nonsigners' and signers' strategies for segmenting Prosodic Words in a sign language. Adult participants from six language groups (3 spoken languages and 3 sign languages) were tested. The results of these three studies indicate that nonsigners have a high degree of sensitivity to sign language prosodic cues at the Intonational Phrase level and the Prosodic Word level; these are attributed to modality or 'channel' effects of the visual signal. There are also some differences between signers' and nonsigners' sensitivity; these differences are attributed to language experience or language-particular constraints. This work is useful in understanding the gestural competence of nonsigners and the ways in which this type of competence may contribute to the grammaticalization of these properties in a sign language.

  9. Event representations constrain the structure of language: Sign language as a window into universally accessible linguistic biases.

    PubMed

    Strickland, Brent; Geraci, Carlo; Chemla, Emmanuel; Schlenker, Philippe; Kelepir, Meltem; Pfau, Roland

    2015-05-12

    According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., "decide," "sell," "die") encode a logical endpoint, whereas atelic verbs (e.g., "think," "negotiate," "run") do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1-5, nonsigning English speakers accurately distinguished between telic (e.g., "decide") and atelic (e.g., "think") signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7-10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible "mapping biases" between telicity and visual form.

  10. Medical Signbank as a Model for Sign Language Planning? A Review of Community Engagement

    ERIC Educational Resources Information Center

    Napier, Jemina; Major, George; Ferrara, Lindsay; Johnston, Trevor

    2015-01-01

    This paper reviews a sign language planning project conducted in Australia with deaf Auslan users. The Medical Signbank project utilised a cooperative language planning process to engage with the Deaf community and sign language interpreters to develop an online interactive resource of health-related signs, in order to address a gap in the health…

  11. Is Teaching Sign Language in Early Childhood Classrooms Feasible for Busy Teachers and Beneficial for Children?

    ERIC Educational Resources Information Center

    Brereton, Amy Elizabeth

    2010-01-01

    Infants' hands are ready to construct words using sign language before their mouths are ready to speak. These research findings may explain the popularity of parents and caregivers teaching and using sign language with infants and toddlers, along with speech. The advantages of using sign language with young children go beyond the infant and…

  12. Use of Information and Communication Technologies in Sign Language Test Development: Results of an International Survey

    ERIC Educational Resources Information Center

    Haug, Tobias

    2015-01-01

    Sign language test development is a relatively new field within sign linguistics, motivated by the practical need for assessment instruments to evaluate language development in different groups of learners (L1, L2). Due to the lack of research on the structure and acquisition of many sign languages, developing an assessment instrument poses…

  13. Adaptation of a Vocabulary Test from British Sign Language to American Sign Language

    ERIC Educational Resources Information Center

    Mann, Wolfgang; Roy, Penny; Morgan, Gary

    2016-01-01

    This study describes the adaptation process of a vocabulary knowledge test for British Sign Language (BSL) into American Sign Language (ASL) and presents results from the first round of pilot testing with 20 deaf native ASL signers. The web-based test assesses the strength of deaf children's vocabulary knowledge by means of different mappings of…

  14. The Neural Correlates of Highly Iconic Structures and Topographic Discourse in French Sign Language as Observed in Six Hearing Native Signers

    ERIC Educational Resources Information Center

    Courtin, C.; Herve, P. -Y.; Petit, L.; Zago, L.; Vigneau, M.; Beaucousin, V.; Jobard, G.; Mazoyer, B.; Mellet, E.; Tzourio-Mazoyer, N.

    2010-01-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language specific manner via the use of signing-space and…

  15. From the World's Trouble Spots They Arrive in Our Classrooms: Working with Deaf Refugees and Immigrants

    ERIC Educational Resources Information Center

    Moers, Pamela Wright

    2017-01-01

    Pamela Wright Moers has worked with American Sign Language (ASL) and English language instruction for over 25 years, and both her work and her studies have focused on the various uses of language. Her research has been on language endangerment, diversity in sign language, third-world sign languages, and the phonological and semantic structures…

  16. How do typically developing deaf children and deaf children with autism spectrum disorder use the face when comprehending emotional facial expressions in British sign language?

    PubMed

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-10-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.

  17. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions

    PubMed Central

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers’ comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media. PMID:26010899

  18. Pinky Extension as a Phonestheme in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Healy, Christina

    2011-01-01

    Mongolian Sign Language (MSL) is a visual-gestural language that developed from multiple languages interacting as a result of both geographic proximity and political relations and of the natural development of a communication system by deaf community members. Similar to the phonological systems of other signed languages, MSL combines handshapes,…

  19. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.

  20. Sociolinguistic Typology and Sign Languages

    PubMed Central

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological ‘complexity’ and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological ‘complexification’), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored. PMID:29515506

  1. On the System of Person-Denoting Signs in Estonian Sign Language: Estonian Name Signs

    ERIC Educational Resources Information Center

    Paales, Liina

    2010-01-01

    This article discusses Estonian personal name signs. According to the study, there are four personal name sign categories in Estonian Sign Language: (1) arbitrary name signs; (2) descriptive name signs; (3) initialized-descriptive name signs; and (4) loan/borrowed name signs. Descriptive and borrowed personal name signs are the most frequently represented among…

  2. Approaching sign language test construction: adaptation of the German sign language receptive skills test.

    PubMed

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired in preschool- and school-aged children (4-8 years old) is urgently needed. Using the British Sign Language Receptive Skills Test, that has been standardized and has sound psychometric properties, as a template for adaptation thus provides a starting point for tests of a sign language that is less documented, such as DGS. This article makes a novel contribution to the field by examining linguistic, cultural, and methodological issues in the process of adapting a test from the source language to the target language. The adapted DGS test has sound psychometric properties and provides the basis for revision prior to standardization. © The Author 2011. Published by Oxford University Press. All rights reserved.

  3. Semantic Fluency in Deaf Children Who Use Spoken and Signed Language in Comparison with Hearing Peers

    ERIC Educational Resources Information Center

    Marshall, C. R.; Jones, A.; Fastelli, A.; Atkinson, J.; Botting, N.; Morgan, G.

    2018-01-01

    Background: Deafness has an adverse impact on children's ability to acquire spoken languages. Signed languages offer a more accessible input for deaf children, but because the vast majority are born to hearing parents who do not sign, their early exposure to sign language is limited. Deaf children as a whole are therefore at high risk of language…

  4. Methodological and Theoretical Issues in the Adaptation of Sign Language Tests: An Example from the Adaptation of a Test to German Sign Language

    ERIC Educational Resources Information Center

    Haug, Tobias

    2012-01-01

    Despite the current need for reliable and valid test instruments in different countries in order to monitor the sign language acquisition of deaf children, very few tests are commercially available that offer strong evidence for their psychometric properties. This mirrors the current state of affairs for many sign languages, where very little…

  5. Examination of Sign Language Education According to the Opinions of Members from a Basic Sign Language Certification Program

    ERIC Educational Resources Information Center

    Akmese, Pelin Pistav

    2016-01-01

    Being hearing impaired limits one's ability to communicate in that it affects all areas of development, particularly speech. One of the methods the hearing impaired use to communicate is sign language. This study, a descriptive study, intends to examine the opinions of individuals who had enrolled in a sign language certification program by using…

  6. A biometric authentication model using hand gesture images

    PubMed Central

    2013-01-01

    A novel hand biometric authentication method based on measurements of the user's stationary hand gestures from hand sign language is proposed. The hand gesture measurements can be acquired sequentially by a low-cost video camera. An additional level of contextual information can also be associated with these hand signs for use in biometric authentication. As an analogue, instead of typing the password 'iloveu' as text, which is relatively vulnerable over a communication network, a signer can encode a biometric password using the sequence of hand signs 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and passed to a classification model that verifies whether the signer is who he claims to be, based on his hand shape and the postures used in producing those signs. It is believed that everybody exhibits slight but unique behavioral characteristics in sign language, as well as distinctive hand shape compositions. Simple and efficient image processing algorithms are used in hand sign recognition, including intensity profiling, color histograms, and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulations investigating the efficacy of this novel biometric authentication model show up to 93.75% recognition accuracy. PMID:24172288
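    The feature-plus-classifier pipeline this abstract describes can be illustrated with a minimal sketch, assuming nothing about the authors' actual implementation: the `color_histogram`, `enroll`, and `verify` functions, the Euclidean distance threshold, and the synthetic images are hypothetical stand-ins for the paper's intensity profiling, color histogram, and machine-learning components.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Quantize each RGB channel into `bins` buckets and concatenate the
    three normalized per-channel histograms into one feature vector."""
    feats = []
    for ch in range(3):
        hist, _ = np.histogram(image[..., ch], bins=bins, range=(0, 256))
        feats.append(hist / hist.sum())
    return np.concatenate(feats)

def enroll(sign_images, bins=8):
    """Average the histogram features of a user's enrollment images into
    a single template for one hand sign of the password sequence."""
    return np.mean([color_histogram(img, bins) for img in sign_images], axis=0)

def verify(templates, probe_images, threshold=0.25):
    """Accept the claimed identity only if every probe sign lies within
    `threshold` (Euclidean distance) of the corresponding template."""
    return all(
        np.linalg.norm(tmpl - color_histogram(img)) <= threshold
        for tmpl, img in zip(templates, probe_images)
    )
```

    In a real system each element of `templates` would correspond to one sign of the enrolled password (e.g. 'i', 'l', 'o', 'v', 'e', 'u'), so an impostor would have to match both the sign sequence and the signer's hand appearance.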

  7. Who Is Qualified to Teach American Sign Language?

    ERIC Educational Resources Information Center

    Kanda, Jan; Fleischer, Larry

    1988-01-01

    Teachers of American Sign Language (ASL) can no longer qualify just by being able to sign well or by being deaf. ASL teachers must respect the language and its history, feel comfortable interacting with the deaf community, have completed formal study of language and pedagogy, be familiar with second-language teaching, and engage in personal and…

  8. Constraints on Negative Prefixation in Polish Sign Language.

    PubMed

    Tomaszewski, Piotr

    2015-01-01

    The aim of this article is to describe a negative prefix, NEG-, in Polish Sign Language (PJM) which appears to be indigenous to the language. This is of interest given the relative rarity of prefixes in sign languages. Prefixed PJM signs were analyzed on the basis of both a corpus of texts signed by 15 deaf PJM users who are either native or near-native signers, and material including a specified range of prefixed signs as demonstrated by native signers in dictionary form (i.e. signs produced in isolation, not as part of phrases or sentences). In order to define the morphological rules behind prefixation on both the phonological and morphological levels, native PJM users were consulted for their expertise. The research results can enrich models for describing processes of grammaticalization in the context of the visual-gestural modality that forms the basis for sign language structure.

  9. Sociolinguistic Variation and Change in British Sign Language Number Signs: Evidence of Leveling?

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Fenlon, Jordan; Rentelis, Ramas

    2015-01-01

    This article presents findings from the first major study to investigate lexical variation and change in British Sign Language (BSL) number signs. As part of the BSL Corpus Project, number sign variants were elicited from 249 deaf signers from eight sites throughout the UK. Age, school location, and language background were found to be significant…

  10. The role of syllables in sign language production

    PubMed Central

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Signed Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production. PMID:25431562

  11. The Phonetics of Head and Body Movement in the Realization of American Sign Language Signs.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2016-01-01

    Because the primary articulators for sign languages are the hands, sign phonology and phonetics have focused mainly on them and treated other articulators as passive targets. However, there is abundant research on the role of nonmanual articulators in sign language grammar and prosody. The current study examines how hand and head/body movements are coordinated to realize phonetic targets. Kinematic data were collected from 5 deaf American Sign Language (ASL) signers to allow the analysis of movements of the hands, head and body during signing. In particular, we examine how the chin, forehead and torso move during the production of ASL signs at those three phonological locations. Our findings suggest that for signs with a lexical movement toward the head, the forehead and chin move to facilitate convergence with the hand. By comparison, the torso does not move to facilitate convergence with the hand for signs located at the torso. These results imply that the nonmanual articulators serve a phonetic as well as a grammatical or prosodic role in sign languages. Future models of sign phonetics and phonology should take into consideration the movements of the nonmanual articulators in the realization of signs. © 2016 S. Karger AG, Basel.

  12. The Use of Sign Language Pronouns by Native-Signing Children with Autism.

    PubMed

    Shield, Aaron; Meier, Richard P; Tager-Flusberg, Helen

    2015-07-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are indexical points to self and other. Despite this transparency, we find evidence from an elicitation task and parental report that signing children with ASD avoid sign pronouns in favor of names. An analysis of spontaneous usage showed that all children demonstrated the ability to point, but only children with better-developed sign language produced pronouns. Differences in language abilities and self-representation may explain these phenomena in sign and speech.

  13. A human mirror neuron system for language: Perspectives from signed languages of the deaf.

    PubMed

    Knapp, Heather Patterson; Corina, David P

    2010-01-01

    Language is proposed to have developed atop the human analog of the macaque mirror neuron system for action perception and production [Arbib M.A. 2005. From monkey-like action recognition to human language: An evolutionary framework for neurolinguistics (with commentaries and author's response). Behavioral and Brain Sciences, 28, 105-167; Arbib M.A. (2008). From grasp to language: Embodied concepts and the challenge of abstraction. Journal de Physiologie Paris 102, 4-20]. Signed languages of the deaf are fully-expressive, natural human languages that are perceived visually and produced manually. We suggest that if a unitary mirror neuron system mediates the observation and production of both language and non-linguistic action, three predictions can be made: (1) damage to the human mirror neuron system should non-selectively disrupt both sign language and non-linguistic action processing; (2) within the domain of sign language, a given mirror neuron locus should mediate both perception and production; and (3) the action-based tuning curves of individual mirror neurons should support the highly circumscribed set of motions that form the "vocabulary of action" for signed languages. In this review we evaluate data from the sign language and mirror neuron literatures and find that these predictions are only partially upheld. © 2009 Elsevier Inc. All rights reserved.

  14. Where "Sign Language Studies" Has Led Us in Forty Years: Opening High School and University Education for Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation

    ERIC Educational Resources Information Center

    Woodward, James; Hoa, Nguyen Thi

    2012-01-01

    This paper discusses how the Nippon Foundation-funded project "Opening University Education to Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation," also known as the Dong Nai Deaf Education Project, has been implemented through sign language studies from 2000 through 2012. This project has provided deaf…

  15. Standardization of Sign Languages

    ERIC Educational Resources Information Center

    Adam, Robert

    2015-01-01

    Over the years attempts have been made to standardize sign languages. This form of language planning has been tackled by a variety of agents, most notably teachers of Deaf students, social workers, government agencies, and occasionally groups of Deaf people themselves. Their efforts have most often involved the development of sign language books…

  16. Validity of the American Sign Language Discrimination Test

    ERIC Educational Resources Information Center

    Bochner, Joseph H.; Samar, Vincent J.; Hauser, Peter C.; Garrison, Wayne M.; Searls, J. Matt; Sanders, Cynthia A.

    2016-01-01

    American Sign Language (ASL) is one of the most commonly taught languages in North America. Yet, few assessment instruments for ASL proficiency have been developed, none of which have adequately demonstrated validity. We propose that the American Sign Language Discrimination Test (ASL-DT), a recently developed measure of learners' ability to…

  17. Writing Signed Languages: What for? What Form?

    ERIC Educational Resources Information Center

    Grushkin, Donald A.

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed…

  18. Audience Effects in American Sign Language Interpretation

    ERIC Educational Resources Information Center

    Weisenberg, Julia

    2009-01-01

    There is a system of English mouthing during interpretation that appears to be the result of language contact between spoken language and signed language. English mouthing is a voiceless visual representation of words on a signer's lips produced concurrently with manual signs. It is a type of borrowing prevalent among English-dominant…

  19. A Field Guide for Sign Language Research.

    ERIC Educational Resources Information Center

    Stokoe, William; Kuschel, Rolf

    Field researchers of sign language are the target of this methodological guide. The prospective researcher is briefed on the rationale of sign language study as language study and as distinct from the study of kinesics. Subjects covered include problems of translating, use of interpreters, and ethics. Instruments for obtaining social and language…

  20. Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture.

    PubMed

    Newman, Aaron J; Supalla, Ted; Fernandez, Nina; Newport, Elissa L; Bavelier, Daphne

    2015-09-15

    Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system-gesture-further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages-supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network-demonstrating an influence of experience on the perception of nonlinguistic stimuli.

  1. Group Writing: How Writing Teaches Reading

    ERIC Educational Resources Information Center

    Campbell-Rush, Peggy

    2006-01-01

    What do Slinky toys, sign language, clipboards, golf pencils, and a house icon have in common? They all are a part of the author's writing and reading program, which teaches children how to write, and then read what they have written. This book includes: effective strategies that address multiple learning styles; a ready-to-use form for ongoing…

  2. Development of Japanese Children's Narrative Skills: Linguistic Devices and Strategies To Encode Their Perspective and Emotion.

    ERIC Educational Resources Information Center

    Minami, Masahiko

    Studies on child language acquisition suggest that Japanese children begin to use a variety of linguistic signs very early. However, even if young Japanese children learned the social pragmatic functions and interactional dimensions of such linguistic means and communicative devices, they might not have acquired the subtleties of those devices…

  3. Learning from the Past: What ESSA Has the Chance to Get Right

    ERIC Educational Resources Information Center

    Dennis, Danielle V.

    2017-01-01

    Signed into law by President Barack Obama in December 2015, the Every Student Succeeds Act (ESSA) replaces No Child Left Behind (NCLB), the 2001 reauthorization of the Elementary and Secondary Education Act. NCLB and its Reading First mandate brought punitive accountability models and scripted core curricula into schools. Based on the language of…

  4. Vocabulary Development: How Deaf Individuals Can Learn to Use the Information Given.

    ERIC Educational Resources Information Center

    Hirsh-Pasek, Kathy; Freyd, Pamela

    To determine if people analyze words in online reading, an experiment was conducted with 12 congenitally deaf, second generation sign language users with a reading level of 6.64 on a standardized reading achievement test. The hearing controls included seventh and eighth grade students who were matched for reading level. Both groups were split in…

  5. Helping the Child with a Cleft Palate in Your Classroom.

    ERIC Educational Resources Information Center

    Moran, Michael J.; Pentz, Arthur L.

    1995-01-01

    Guidelines for teachers of a student with a cleft palate include understand the physical problem; know what kind of speech problem to expect; be alert to the possibility of language-based learning difficulties; watch for signs of hearing loss; be alert to socialization problems; help the student make up work; and avoid self-fulfilling prophecies.…

  6. Legal Pathways to the Recognition of Sign Languages: A Comparison of the Catalan and Spanish Sign Language Acts

    ERIC Educational Resources Information Center

    Quer, Josep

    2012-01-01

    Despite being minority languages like many others, sign languages have traditionally remained absent from the agendas of policy makers and language planning and policies. In the past two decades, though, this situation has started to change at different paces and to different degrees in several countries. In this article, the author describes the…

  7. Numeral-Incorporating Roots in Numeral Systems: A Comparative Analysis of Two Sign Languages

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Massone, Maria Ignacia; Fernandez-Viader, Maria del Pilar; Makotrinsky, Alejandro; Pulgarin, Francisca

    2010-01-01

    Numeral-incorporating roots in the numeral systems of Argentine Sign Language (LSA) and Catalan Sign Language (LSC), as well as the main features of the number systems of both languages, are described and compared. Informants discussed the use of numerals and roots in both languages (in most cases in natural contexts). Ten informants took part in…

  8. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2017-01-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however there have been relatively fewer studies that generalize these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel…

  9. Lexical access in sign language: a computational model.

    PubMed

    Caselli, Naomi K; Cohen-Goldberg, Ariel M

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.
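    The spreading-activation principle behind such models can be made concrete with a toy sketch; this is not Chen and Mirman's (2012) model or the architecture of the paper, and the feature inventory, connection weights, decay constant, and sign names below are invented purely for illustration.

```python
import numpy as np

# Hypothetical mini-lexicon: rows are sub-lexical feature nodes, columns
# are lexical sign nodes; a 1 means the sign contains that feature.
FEATURES = ["handshape-B", "handshape-5", "loc-chin",
            "loc-torso", "mov-arc", "mov-straight"]
SIGNS = ["SIGN-A", "SIGN-B", "SIGN-C"]
W = np.array([
    [1, 1, 0],  # handshape-B is shared by SIGN-A and SIGN-B (neighbors)
    [0, 0, 1],  # handshape-5
    [1, 0, 0],  # loc-chin
    [0, 1, 1],  # loc-torso
    [1, 0, 1],  # mov-arc
    [0, 1, 0],  # mov-straight
], dtype=float)

def recognize(perceived, steps=5, decay=0.6):
    """Spread activation from the perceived sub-lexical features to the
    lexical nodes: on each step, lexical activation decays and then
    accumulates fresh bottom-up input through the weight matrix."""
    inp = np.array([1.0 if f in perceived else 0.0 for f in FEATURES])
    act = np.zeros(len(SIGNS))
    for _ in range(steps):
        act = decay * act + inp @ W
    return dict(zip(SIGNS, act))
```

    Presenting the features of SIGN-A most strongly activates SIGN-A, while SIGN-B, which shares a handshape, receives partial activation; in richer versions of such a network this partial co-activation of neighbors is what gives rise to neighborhood density effects.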

  10. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed

    Hofer, Theresia

    2017-05-24

This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork in 2007 and 2014 with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model to assess signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between "severely" and "definitely" endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed, and certain additions, not considered within the existing assessment model, are suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use "pure Tibetan language".

  11. American Sign Language Syntax and Analogical Reasoning Skills Are Influenced by Early Acquisition and Age of Entry to Signing Schools for the Deaf

    PubMed Central

    Henner, Jon; Caldwell-Harris, Catherine L.; Novogrodsky, Rama; Hoffmeister, Robert

    2016-01-01

Failing to acquire language in early childhood because of language deprivation is a rare and exceptional event, except in one population. Deaf children who grow up without access to indirect language through listening, speech-reading, or sign language experience language deprivation. Studies of Deaf adults have revealed that late acquisition of sign language is associated with lasting deficits. However, much remains unknown about language deprivation in Deaf children, allowing myths and misunderstandings regarding sign language to flourish. To fill this gap, we examined signing ability in a large naturalistic sample of Deaf children attending schools for the Deaf where American Sign Language (ASL) is used by peers and teachers. Ability in ASL was measured using a syntactic judgment test and language-based analogical reasoning test, which are two sub-tests of the ASL Assessment Inventory. The influence of two age-related variables was examined: whether or not ASL was acquired from birth in the home from one or more Deaf parents, and the age of entry to the school for the Deaf. Note that for non-native signers, this latter variable is often the age of first systematic exposure to ASL. Both of these types of age-dependent language experiences influenced subsequent signing ability. Scores on the two tasks declined with increasing age of school entry. The influence of age of starting school was not linear. Test scores were generally lower for Deaf children who entered the school of assessment after the age of 12. The positive influence of signing from birth was found for students at all ages tested (7;6–18;5 years old) and for children of all age-of-entry groupings. Our results reflect a continuum of outcomes which show that experience with language is a continuous variable that is sensitive to maturational age. PMID:28082932

  13. Health care system accessibility. Experiences and perceptions of deaf people.

    PubMed

    Steinberg, Annie G; Barnett, Steven; Meador, Helen E; Wiggins, Erin A; Zazove, Philip

    2006-03-01

    People who are deaf use health care services differently than the general population; little research has been carried out to understand the reasons. To better understand the health care experiences of deaf people who communicate in American Sign Language. Qualitative analyses of focus group discussions in 3 U.S. cities. Ninety-one deaf adults who communicate primarily in American Sign Language. We collected information about health care communication and perceptions of clinicians' attitudes. We elicited stories of both positive and negative encounters, as well as recommendations for improving health care. Communication difficulties were ubiquitous. Fear, mistrust, and frustration were prominent in participants' descriptions of health care encounters. Positive experiences were characterized by the presence of medically experienced certified interpreters, health care practitioners with sign language skills, and practitioners who made an effort to improve communication. Many participants acknowledged limited knowledge of their legal rights and did not advocate for themselves. Some participants believed that health care practitioners should learn more about sociocultural aspects of deafness. Deaf people report difficulties using health care services. Physicians can facilitate change to improve this. Future research should explore the perspective of clinicians when working with deaf people, ways to improve communication, and the impact of programs that teach deaf people self-advocacy skills and about their legal rights.

  14. Don’t Assume Deaf Students are Visual Learners

    PubMed Central

    Marschark, Marc; Paivio, Allan; Spencer, Linda J.; Durkin, Andreana; Borgna, Georgianna; Convertino, Carol; Machmer, Elizabeth

    2016-01-01

In the education of deaf learners, from primary school to postsecondary settings, it frequently is suggested that deaf students are visual learners. That assumption appears to be based on the visual nature of signed languages—used by some but not all deaf individuals—and the fact that with greater hearing losses, deaf students will rely relatively more on vision than audition. However, the questions of whether individuals with hearing loss are more likely to be visual learners than verbal learners or more likely than hearing peers to be visual learners have not been empirically explored. Several recent studies, in fact, have indicated that hearing learners typically perform as well or better than deaf learners on a variety of visual-spatial tasks. The present study used two standardized instruments to examine learning styles among deaf college students who primarily rely on sign language or spoken language and their hearing peers. The visual-verbal dimension was of particular interest. Consistent with recent indirect findings, results indicated that deaf students are no more likely than hearing students to be visual learners and are no stronger in their visual skills and habits than their verbal skills and habits, nor are deaf students’ visual orientations associated with sign language skills. The results clearly have specific implications for the education of deaf learners. PMID:28344430

  15. With or without Semantic Mediation: Retrieval of Lexical Representations in Sign Production

    ERIC Educational Resources Information Center

    Navarrete, Eduardo; Caccaro, Arianna; Pavani, Francesco; Mahon, Bradford Z.; Peressotti, Francesca

    2015-01-01

    How are lexical representations retrieved during sign production? Similar to spoken languages, lexical representation in sign language must be accessed through semantics when naming pictures. However, it remains an open issue whether lexical representations in sign language can be accessed via routes that bypass semantics when retrieval is…

  16. The Mechanics of Fingerspelling: Analyzing Ethiopian Sign Language

    ERIC Educational Resources Information Center

    Duarte, Kyle

    2010-01-01

    Ethiopian Sign Language utilizes a fingerspelling system that represents Amharic orthography. Just as each character of the Amharic abugida encodes a consonant-vowel sound pair, each sign in the Ethiopian Sign Language fingerspelling system uses handshape to encode a base consonant, as well as a combination of timing, placement, and orientation to…

  17. Kinship in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Geer, Leah

    2011-01-01

    Information and research on Mongolian Sign Language is scant. To date, only one dictionary is available in the United States (Badnaa and Boll 1995), and even that dictionary presents only a subset of the signs employed in Mongolia. The present study describes the kinship system used in Mongolian Sign Language (MSL) based on data elicited from…

  18. Memory for Nonsemantic Attributes of American Sign Language Signs and English Words

    ERIC Educational Resources Information Center

    Siple, Patricia

    1977-01-01

Two recognition memory experiments were used to study the retention of language and modality of input. A bilingual list of American Sign Language signs and English words was presented to two deaf and two hearing groups, one of which was instructed to remember the mode of input. Findings are analyzed. (CHK)

  19. Identifying Overlapping Language Communities: The Case of Chiriquí and Panamanian Signed Languages

    ERIC Educational Resources Information Center

    Parks, Elizabeth S.

    2016-01-01

    In this paper, I use a holographic metaphor to explain the identification of overlapping sign language communities in Panama. By visualizing Panama's complex signing communities as emitting community "hotspots" through social drama on multiple stages, I employ ethnographic methods to explore overlapping contours of Panama's sign language…

  20. Awareness of Deaf Sign Language and Gang Signs.

    ERIC Educational Resources Information Center

    Smith, Cynthia; Morgan, Robert L.

    There have been increasing incidents of innocent people who use American Sign Language (ASL) or another form of sign language being victimized by gang violence due to misinterpretation of ASL hand formations. ASL is familiar to learners with a variety of disabilities, particularly those in the deaf community. The problem is that gang members have…

  1. Signed language working memory capacity of signed language interpreters and deaf signers.

    PubMed

    Wang, Jihong; Napier, Jemina

    2013-04-01

This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian Sign Language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an Auslan working memory (WM) span task. The results revealed that the hearing signers (i.e., the professional interpreters) significantly outperformed the deaf signers on the Auslan WM span task. However, the results showed no significant differences between the native signers and the nonnative signers in their Auslan working memory capacity. Furthermore, there was no significant interaction between hearing status and age of signed language acquisition. Additionally, the study found no significant differences between the deaf native signers (adults) and the deaf nonnative signers (adults) in their Auslan working memory capacity. The findings are discussed in relation to the participants' memory strategies and their early language experience. The findings present challenges for WM theories.

  2. The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning.

    PubMed

    Almeida, Diogo; Poeppel, David; Corina, David

The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied by a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data demonstrate that the perceptual tuning that underlies the discrimination of language and non-language information is not limited to spoken languages but extends to languages expressed in the visual modality.

  3. Examining the communication skills of a young cochlear implant pioneer.

    PubMed

    Connor, Carol McDonald

    2006-01-01

    The purpose of this longitudinal case study was to closely examine one deaf child's experience with a cochlear implant and his speech, language, and communication skills from kindergarten through middle and high school using both developmental and sociocultural frameworks. The target child was one of the first children to receive a cochlear implant in the United States in 1988, when he was 5 years of age. The developmental analysis revealed that prior to receiving a cochlear implant the child demonstrated profound delays in speech and language skill development. His speech and language skills grew slowly during the first 3-4 years following implantation, very rapidly from about 5 through 7 years postimplantation, then slowed to rates that were highly similar to same-age peers with normal hearing. The sociocultural analysis revealed that the child's communicative competence improved; that he used sign language but use of sign language decreased as his oral communication skills improved; that as his oral communication skills improved, the adults talked and directed the topic of conversation less frequently; and that topics became less concrete and more personal over time. The results of this study indicate that we may learn more about how to support children who use cochlear implants by examining what they are saying as well as how they are saying it.

  4. Standardizing Chinese Sign Language for Use in Post-Secondary Education

    ERIC Educational Resources Information Center

    Lin, Christina Mien-Chun; Gerner de Garcia, Barbara; Chen-Pichler, Deborah

    2009-01-01

    There are over 100 languages in China, including Chinese Sign Language. Given the large population and geographical dispersion of the country's deaf community, sign variation is to be expected. Language barriers due to lexical variation may exist for deaf college students in China, who often live outside their home regions. In presenting an…

  5. Equity in Education: Signed Language and the Courts

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2009-01-01

    This article examines several legal cases in Canada, the USA, and Australia involving signed language in education for Deaf students. In all three contexts, signed language rights for Deaf students have been viewed from within a disability legislation framework that either does not extend to recognizing language rights in education or that…

  6. Language Facility and Theory of Mind Development in Deaf Children.

    ERIC Educational Resources Information Center

    Jackson, A. Lyn

    2001-01-01

Deaf children with signing parents, nonnative signing deaf children, children from a hearing impaired unit, oral deaf children, and hearing controls were tested on Theory of Mind (ToM) tasks and a British Sign Language receptive language test. Language ability correlated positively and significantly with ToM ability. Age underpinned the…

  7. Arabic Sign Language: A Perspective

    ERIC Educational Resources Information Center

    Abdel-Fattah, M. A.

    2005-01-01

    Sign language in the Arab World has been recently recognized and documented. Many efforts have been made to establish the sign language used in individual countries, including Jordan, Egypt, Libya, and the Gulf States, by trying to standardize the language and spread it among members of the Deaf community and those concerned. Such efforts produced…

  8. Regional Sign Language Varieties in Contact: Investigating Patterns of Accommodation

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Evans, Bronwen G.; Cormier, Kearsy

    2016-01-01

    Short-term linguistic accommodation has been observed in a number of spoken language studies. The first of its kind in sign language research, this study aims to investigate the effects of regional varieties in contact and lexical accommodation in British Sign Language (BSL). Twenty-five participants were recruited from Belfast, Glasgow,…

  9. The sign language skills classroom observation: a process for describing sign language proficiency in classroom settings.

    PubMed

    Reeves, J B; Newell, W; Holcomb, B R; Stinson, M

    2000-10-01

    In collaboration with teachers and students at the National Technical Institute for the Deaf (NTID), the Sign Language Skills Classroom Observation (SLSCO) was designed to provide feedback to teachers on their sign language communication skills in the classroom. In the present article, the impetus and rationale for development of the SLSCO is discussed. Previous studies related to classroom signing and observation methodology are reviewed. The procedure for developing the SLSCO is then described. This procedure included (a) interviews with faculty and students at NTID, (b) identification of linguistic features of sign language important for conveying content to deaf students, (c) development of forms for recording observations of classroom signing, (d) analysis of use of the forms, (e) development of a protocol for conducting the SLSCO, and (f) piloting of the SLSCO in classrooms. The results of use of the SLSCO with NTID faculty during a trial year are summarized.

  10. A Kinect-Based Sign Language Hand Gesture Recognition System for Hearing- and Speech-Impaired: A Pilot Study of Pakistani Sign Language.

    PubMed

    Halim, Zahid; Abbas, Ghulam

    2015-01-01

Sign language provides hearing- and speech-impaired individuals with an interface to communicate with other members of society. Unfortunately, sign language is not understood by most people. For this, a gadget based on image processing and pattern recognition can provide a vital aid for detecting and translating sign language into a vocal language. This work presents a system for detecting and understanding sign language gestures using a custom-built software tool and later translating each gesture into a vocal language. To recognize a particular gesture, the system employs a Dynamic Time Warping (DTW) algorithm; an off-the-shelf software tool is employed for vocal language generation. Microsoft Kinect is the primary tool used to capture the video stream of a user. The proposed method is capable of successfully detecting gestures stored in the dictionary with an accuracy of 91%. The proposed system has the ability to define and add custom-made gestures. Based on an experiment in which 10 individuals with impairments used the system to communicate with 5 people with no disability, 87% agreed that the system was useful.
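For readers unfamiliar with the core algorithm, the sketch below shows how DTW-based template matching of the kind this abstract describes can work: each gesture is a time series of feature frames, and an unknown sequence is labeled by its nearest stored template. The 2-D frame features, templates, and labels here are invented for illustration; real Kinect skeletal input and the authors' actual pipeline are not reproduced.

```python
# Minimal dynamic time warping (DTW) sketch: classify a gesture (a
# sequence of feature frames) by its DTW distance to stored templates.
# Feature extraction from Kinect frames is omitted; frames are toy 2-D points.
import math

def dtw(a, b):
    """DTW distance between two sequences of coordinate frames."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])  # Euclidean frame distance
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def classify(sequence, templates):
    """templates: {label: frame sequence}; returns the nearest label."""
    return min(templates, key=lambda lbl: dtw(sequence, templates[lbl]))

templates = {
    "hello": [(0, 0), (1, 1), (2, 2)],
    "thanks": [(0, 2), (1, 1), (2, 0)],
}
label = classify([(0, 0), (0.9, 1.1), (2, 2), (2, 2)], templates)
print(label)  # → "hello"
```

Because DTW warps the time axis, the four-frame input still matches the three-frame "hello" template; this tolerance to differences in signing speed is what makes DTW attractive for gesture dictionaries.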

  11. Brain correlates of constituent structure in sign language comprehension.

    PubMed

    Moreno, Antonio; Limousin, Fanny; Dehaene, Stanislas; Pallier, Christophe

    2018-02-15

    During sentence processing, areas of the left superior temporal sulcus, inferior frontal gyrus and left basal ganglia exhibit a systematic increase in brain activity as a function of constituent size, suggesting their involvement in the computation of syntactic and semantic structures. Here, we asked whether these areas play a universal role in language and therefore contribute to the processing of non-spoken sign language. Congenitally deaf adults who acquired French sign language as a first language and written French as a second language were scanned while watching sequences of signs in which the size of syntactic constituents was manipulated. An effect of constituent size was found in the basal ganglia, including the head of the caudate and the putamen. A smaller effect was also detected in temporal and frontal regions previously shown to be sensitive to constituent size in written language in hearing French subjects (Pallier et al., 2011). When the deaf participants read sentences versus word lists, the same network of language areas was observed. While reading and sign language processing yielded identical effects of linguistic structure in the basal ganglia, the effect of structure was stronger in all cortical language areas for written language relative to sign language. Furthermore, cortical activity was partially modulated by age of acquisition and reading proficiency. Our results stress the important role of the basal ganglia, within the language network, in the representation of the constituent structure of language, regardless of the input modality. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Words in the bilingual brain: an fNIRS brain imaging investigation of lexical processing in sign-speech bimodal bilinguals

    PubMed Central

    Kovelman, Ioulia; Shalinsky, Mark H.; Berens, Melody S.; Petitto, Laura-Ann

    2014-01-01

    Early bilingual exposure, especially exposure to two languages in different modalities such as speech and sign, can profoundly affect an individual's language, culture, and cognition. Here we explore the hypothesis that bimodal dual language exposure can also affect the brain's organization for language. These changes occur across brain regions universally important for language and parietal regions especially critical for sign language (Newman et al., 2002). We investigated three groups of participants (N = 29) that completed a word repetition task in American Sign Language (ASL) during fNIRS brain imaging. Those groups were (1) hearing ASL-English bimodal bilinguals (n = 5), (2) deaf ASL signers (n = 7), and (3) English monolinguals naïve to sign language (n = 17). The key finding of the present study is that bimodal bilinguals showed reduced activation in left parietal regions relative to deaf ASL signers when asked to use only ASL. In contrast, this group of bimodal signers showed greater activation in left temporo-parietal regions relative to English monolinguals when asked to switch between their two languages (Kovelman et al., 2009). Converging evidence now suggest that bimodal bilingual experience changes the brain bases of language, including the left temporo-parietal regions known to be critical for sign language processing (Emmorey et al., 2007). The results provide insight into the resilience and constraints of neural plasticity for language and bilingualism. PMID:25191247

  13. Atypical Speech and Language Development: A Consensus Study on Clinical Signs in the Netherlands

    ERIC Educational Resources Information Center

    Visser-Bochane, Margot I.; Gerrits, Ellen; van der Schans, Cees P.; Reijneveld, Sijmen A.; Luinge, Margreet R.

    2017-01-01

    Background: Atypical speech and language development is one of the most common developmental difficulties in young children. However, which clinical signs characterize atypical speech-language development at what age is not clear. Aim: To achieve a national and valid consensus on clinical signs and red flags (i.e. most urgent clinical signs) for…

  14. Generation of Signs within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.; Figueroa, Daileen M.

    2017-01-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks,…

  15. Why Doesn't Everyone Here Speak Sign Language? Questions of Language Policy, Ideology and Economics

    ERIC Educational Resources Information Center

    Rayman, Jennifer

    2009-01-01

    This paper is a thought experiment exploring the possibility of establishing universal bilingualism in Sign Languages. Focusing in the first part on historical examples of inclusive signing societies such as Martha's Vineyard, the author suggests that it is not possible to create such naturally occurring practices of Sign Bilingualism in societies…

  16. New Perspectives on the History of American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily; Delaporte, Yves

    2011-01-01

    Examinations of the etymology of American Sign Language have typically involved superficial analyses of signs as they exist over a short period of time. While it is widely known that ASL is related to French Sign Language, there has yet to be a comprehensive study of this historic relationship between their lexicons. This article presents…

  17. Training social skills to severely mentally retarded multiply handicapped adolescents.

    PubMed

    Matson, J L; Manikam, R; Coe, D; Raymond, K; Taras, M; Long, N

    1988-01-01

Three severely mentally retarded, multiply handicapped adolescents were treated in a classroom setting for social skills deficits. Two of these children exhibited symptoms of autism, including periods of echolalia and fascination with tactile and visual stimulation. One of the pair was deaf. The third child was profoundly mentally retarded and had minimal expressive language skills. All had received sign language training to facilitate communication. Treatment focused on increasing the frequency of eye contact, in-seat behavior, and responses to verbal prompts, skills deemed necessary to facilitate the use of sign language communication and to increase social interaction. Baseline and treatment were evaluated in a multiple baseline, alternating treatment design across children. Baseline was taken on responses to 10 standard questions, asked by the teacher, based on verbal presentation and sign language. This same procedure was then continued during the initial treatment phase following training sessions. During training, the children received social reinforcement, performance feedback, and edible reinforcement, in the form of candy, for appropriate performance. Physical and verbal prompts as well as pictorial cues were employed to shape appropriate behavior. In the second treatment phase, training was implemented in the classroom in which baseline data had been collected. Improvement in target behaviors, via training sessions held four days a week, was noted. These data suggest that the use of a combination of visual stimuli and operant and social learning methods can remediate social skills deficits in children with multiple psychological and physical deficits. The implications of these findings for current and future research are discussed.

  18. Visual Word Recognition in Deaf Readers: Lexicality Is Modulated by Communication Mode

    PubMed Central

    Barca, Laura; Pezzulo, Giovanni; Castrataro, Marianna; Rinaldi, Pasquale; Caselli, Maria Cristina

    2013-01-01

    Evidence indicates that adequate phonological abilities are necessary to develop proficient reading skills and that later in life phonology also has a role in the covert visual word recognition of expert readers. Impairments of acoustic perception, such as deafness, can lead to atypical phonological representations of written words and letters, which in turn can affect reading proficiency. Here, we report an experiment in which young adults with different levels of acoustic perception (i.e., hearing and deaf individuals) and different modes of communication (i.e., hearing individuals using spoken language, deaf individuals with a preference for sign language, and deaf individuals using the oral modality with less or no competence in sign language) performed a visual lexical decision task, which consisted of categorizing real words and consonant strings. The lexicality effect was restricted to deaf signers who responded faster to real words than consonant strings, showing over-reliance on whole word lexical processing of stimuli. No effect of stimulus type was found in deaf individuals using the oral modality or in hearing individuals. Thus, mode of communication modulates the lexicality effect. This suggests that learning a sign language during development shapes visuo-motor representations of words, which are tuned to the actions used to express them (phono-articulatory movements vs. hand movements) and to associated perceptions. As these visuo-motor representations are elicited during on-line linguistic processing and can overlap with the perceptual-motor processes required to execute the task, they can potentially produce interference or facilitation effects. PMID:23554976

  20. The impact of input quality on early sign development in native and non-native language learners.

    PubMed

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-05-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the impact of quality of input on early sign acquisition. The current study explores the outcomes of differential input in two groups of children aged two to five years: deaf children of hearing parents (DCHP) and deaf children of deaf parents (DCDP). Analysis of child sign language revealed DCDP had a more developed vocabulary and more phonological handshape types compared with DCHP. In naturalistic conversations deaf parents used more sign tokens and more phonological types than hearing parents. Results are discussed in terms of the effects of early input on subsequent language abilities.

  1. American Sign Language

    MedlinePlus

    ... Langue des Signes Française). Today’s ASL includes some elements of LSF plus the original local sign languages, which over the years ... evolves. It can also be used to model the essential elements and organization of natural language. Another NIDCD-funded research team is ...

  2. Non parametric, self organizing, scalable modeling of spatiotemporal inputs: the sign language paradigm.

    PubMed

    Caridakis, G; Karpouzis, K; Drosopoulos, A; Kollias, S

    2012-12-01

    Modeling and recognizing spatiotemporal, as opposed to static input, is a challenging task since it incorporates input dynamics as part of the problem. The vast majority of existing methods tackle the problem as an extension of the static counterpart, using dynamics, such as input derivatives, at feature level and adopting artificial intelligence and machine learning techniques originally designed for solving problems that do not specifically address the temporal aspect. The proposed approach deals with temporal and spatial aspects of the spatiotemporal domain in a discriminative as well as coupling manner. Self Organizing Maps (SOM) model the spatial aspect of the problem and Markov models its temporal counterpart. Incorporation of adjacency, both in training and classification, enhances the overall architecture with robustness and adaptability. The proposed scheme is validated both theoretically, through an error propagation study, and experimentally, on the recognition of individual signs, performed by different, native Greek Sign Language users. Results illustrate the architecture's superiority when compared to Hidden Markov Model techniques and variations both in terms of classification performance and computational cost. Copyright © 2012 Elsevier Ltd. All rights reserved.
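The coupling the abstract describes, SOM for the spatial aspect and Markov models for the temporal one, can be illustrated with a toy sketch. Everything below is invented for illustration (synthetic 2-D trajectories standing in for sign features, a 4x4 map, Laplace-smoothed transition counts); it is not the authors' architecture, only the general SOM-plus-Markov idea.

```python
import math
import random

random.seed(0)

GRID = 4  # 4x4 Self-Organizing Map

def bmu(nodes, x, y):
    # index of the best-matching unit for point (x, y)
    return min(range(len(nodes)),
               key=lambda k: (nodes[k][0] - x) ** 2 + (nodes[k][1] - y) ** 2)

def train_som(samples, epochs=30):
    # SOM quantizes the *spatial* aspect of the input
    nodes = [[random.random(), random.random()] for _ in range(GRID * GRID)]
    for e in range(epochs):
        lr = 0.5 * (1 - e / epochs)  # decaying learning rate
        for x, y in samples:
            b = bmu(nodes, x, y)
            bi, bj = divmod(b, GRID)
            for k in range(len(nodes)):
                ki, kj = divmod(k, GRID)
                if abs(ki - bi) + abs(kj - bj) <= 1:  # BMU + grid neighbours
                    nodes[k][0] += lr * (x - nodes[k][0])
                    nodes[k][1] += lr * (y - nodes[k][1])
    return nodes

def markov_counts(node_seqs):
    # per-class Markov chain over BMU indices models the *temporal* aspect
    n = GRID * GRID
    t = [[1.0] * n for _ in range(n)]  # Laplace smoothing
    for seq in node_seqs:
        for a, b in zip(seq, seq[1:]):
            t[a][b] += 1
    for row in t:
        s = sum(row)
        for j in range(n):
            row[j] /= s
    return t

def log_likelihood(trans, seq):
    return sum(math.log(trans[a][b]) for a, b in zip(seq, seq[1:]))

# Synthetic "signs": class A moves left-to-right, class B top-to-bottom.
def traj_a():
    return [(i / 9 + random.gauss(0, .02), .5 + random.gauss(0, .02)) for i in range(10)]
def traj_b():
    return [(.5 + random.gauss(0, .02), i / 9 + random.gauss(0, .02)) for i in range(10)]

train_a, train_b = [traj_a() for _ in range(20)], [traj_b() for _ in range(20)]
som = train_som([p for t in train_a + train_b for p in t])
to_seq = lambda traj: [bmu(som, x, y) for x, y in traj]
model_a = markov_counts([to_seq(t) for t in train_a])
model_b = markov_counts([to_seq(t) for t in train_b])

test = to_seq(traj_a())
pred = 'A' if log_likelihood(model_a, test) > log_likelihood(model_b, test) else 'B'
print(pred)
```

The two-stage design mirrors the paper's discriminative treatment of space and time: the SOM never sees ordering, and the Markov chains never see coordinates, only quantized node sequences.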

  3. Lexical prediction via forward models: N400 evidence from German Sign Language.

    PubMed

    Hosemann, Jana; Herrmann, Annika; Steinbach, Markus; Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-09-01

    Models of language processing in the human brain often emphasize the prediction of upcoming input-for example in order to explain the rapidity of language understanding. However, the precise mechanisms of prediction are still poorly understood. Forward models, which draw upon the language production system to set up expectations during comprehension, provide a promising approach in this regard. Here, we present an event-related potential (ERP) study on German Sign Language (DGS) which tested the hypotheses of a forward model perspective on prediction. Sign languages involve relatively long transition phases between one sign and the next, which should be anticipated as part of a forward model-based prediction even though they are semantically empty. Native speakers of DGS watched videos of naturally signed DGS sentences which either ended with an expected or a (semantically) unexpected sign. Unexpected signs engendered a biphasic N400-late positivity pattern. Crucially, N400 onset preceded critical sign onset and was thus clearly elicited by properties of the transition phase. The comprehension system thereby clearly anticipated modality-specific information about the realization of the predicted semantic item. These results provide strong converging support for the application of forward models in language comprehension. © 2013 Elsevier Ltd. All rights reserved.

  4. Can Experience with Co-Speech Gesture Influence the Prosody of a Sign Language? Sign Language Prosodic Cues in Bimodal Bilinguals

    ERIC Educational Resources Information Center

    Brentari, Diane; Nadolske, Marie A.; Wolford, George

    2012-01-01

    In this paper the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience…

  5. Linguistic Policies, Linguistic Planning, and Brazilian Sign Language in Brazil

    ERIC Educational Resources Information Center

    de Quadros, Ronice Muller

    2012-01-01

    This article explains the consolidation of Brazilian Sign Language in Brazil through a linguistic plan that arose from the Brazilian Sign Language Federal Law 10.436 of April 2002 and the subsequent Federal Decree 5695 of December 2005. Two concrete facts that emerged from this existing language plan are discussed: the implementation of bilingual…

  6. Pointing and Reference in Sign Language and Spoken Language: Anchoring vs. Identifying

    ERIC Educational Resources Information Center

    Barberà, Gemma; Zwets, Martine

    2013-01-01

    In both signed and spoken languages, pointing serves to direct an addressee's attention to a particular entity. This entity may be either present or absent in the physical context of the conversation. In this article we focus on pointing directed to nonspeaker/nonaddressee referents in Sign Language of the Netherlands (Nederlandse Gebarentaal,…

  7. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language

    PubMed Central

    Ferjan Ramirez, Naja; Leonard, Matthew K.; Davenport, Tristan S.; Torres, Christina; Halgren, Eric; Mayberry, Rachel I.

    2016-01-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772–2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. PMID:25410427

  8. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed Central

    Hofer, Theresia

    2017-01-01

    This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork, in 2007 and 2014, with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model to assess signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between “severely” and “definitely” endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed and certain additions, not considered within the existing assessment model, suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use “pure Tibetan language”. PMID:29033477

  9. Indonesian Sign Language Number Recognition using SIFT Algorithm

    NASA Astrophysics Data System (ADS)

    Mahfudi, Isa; Sarosa, Moechammad; Andrie Asmara, Rosa; Azrino Gustalika, M.

    2018-04-01

    Indonesian sign language (ISL) is generally used by deaf individuals for everyday communication. They use sign language as their primary language, which consists of two types of action: signs and finger spelling. However, not all people understand their sign language, which makes it difficult for them to communicate with hearing people and contributes to their feeling isolated from social life. A solution is needed that can help them interact with hearing people. Much research offers a variety of image-processing methods for solving the sign language recognition problem. The SIFT (Scale Invariant Feature Transform) algorithm is one method that can be used to identify an object; SIFT is claimed to be very resistant to scaling, rotation, illumination changes, and noise. Using the SIFT algorithm for Indonesian sign language number recognition achieved a recognition rate of 82% on a dataset of 100 sample images, consisting of 50 samples for training and 50 sample images for testing. Changing the threshold value affects the recognition result; the best threshold value is 0.45, with a recognition rate of 94%.
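A central knob in SIFT-based recognition of this kind is the matching threshold the abstract tunes. The sketch below illustrates one common interpretation, Lowe's ratio test, on synthetic 8-dimensional descriptors (real SIFT descriptors are 128-dimensional and would come from a library such as OpenCV); the descriptors, names, and the 0.45 value carried over from the abstract are illustrative assumptions, not the paper's code.

```python
import math

def dist(a, b):
    # Euclidean distance between two descriptor vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_match(query, train, threshold=0.45):
    """Index of the best match in `train`, or None if the ratio test fails:
    accept only if the nearest descriptor is clearly closer than the
    second-nearest (best/second < threshold)."""
    order = sorted(range(len(train)), key=lambda i: dist(query, train[i]))
    best, second = dist(query, train[order[0]]), dist(query, train[order[1]])
    if second > 0 and best / second < threshold:
        return order[0]
    return None

# One distinctive descriptor plus near-duplicate "clutter" descriptors.
train = [[1.0] * 8, [5.0] * 8, [5.0] * 8, [9.0] * 8]
good_query = [1.02] * 8      # close to train[0], far from everything else
ambiguous = [5.1] * 8        # equally close to two training descriptors

print(ratio_match(good_query, train))   # 0 (passes the ratio test)
print(ratio_match(ambiguous, train))    # None (fails: best/second is ~1)
```

Raising the threshold admits more, weaker matches; lowering it keeps only distinctive ones, which is why sweeping it changes the recognition rate as the abstract reports.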

  10. Lexical access in sign language: a computational model

    PubMed Central

    Caselli, Naomi K.; Cohen-Goldberg, Ariel M.

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition. PMID:24860539
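The spreading-activation idea at the heart of such models can be sketched in a few lines. The lexicon, sub-lexical features, and parameters below are all invented for illustration; this is a generic interactive-activation toy, not the Chen and Mirman (2012) model or the authors' elaboration of it.

```python
# Lexical nodes connect to sub-lexical handshape/location nodes; activation
# flows along those links over discrete time steps. SIGN_B shares a handshape
# neighbour relation with SIGN_A, SIGN_C shares a location neighbour relation.
LEXICON = {
    "SIGN_A": {"handshape:flat", "location:chin"},
    "SIGN_B": {"handshape:flat", "location:chest"},
    "SIGN_C": {"handshape:fist", "location:chin"},
}

def spread(input_features, steps=5, rate=0.2, decay=0.1):
    units = {u for feats in LEXICON.values() for u in feats}
    act = {n: 0.0 for n in list(LEXICON) + sorted(units)}
    for _ in range(steps):
        new = {n: a * (1 - decay) for n, a in act.items()}  # passive decay
        for u in input_features:                # external input clamps features
            new[u] += 1.0
        for word, feats in LEXICON.items():     # bidirectional word<->feature flow
            for u in feats:
                new[word] += rate * act[u]
                new[u] += rate * act[word]
        act = new
    return act

act = spread({"handshape:flat", "location:chin"})  # the features of SIGN_A
ranked = sorted(LEXICON, key=act.get, reverse=True)
print(ranked[0])  # SIGN_A receives support from both of its features
```

Even this toy shows the mechanism behind density effects: SIGN_B and SIGN_C each receive partial activation through a shared feature, so neighbourhood structure shapes how strongly competitors are co-activated.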

  11. Using Noldus Observer XT for research on deaf signers learning to read: an innovative methodology.

    PubMed

    Ducharme, Daphne A; Arcand, Isabelle

    2009-08-01

    Despite years of research on the reading problems of deaf students, we still do not know how deaf signers who read well actually crack the code of print. How connections are made between sign language and written language is still an open question. In this article, we show how the Noldus Observer XT software can be used to conduct an in-depth analysis of the online behavior of deaf readers. First, we examine factors that may have an impact on reading behavior. Then, we describe how we videotaped teachers with their deaf student signers of langue des signes québécoise during a reading task, how we conducted a recall activity to better understand the students' reading behavior, and how we used this innovative software to analyze the taped footage. Finally, we discuss the contribution this type of research can have on the future reading behavior of deaf students.

  12. How to improve communication with deaf children in the dental clinic.

    PubMed

    Alsmark, Silvia San Bernardino; García, Joaquín; Martínez, María Rosa Mourelle; López, Nuria Esther Gallardo

    2007-12-01

    It may be difficult for hearing-impaired people to communicate with people who hear. In the health care area, there is often little awareness of the communication barriers faced by the deaf and, in dentistry, the attitude adopted towards the deaf is not always correct. A review is given of the basic rules and advice given for communicating with the hearing-impaired. The latter are classified in three groups - lip-readers, sign language users and those with hearing aids. The advice given varies for the different groups although the different methods of communication are often combined (e.g. sign language plus lip-reading, hearing-aids plus lip-reading). Treatment of hearing-impaired children in the dental clinic must be personalised. Each child is different, depending on the education received, the communication skills possessed, family factors (degree of parental protection, etc.), the existence of associated problems (learning difficulties), degree of loss of hearing, age, etc.

  13. Sign Language Translator Application Using OpenCV

    NASA Astrophysics Data System (ADS)

    Triyono, L.; Pratisto, E. H.; Bawono, S. A. T.; Purnomo, F. A.; Yudhanto, Y.; Raharjo, B.

    2018-03-01

    This research focuses on the development of an Android-based sign language translator application using OpenCV; the application segments the hand based on color differences. The authors also use support-vector machine learning to predict the label. Results of the research showed that a fingertip-coordinate search method can be used to recognize gestures made with an open hand, while gestures made with a clenched hand are recognized using Hu moment values. The fingertip method is more resilient in gesture recognition, with a higher success rate of 95% at distance variations of 35 cm and 55 cm, light intensities of approximately 90 lux and 100 lux, and a plain light-green background, compared with a success rate of 40% for the Hu moments method under the same parameters. Against an outdoor background the application still could not be used, with only 6 recognitions succeeding and the rest failing.
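Hu moments, the shape descriptor the abstract uses for clenched-hand gestures, can be computed from scratch to show why they suit this task. The sketch below derives the first Hu invariant (eta20 + eta02) for a tiny binary grid and demonstrates its translation invariance; a real system would call OpenCV's moment functions on a segmented hand mask, and the blobs here are invented for illustration.

```python
def raw_moment(img, p, q):
    # M_pq = sum over pixels of x^p * y^q * value
    return sum(x**p * y**q * v
               for y, row in enumerate(img) for x, v in enumerate(row))

def hu1(img):
    """First Hu invariant of a binary/grayscale grid: eta20 + eta02,
    where eta_pq = mu_pq / m00**(1 + (p+q)/2)."""
    m00 = raw_moment(img, 0, 0)
    cx, cy = raw_moment(img, 1, 0) / m00, raw_moment(img, 0, 1) / m00
    mu20 = sum((x - cx)**2 * v
               for y, row in enumerate(img) for x, v in enumerate(row))
    mu02 = sum((y - cy)**2 * v
               for y, row in enumerate(img) for x, v in enumerate(row))
    return (mu20 + mu02) / m00**2   # for p+q = 2, eta = mu / m00^2

# An L-shaped blob, and the same blob shifted one pixel right and down.
blob = [[1, 0, 0],
        [1, 0, 0],
        [1, 1, 0]]
shifted = [[0, 0, 0, 0],
           [0, 1, 0, 0],
           [0, 1, 0, 0],
           [0, 1, 1, 0]]

print(abs(hu1(blob) - hu1(shifted)) < 1e-12)  # True: translation-invariant
```

Because the central moments are taken about the centroid and normalized by mass, the descriptor ignores where the hand sits in the frame, but, as the abstract's outdoor results suggest, it inherits any errors in the color-based segmentation that produces the mask.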

  14. Explicitly Teaching English through the Air to Students Who Are Deaf or Hard of Hearing

    ERIC Educational Resources Information Center

    Bennett, Jessica G.; Gardner, Ralph, III; Leighner, Ross; Clancy, Shannon; Garner, Joshua

    2014-01-01

    The Effects of the Language for Learning curriculum (Engelmann & Osborne, 1999) on through-the-air (i.e., signed and/or spoken) English skills for students who are deaf or hard of hearing (DHH) were examined by means of a single-subject, concurrent-multiple-probes-across-participants design. Four 11-year-old participants varied in auditory…

  15. A Curriculum for ASL: Empowering Students by Giving Them Ownership of Their Learning

    ERIC Educational Resources Information Center

    Herzig, Melissa P.

    2017-01-01

    In response to the need for deaf and hard of hearing students to facilitate literacy in American Sign Language (ASL) and to put as much focus on developing students' ASL skills as they usually do on developing their English skills, Melissa Herzig has created a curriculum entitled "Creating the Narrative Stories: The Development of the…

  16. Intertextuality and Sense Production in the Learning of Algebraic Methods

    ERIC Educational Resources Information Center

    Rojano, Teresa; Filloy, Eugenio; Puig, Luis

    2014-01-01

    In studies carried out in the 1980s the algebraic symbols and expressions are revealed through prealgebraic readers as non-independent texts, as texts that relate to other texts that in some cases belong to the reader's native language or to the arithmetic sign system. Such outcomes suggest that the act of reading algebraic texts submerges…

  17. Independent transmission of sign language interpreter in DVB: assessment of image compression

    NASA Astrophysics Data System (ADS)

    Zatloukal, Petr; Bernas, Martin; Dvořák, Lukáš

    2015-02-01

    Sign language on television provides deaf viewers with information that they cannot get from the audio content. If the sign language interpreter is transmitted over an independent data stream, the aim is to ensure sufficient intelligibility and subjective image quality of the interpreter at a minimum bit rate. This work deals with ROI-based video compression of a Czech sign language interpreter, implemented in the x264 open-source library. The results of this approach are verified in subjective tests with deaf participants, which examine the intelligibility of sign language expressions containing minimal pairs at different levels of compression and various resolutions of the image with the interpreter, and evaluate the subjective quality of the final image for a good viewing experience.
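ROI-based compression of the kind described is commonly realized by lowering the quantizer (QP) inside the region of interest and raising it elsewhere, which is the general mechanism encoders with per-macroblock adaptive quantization expose. The sketch below builds such a per-macroblock QP-offset map for a frame with the interpreter in a known rectangle; the frame size, ROI, and offset values are illustrative assumptions, not the authors' implementation.

```python
MB = 16  # macroblock size in pixels

def qp_offset_map(width, height, roi, inside=-6, outside=4):
    """roi = (x0, y0, x1, y1) in pixels; returns rows of per-macroblock QP
    offsets: negative inside the ROI (more bits, better quality for the
    interpreter), positive outside (fewer bits for the background)."""
    x0, y0, x1, y1 = roi
    cols = (width + MB - 1) // MB
    rows = (height + MB - 1) // MB
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            mx0, my0 = c * MB, r * MB          # macroblock pixel bounds
            mx1, my1 = mx0 + MB, my0 + MB
            overlaps = mx0 < x1 and mx1 > x0 and my0 < y1 and my1 > y0
            row.append(inside if overlaps else outside)
        grid.append(row)
    return grid

# 128x96 frame, interpreter occupying the lower-right quadrant
grid = qp_offset_map(128, 96, roi=(64, 48, 128, 96))
print(grid[0][0], grid[5][7])  # background macroblock: 4, ROI macroblock: -6
```

Shifting quality toward the interpreter this way keeps the total bit rate roughly constant while concentrating fidelity where intelligibility is decided, which is exactly the trade-off the subjective tests probe.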

  18. Interaction of cerebral hemispheres and artistic thinking

    NASA Astrophysics Data System (ADS)

    Nikolaenko, Nikolay N.

    1998-07-01

    Study of drawings by patients with local lesions of the right or left hemisphere allows us to understand how artistic thinking is supported by brain structures. The role of the right hemisphere is significant at the early stage of the creative process. The right hemisphere is a generator of nonverbal visuo-spatial thinking: it operates with blurred nonverbal images and arranges them in a visual space. With the help of iconic signs, the right hemisphere reflects the world and creates perceptual visual standards, which are stored in long-term right-hemisphere memory. The image, which appears in the 'inner' space, must then be transferred into a principally different language, i.e., a left-hemispheric sign language. This language operates with a number of discrete units, logical succession, and learned grammar rules. This process can be explained by activation (information) transfer from the right hemisphere to the left. Thus, the natural and spontaneous creative process, completed by a conscious effort, can be understood as an activation impulse transferred from the right hemisphere to the left and back.

  19. Bilingual Cancer Genetic Education Modules for the Deaf Community: Development and Evaluation of the Online Video Material.

    PubMed

    Boudreault, Patrick; Wolfson, Alicia; Berman, Barbara; Venne, Vickie L; Sinsheimer, Janet S; Palmer, Christina

    2018-04-01

    Health information about inherited forms of cancer and the role of family history in cancer risk for the American Sign Language (ASL) Deaf community, a linguistic and cultural community, needs improvement. Cancer genetic education materials available in English print format are not accessible for many sign language users because English is not their native or primary language. Per Centers for Disease Control and Prevention recommendations, the level of literacy for printed health education materials should not be higher than 6th grade level (~ 11 to 12 years old), and even with this recommendation, printed materials are still not accessible to sign language users or other nonnative English speakers. Genetic counseling is becoming an integral part of healthcare, but often ASL users are not considered when health education materials are developed. As a result, there are few genetic counseling materials available in ASL. Online tools such as video and closed captioning offer opportunities for educators and genetic counselors to provide digital access to genetic information in ASL to the Deaf community. The Deaf Genetics Project team used a bilingual approach to develop a 37-min interactive Cancer Genetics Education Module (CGEM) video in ASL with closed captions and quizzes, and demonstrated that this approach resulted in greater cancer genetic knowledge and increased intentions to obtain counseling or testing, compared to standard English text information (Palmer et al., Disability and Health Journal, 10(1):23-32, 2017). Though visually enhanced educational materials have been developed for sign language users with a multimodal/multilingual approach, little is known about design features that can accommodate a diverse audience of sign language users so the material is engaging to a wide audience.
The main objectives of this paper are to describe the development of the CGEM and to determine if viewer demographic characteristics are associated with two measurable aspects of CGEM viewing behavior: (1) length of time spent viewing and (2) number of pause, play, and seek events. These objectives are important to address, especially for Deaf individuals because the amount of simultaneous content (video, print) requires cross-modal cognitive processing of visual and textual materials. The use of technology and presentational strategies is needed that enhance and not interfere with health learning in this population.

  20. Deficits in Narrative Abilities in Child British Sign Language Users with Specific Language Impairment

    ERIC Educational Resources Information Center

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal…

  1. Meeting the Needs of Signers in the Field of Speech and Language Pathology: Some Considerations for Action

    ERIC Educational Resources Information Center

    Cripps, Jody H.; Cooper, Sheryl B.; Supalla, Samuel J.; Evitts, Paul M.

    2016-01-01

    Deaf individuals who use American Sign Language (ASL) are rarely the focus of professionals in speech-language pathology. Although society is widely thought of in terms of those who speak, this norm is not all-inclusive. Many signing individuals exhibit disorders in signed language and need treatment much like their speaking peers. Although there…

  2. Introduction: Sign Language, Sustainable Development, and Equal Opportunities

    ERIC Educational Resources Information Center

    De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Introduction: Sign Language, Sustainable Development, and Equal Opportunities" (De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck & P. V. Paul (Eds.) 2016). The idea of exploring various…

  3. Sign Language Echolalia in Deaf Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shield, Aaron; Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose: We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method: Seventeen…

  4. Development of E-learning prototype for MUET assessment

    NASA Astrophysics Data System (ADS)

    Mit Anak Mawan, Amylia; Mohamed, Rozlini; Othman, Muhaini; Yusof, Munirah Mohd

    2017-08-01

    This paper discusses the development of an e-learning prototype for MUET assessment in the Fakulti Sains Komputer dan Teknologi Maklumat (FSKTM), Universiti Tun Hussein Onn Malaysia (UTHM), namely the MUET Online System. The system serves as a learning centre for studying for the MUET examination and follows the MUET syllabus; it is used to help students prepare before sitting the examination. Before gaining access to the system, students need to sign up and pay a fee to be enrolled in a virtual MUET class. The class is guided by English language lecturers from the Faculty of Science, Technology and Human Development (FSTPI), UTHM, acting as teachers. The system provides learning modules, a quiz section, and a test section. At the end of a learning session, students’ performance is assessed through quizzes and tests that measure their level of understanding; the teacher evaluates the students’ marks and provides advice. The MUET Online System should therefore improve students’ knowledge of the English language and help them obtain the best possible MUET result by providing guided references and practice.

  5. Deficits in narrative abilities in child British Sign Language users with specific language impairment.

    PubMed

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal intelligence. Children were asked to generate a narrative based on events in a language free video. Narratives were analysed for global structure, information content and local level grammatical devices, especially verb morphology. The language-impaired group produced shorter, less structured and grammatically simpler narratives than controls, with verb morphology particularly impaired. Despite major differences in how sign and spoken languages are articulated, narrative is shown to be a reliable marker of language impairment across the modality boundaries. © 2014 Royal College of Speech and Language Therapists.

  6. Sign Lowering and Phonetic Reduction in American Sign Language.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2010-04-01

    This study examines sign lowering as a form of phonetic reduction in American Sign Language. Phonetic reduction occurs in the course of normal language production, when instead of producing a carefully articulated form of a word, the language user produces a less clearly articulated form. When signs are produced in context by native signers, they often differ from the citation forms of signs. In some cases, phonetic reduction is manifested as a sign being produced at a lower location than in the citation form. Sign lowering has been documented previously, but this is the first study to examine it in phonetic detail. The data presented here are tokens of the sign WONDER, as produced by six native signers, in two phonetic contexts and at three signing rates, which were captured by optoelectronic motion capture. The results indicate that sign lowering occurred for all signers, according to the factors we manipulated. Sign production was affected by several phonetic factors that also influence speech production, namely, production rate, phonetic context, and position within an utterance. In addition, we have discovered interesting variations in sign production, which could underlie distinctions in signing style, analogous to accent or voice quality in speech.

  7. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    PubMed Central

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, non-arbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than non-iconic signs (controlling for strength of iconicity, semantic relatedness, familiarity, and imageability). Twenty deaf signers made lexical decisions to the second item of a prime-target pair. Iconic target signs were preceded by prime signs that were a) iconic and semantically related, b) non-iconic and semantically related, or c) semantically unrelated. In addition, a set of non-iconic target signs was preceded by semantically unrelated primes. Significant facilitation was observed for target signs when preceded by semantically related primes. However, iconicity did not increase the priming effect (e.g., the target sign PIANO was primed equally by the iconic sign GUITAR and the non-iconic sign MUSIC). In addition, iconic signs were not recognized faster or more accurately than non-iconic signs. These results confirm the existence of semantic priming for sign language and suggest that iconicity does not play a robust role in on-line lexical processing. PMID:20919784

  8. How Do Typically Developing Deaf Children and Deaf Children with Autism Spectrum Disorder Use the Face When Comprehending Emotional Facial Expressions in British Sign Language?

    ERIC Educational Resources Information Center

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-01-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their…

  9. How Grammar Can Cope with Limited Short-Term Memory: Simultaneity and Seriality in Sign Languages

    ERIC Educational Resources Information Center

    Geraci, Carlo; Gozzi, Marta; Papagno, Costanza; Cecchetto, Carlo

    2008-01-01

    It is known that in American Sign Language (ASL) span is shorter than in English, but this discrepancy has never been systematically investigated using other pairs of signed and spoken languages. This finding is at odds with results showing that short-term memory (STM) for signs has an internal organization similar to STM for words. Moreover, some…

  10. Wavelets for sign language translation

    NASA Astrophysics Data System (ADS)

    Wilson, Beth J.; Anspach, Gretel

    1993-10-01

    Wavelet techniques are applied to help extract the relevant parameters of sign language from video images of a person communicating in American Sign Language or Signed English. The compression and edge detection features of two-dimensional wavelet analysis are exploited to enhance the algorithms under development to classify the hand motion, hand location with respect to the body, and handshape. These three parameters have different processing requirements and complexity issues. The results are described for applying various quadrature mirror filter designs to a filterbank implementation of the desired wavelet transform. The overall project is to develop a system that will translate sign language to English to facilitate communication between deaf and hearing people.
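The two properties the abstract exploits, compression and edge detection, both fall out of a single wavelet decomposition. Below is a one-level 2-D Haar transform from scratch (the simplest wavelet, used here as an illustration rather than the project's actual quadrature mirror filter designs): the LL band is a coarse average useful for compression, while the LH/HL detail bands respond to horizontal/vertical edges such as hand contours.

```python
def haar2d(img):
    """One-level 2-D Haar transform of an even-sized grayscale grid.
    Returns (LL, LH, HL, HH) sub-bands, each half-size."""
    h, w = len(img), len(img[0])
    LL, LH, HL, HH = ([[0.0] * (w // 2) for _ in range(h // 2)] for _ in range(4))
    for r in range(0, h, 2):
        for c in range(0, w, 2):
            a, b = img[r][c], img[r][c + 1]
            d, e = img[r + 1][c], img[r + 1][c + 1]
            LL[r // 2][c // 2] = (a + b + d + e) / 4   # average: compression
            LH[r // 2][c // 2] = (a + b - d - e) / 4   # horizontal edges
            HL[r // 2][c // 2] = (a - b + d - e) / 4   # vertical edges
            HH[r // 2][c // 2] = (a - b - d + e) / 4   # diagonal detail
    return LL, LH, HL, HH

# A vertical edge: first column dark, the rest bright.
img = [[0, 255, 255, 255] for _ in range(4)]
LL, LH, HL, HH = haar2d(img)
print(LL[0])  # [127.5, 255.0] -- quarter-size approximation of the image
print(HL[0])  # [-127.5, 0.0]  -- strong response exactly at the vertical edge
```

Keeping only LL (and repeating the transform on it) yields the multiresolution compression side; thresholding the detail bands yields the edge maps that feed classifiers for hand motion, location, and handshape.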

  11. New graduates’ perceptions of preparedness to provide speech-language therapy services in general and dysphagia services in particular

    PubMed Central

    Booth, Alannah; Choto, Fadziso; Gotlieb, Jessica; Robertson, Rebecca; Morris, Gabriella; Stockley, Nicola; Mauff, Katya

    2015-01-01

    Background Upon graduation, newly qualified speech-language therapists are expected to provide services independently. This study describes new graduates’ perceptions of their preparedness to provide services across the scope of the profession and explores associations between perceptions of dysphagia theory and clinical learning curricula with preparedness for adult and paediatric dysphagia service delivery. Methods New graduates of six South African universities were recruited to participate in a survey by completing an electronic questionnaire exploring their perceptions of the dysphagia curricula and their preparedness to practise across the scope of the profession of speech-language therapy. Results Eighty graduates participated in the study yielding a response rate of 63.49%. Participants perceived themselves to be well prepared in some areas (e.g. child language: 100%; articulation and phonology: 97.26%), but less prepared in other areas (e.g. adult dysphagia: 50.70%; paediatric dysarthria: 46.58%; paediatric dysphagia: 38.36%) and most unprepared to provide services requiring sign language (23.61%) and African languages (20.55%). There was a significant relationship between perceptions of adequate theory and clinical learning opportunities with assessment and management of dysphagia and perceptions of preparedness to provide dysphagia services. Conclusion There is a need for review of existing curricula and consideration of developing a standard speech-language therapy curriculum across universities, particularly in service provision to a multilingual population, and in both the theory and clinical learning of the assessment and management of adult and paediatric dysphagia, to better equip graduates for practice. PMID:26304217

  12. An fMRI Study of Perception and Action in Deaf Signers

    PubMed Central

    Okada, Kayoko; Rogalsky, Corianne; O’Grady, Lucinda; Hanaumi, Leila; Bellugi, Ursula; Corina, David; Hickok, Gregory

    2016-01-01

    Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca’s area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects as well as observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution although with overlap in Broca’s area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action related signs differentially involved the motor system compared to object related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca’s area during ASL observation is not causally related to sign language understanding. PMID:26796716

  13. An fMRI study of perception and action in deaf signers.

    PubMed

    Okada, Kayoko; Rogalsky, Corianne; O'Grady, Lucinda; Hanaumi, Leila; Bellugi, Ursula; Corina, David; Hickok, Gregory

    2016-02-01

    Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca's area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects as well as observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution although with overlap in Broca's area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action related signs differentially involved the motor system compared to object related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca's area during ASL observation is not causally related to sign language understanding. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language.

    PubMed

    Ferjan Ramirez, Naja; Leonard, Matthew K; Davenport, Tristan S; Torres, Christina; Halgren, Eric; Mayberry, Rachel I

    2016-03-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772-2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Teachers' perceptions of promoting sign language phonological awareness in an ASL/English bilingual program.

    PubMed

    Crume, Peter K

    2013-10-01

    The National Reading Panel emphasizes that spoken language phonological awareness (PA) developed at home and school can lead to improvements in reading performance in young children. However, research indicates that many deaf children are good readers even though they have limited spoken language PA. Is it possible that some deaf students benefit from teachers who promote sign language PA instead? The purpose of this qualitative study is to examine teachers' beliefs and instructional practices related to sign language PA. A thematic analysis is conducted on 10 participant interviews at an ASL/English bilingual school for the deaf to understand their views and instructional practices. The findings reveal that the participants had strong beliefs in developing students' structural knowledge of signs and used a variety of instructional strategies to build students' knowledge of sign structures in order to promote their language and literacy skills.

  16. Language choice in bimodal bilingual development.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice M; Chen Pichler, Deborah; Fieldsteel, Zoe

    2014-01-01

    Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending (expressions in both speech and sign simultaneously), an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language.

  17. Language choice in bimodal bilingual development

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice M.; Chen Pichler, Deborah; Fieldsteel, Zoe

    2014-01-01

    Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending—expressions in both speech and sign simultaneously—an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language. 
PMID:25368591

  18. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents

    PubMed Central

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf (n = 25) and hearing (n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. 
Both native and non-native signers looked mainly at the face when perceiving sign language, although non-native signers looked significantly more at the body than native signers. This distribution of gaze fixations suggested that deaf individuals – particularly native signers – mainly perceived signs through peripheral vision. PMID:28680416
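    The dwell-time analysis described above reduces to aggregating fixation durations by area of interest (AOI) and normalizing. A minimal sketch, assuming fixation records of the form (aoi, duration_ms); the AOI names are hypothetical, not the labels used in the study:

```python
def dwell_proportions(fixations):
    """Proportion of total dwell time per area of interest (AOI),
    computed from (aoi, duration_ms) fixation records."""
    totals = {}
    for aoi, duration in fixations:
        totals[aoi] = totals.get(aoi, 0) + duration
    grand_total = sum(totals.values())
    return {aoi: t / grand_total for aoi, t in totals.items()}
```

    Comparing these proportions between groups (e.g., mouth vs. upper face, face vs. body) is what supports conclusions such as deaf participants' preference for the mouth area.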

  19. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents.

    PubMed

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf ( n = 25) and hearing ( n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. 
Both native and non-native signers looked mainly at the face when perceiving sign language, although non-native signers looked significantly more at the body than native signers. This distribution of gaze fixations suggested that deaf individuals - particularly native signers - mainly perceived signs through peripheral vision.

  20. Grammar Predicts Procedural Learning and Consolidation Deficits in Children with Specific Language Impairment

    PubMed Central

    Hedenius, Martina; Persson, Jonas; Tremblay, Antoine; Adi-Japha, Esther; Veríssimo, João; Dye, Cristina D.; Alm, Per; Jennische, Margareta; Tomblin, J. Bruce; Ullman, Michael T.

    2011-01-01

    The Procedural Deficit Hypothesis (PDH) posits that Specific Language Impairment (SLI) can be largely explained by abnormalities of brain structures that subserve procedural memory. The PDH predicts impairments of procedural memory itself, and that such impairments underlie the grammatical deficits observed in the disorder. Previous studies have indeed reported procedural learning impairments in SLI, and have found that these are associated with grammatical difficulties. The present study extends this research by examining the consolidation and longer-term procedural sequence learning in children with SLI. The Alternating Serial Reaction Time (ASRT) task was given to children with SLI and typically-developing (TD) children in an initial learning session and an average of three days later to test for consolidation and longer-term learning. Although both groups showed evidence of initial sequence learning, only the TD children showed clear signs of consolidation, even though the two groups did not differ in longer-term learning. When the children were re-categorized on the basis of grammar deficits rather than broader language deficits, a clearer pattern emerged. Whereas both the grammar impaired and normal grammar groups showed evidence of initial sequence learning, only those with normal grammar showed consolidation and longer-term learning. Indeed, the grammar-impaired group appeared to lose any sequence knowledge gained during the initial testing session. These findings held even when controlling for vocabulary or a broad non-grammatical language measure, neither of which were associated with procedural memory. When grammar was examined as a continuous variable over all children, the same relationships between procedural memory and grammar, but not vocabulary or the broader language measure, were observed. Overall, the findings support and further specify the PDH. 
They suggest that consolidation and longer-term procedural learning are impaired in SLI, but that these impairments are specifically tied to the grammatical deficits in the disorder. The possibility that consolidation and longer-term learning are problematic in the disorder suggests a locus of potential study for therapeutic approaches. In sum, this study clarifies our understanding of the underlying deficits in SLI, and suggests avenues for further research. PMID:21840165

  1. The effect of sign language structure on complex word reading in Chinese deaf adolescents.

    PubMed

    Lu, Aitao; Yu, Yanping; Niu, Jiaxin; Zhang, John X

    2015-01-01

    The present study was carried out to investigate whether sign language structure plays a role in the processing of complex words (i.e., derivational and compound words), in particular, the delay of complex word reading in deaf adolescents. Chinese deaf adolescents were found to respond faster to derivational words than to compound words for one-sign-structure words, but showed comparable performance for two-sign-structure words. For both derivational and compound words, response latencies to one-sign-structure words were shorter than to two-sign-structure words. These results provide strong evidence that the structure of sign language affects written word processing in Chinese. Additionally, differences between derivational and compound words in the one-sign-structure condition indicate that Chinese deaf adolescents acquire print morphological awareness. The results also showed that delayed word reading was found in derivational words with two signs (DW-2), compound words with one sign (CW-1), and compound words with two signs (CW-2), but not in derivational words with one sign (DW-1), with the delay being maximum in DW-2, medium in CW-2, and minimum in CW-1, suggesting that the structure of sign language has an impact on the delayed processing of Chinese written words in deaf adolescents. These results provide insight into the mechanisms by which sign language structure affects written word processing in deaf adolescents, and into why that processing is delayed relative to hearing peers of the same age.

  2. Research Ethics in Sign Language Communities

    ERIC Educational Resources Information Center

    Harris, Raychelle; Holmes, Heidi M.; Mertens, Donna M.

    2009-01-01

    Codes of ethics exist for most professional associations whose members do research on, for, or with sign language communities. However, these ethical codes are silent regarding the need to frame research ethics from a cultural standpoint, an issue of particular salience for sign language communities. Scholars who write from the perspective of…

  3. Comprehending Sentences with the Body: Action Compatibility in British Sign Language?

    ERIC Educational Resources Information Center

    Vinson, David; Perniss, Pamela; Fox, Neil; Vigliocco, Gabriella

    2017-01-01

    Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language (BSL) sentences, some of which explicitly encode direction of motion, versus written English, where motion…

  4. Sign Language and the Brain: A Review

    ERIC Educational Resources Information Center

    Campbell, Ruth; MacSweeney, Mairead; Waters, Dafydd

    2008-01-01

    How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing…

  5. Early visual language exposure and emergent literacy in preschool deaf children: findings from a national longitudinal study.

    PubMed

    Allen, Thomas E; Letteri, Amy; Choi, Song Hoa; Dang, Daqian

    2014-01-01

    Brief review is provided of recent research on the impact of early visual language exposure on a variety of developmental outcomes, including literacy, cognition, and social adjustment. This body of work points to the great importance of giving young deaf children early exposure to a visual language as a critical precursor to the acquisition of literacy. Four analyses of data from the Visual Language and Visual Learning (VL2) Early Education Longitudinal Study are summarized. Each confirms findings from previously published laboratory findings and points to the positive effects of early sign language on, respectively, letter knowledge, social adaptability, sustained visual attention, and cognitive-behavioral milestones necessary for academic success. The article concludes with a consideration of the qualitative similarity hypothesis and a finding that the hypothesis is valid, but only if it can be presented as being modality independent.

  6. Variation in handshape and orientation in British Sign Language: The case of the ‘1’ hand configuration

    PubMed Central

    Fenlon, Jordan; Schembri, Adam; Rentelis, Ramas; Cormier, Kearsy

    2013-01-01

    This paper investigates phonological variation in British Sign Language (BSL) signs produced with a ‘1’ hand configuration in citation form. Multivariate analyses of 2084 tokens reveal that handshape variation in these signs is constrained by linguistic factors (e.g., the preceding and following phonological environment, grammatical category, indexicality, lexical frequency). The only significant social factor was region. For the subset of signs where orientation was also investigated, only grammatical function was important (the surrounding phonological environment and social factors were not significant). The implications for an understanding of pointing signs in signed languages are discussed. PMID:23805018

  7. The effects of sign language on spoken language acquisition in children with hearing loss: a systematic review protocol.

    PubMed

    Fitzpatrick, Elizabeth M; Stevens, Adrienne; Garritty, Chantelle; Moher, David

    2013-12-06

    Permanent childhood hearing loss affects 1 to 3 per 1000 children and frequently disrupts typical spoken language acquisition. Early identification of hearing loss through universal newborn hearing screening and the use of new hearing technologies including cochlear implants make spoken language an option for most children. However, there is no consensus on what constitutes optimal interventions for children when spoken language is the desired outcome. Intervention and educational approaches ranging from oral language only to oral language combined with various forms of sign language have evolved. Parents are therefore faced with important decisions in the first months of their child's life. This article presents the protocol for a systematic review of the effects of using sign language in combination with oral language intervention on spoken language acquisition. Studies addressing early intervention will be selected in which therapy involving oral language intervention and any form of sign language or sign support is used. Comparison groups will include children in early oral language intervention programs without sign support. The primary outcomes of interest to be examined include all measures of auditory, vocabulary, language, speech production, and speech intelligibility skills. We will include randomized controlled trials, controlled clinical trials, and other quasi-experimental designs that include comparator groups as well as prospective and retrospective cohort studies. Case-control, cross-sectional, case series, and case studies will be excluded. Several electronic databases will be searched (for example, MEDLINE, EMBASE, CINAHL, PsycINFO) as well as grey literature and key websites. We anticipate that a narrative synthesis of the evidence will be required. We will carry out meta-analysis for outcomes if clinical similarity, quantity and quality permit quantitative pooling of data. 
    We will conduct subgroup analyses if possible according to severity/type of hearing disorder, age of identification, and type of hearing technology. This review will provide evidence on the effectiveness of using sign language in combination with oral language therapies for developing spoken language in children with hearing loss who are identified at a young age. The information from this review can provide guidance to parents and intervention specialists, inform policy decisions and provide directions for future research. Systematic review registration: PROSPERO CRD42013005426.

  8. Cross-Modal Recruitment of Auditory and Orofacial Areas During Sign Language in a Deaf Subject.

    PubMed

    Martino, Juan; Velasquez, Carlos; Vázquez-Bourgon, Javier; de Lucas, Enrique Marco; Gomez, Elsa

    2017-09-01

    Modern sign languages used by deaf people are fully expressive, natural human languages that are perceived visually and produced manually. The literature contains little data concerning human brain organization in conditions of deficient sensory information such as deafness. A deaf-mute patient underwent surgery of a left temporoinsular low-grade glioma. The patient underwent awake surgery with intraoperative electrical stimulation mapping, allowing direct study of the cortical and subcortical organization of sign language. We found a similar distribution of language sites to what has been reported in mapping studies of patients with oral language, including 1) speech perception areas inducing anomias and alexias close to the auditory cortex (at the posterior portion of the superior temporal gyrus and supramarginal gyrus); 2) speech production areas inducing speech arrest (anarthria) at the ventral premotor cortex, close to the lip motor area and away from the hand motor area; and 3) subcortical stimulation-induced semantic paraphasias at the inferior fronto-occipital fasciculus at the temporal isthmus. The intraoperative setup for sign language mapping with intraoperative electrical stimulation in deaf-mute patients is similar to the setup described in patients with oral language. To elucidate the type of language errors, a sign language interpreter in close interaction with the neuropsychologist is necessary. Sign language is perceived visually and produced manually; however, this case revealed a cross-modal recruitment of auditory and orofacial motor areas. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.
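    The core idea of the threshold model is that a candidate sign label is accepted only when its score beats an adaptive, per-frame threshold score rather than a fixed cutoff. The following toy sketch is not the paper's CRF: it stands in the mean of the vocabulary scores for the learned threshold label, and merges consecutive accepted frames into spotted segments. Sign names and scores are hypothetical.

```python
def spot_signs(frame_scores, margin=0.0):
    """Toy threshold-model spotting. frame_scores is a list of
    per-frame dicts mapping vocabulary signs to scores. A frame's
    best sign is accepted only if its score exceeds the per-frame
    threshold (here, the mean score, standing in for the CRF
    threshold label); otherwise the frame is labeled as a nonsign
    transition (None). Runs of the same accepted sign are merged
    into segments (start_frame, end_frame, sign)."""
    labels = []
    for scores in frame_scores:
        best = max(scores, key=scores.get)
        threshold = sum(scores.values()) / len(scores)
        labels.append(best if scores[best] > threshold + margin else None)

    # collapse consecutive identical labels into spotted segments
    segments, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            if labels[start] is not None:
                segments.append((start, i - 1, labels[start]))
            start = i
    return segments
```

    Because the threshold adapts to each frame's score distribution, ambiguous frames (where no sign clearly dominates) are rejected as nonsign patterns, which is the behavior the adaptive threshold is designed to produce.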

  10. Operationalization of Sign Language Phonological Similarity and its Effects on Lexical Access.

    PubMed

    Williams, Joshua T; Stone, Adam; Newman, Sharlene D

    2017-07-01

    Cognitive mechanisms for sign language lexical access are largely unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages, using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data in order to inform theories of how phonological similarity is constructed within the lexicon and to aid in the operationalization of phonological similarity in sign language. Sign repetition latencies and accuracy were obtained when native signers were asked to reproduce a sign displayed on a computer screen. Results indicated that, as predicted, phonological similarity facilitated repetition latencies and accuracy as long as there were no strict constraints on the type of sublexical features that overlapped. The data converged to suggest that one similarity measure, MaxD, defined as the overlap of any 4 sublexical features, likely best represents mechanisms of phonological similarity in the mental lexicon. Together, these data suggest that lexical access in sign language is facilitated by phonologically similar lexical representations in memory and that the optimal operationalization is defined by liberal constraints on overlap of 4 out of 5 sublexical features, similar to the majority of extant definitions in the literature. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
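    A minimal sketch of how a MaxD-style neighborhood metric could be operationalized, assuming the five standard sublexical parameters (handshape, location, movement, orientation, nondominant hand); the feature codings below are simplified illustrations, not entries from the ASL database used in the study:

```python
# Illustrative coding of signs by five sublexical features. Under a
# MaxD-style metric, two signs count as phonological neighbors when any
# 4 of the 5 features overlap. Feature values here are stand-ins.
FEATURES = ("handshape", "location", "movement", "orientation", "nondominant")

def overlap(sign_a: dict, sign_b: dict) -> int:
    """Count how many sublexical features two signs share."""
    return sum(sign_a[f] == sign_b[f] for f in FEATURES)

def are_neighbors(sign_a: dict, sign_b: dict, min_shared: int = 4) -> bool:
    """Liberal constraint: any min_shared of the five features overlap."""
    return overlap(sign_a, sign_b) >= min_shared

# MOTHER and FATHER in ASL differ mainly in location (chin vs. forehead).
mother = dict(zip(FEATURES, ["5", "chin", "tap", "palm-left", "none"]))
father = dict(zip(FEATURES, ["5", "forehead", "tap", "palm-left", "none"]))
print(are_neighbors(mother, father))  # True: 4 of 5 features shared
```

    A stricter metric would instead demand overlap on particular parameters (say, handshape and location); the study's result that facilitation held only without such strict constraints is what favors the liberal, any-4-of-5 formulation.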

  11. Building Languages

    MedlinePlus

    ... Support Services; Technology and Audiology; Medical and Surgical Solutions; Putting it all Together; Building Language; American Sign Language (ASL); Conceptually Accurate Signed English (CASE); Cued Speech; Finger Spelling; Listening/Auditory Training ...

  12. What sign language creation teaches us about language.

    PubMed

    Brentari, Diane; Coppola, Marie

    2013-03-01

    How do languages emerge? What are the necessary ingredients and circumstances that permit new languages to form? Various researchers within the disciplines of primatology, anthropology, psychology, and linguistics have offered different answers to these questions depending on their perspective. Language acquisition, language evolution, primate communication, and the study of spoken pidgins and creoles all address these issues, but in this article we describe a relatively new and important area that contributes to our understanding of language creation and emergence. The article focuses on three types of communication systems that use the hands and body to communicate: gesture, homesign systems, and sign languages. It explains why mapping the path from gesture to homesign to sign language has become an important research topic for understanding language emergence, not only for the field of sign languages but also for language in general. WIREs Cogn Sci 2013, 4:201-211. doi: 10.1002/wcs.1212 For further resources related to this article, please visit the WIREs website. Copyright © 2012 John Wiley & Sons, Ltd.

  13. Dynamical Integration of Language and Behavior in a Recurrent Neural Network for Human-Robot Interaction.

    PubMed

    Yamada, Tatsuro; Murata, Shingo; Arie, Hiroaki; Ogata, Tetsuya

    2016-01-01

    To work cooperatively with humans by using language, robots must not only acquire a mapping between language and their behavior but also autonomously utilize that mapping in the appropriate contexts of interactive tasks online. To this end, we propose a novel learning method that links language to robot behavior by means of a recurrent neural network. In this method, the network learns from correct examples of the imposed task that are given not as explicitly separated sets of language and behavior but as sequential data constructed from the actual temporal flow of the task. By doing this, the internal dynamics of the network model both the language-behavior relationships and the temporal patterns of interaction. Here, "internal dynamics" refers to the time development of the system defined on the fixed-dimensional space of the internal states of the context layer. Thus, in the execution phase, by constantly representing where in the interaction context it is as its current state, the network autonomously switches between recognition and generation phases without any explicit signs and utilizes the acquired mapping in appropriate contexts. To evaluate our method, we conducted an experiment in which a robot generates appropriate behavior in response to a human's linguistic instruction. After learning, the network indeed formed an attractor structure representing both the language-behavior relationships and the task's temporal pattern in its internal dynamics. In these dynamics, language-behavior mapping was achieved by a branching structure, repetition of the human's instruction and the robot's behavioral response was represented as a cyclic structure, and waiting for a subsequent instruction was represented as a fixed-point attractor. Thanks to this structure, the robot was able to interact online with a human on the given task by autonomously switching phases.
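    The fixed-dimensional context layer described above can be sketched with a minimal Elman-style recurrent update (pure Python; the dimensions and random weights are hypothetical and no training is shown): the state h is a single point in a fixed-dimensional space that each new input updates, so it can summarize the interaction history the trained network uses to switch phases.

```python
import math
import random

# Minimal Elman-style recurrent update. The context state h lives in a
# fixed-dimensional space; each input nudges it, so h carries the history
# of the interaction. Weights here are random (untrained), for shape only.
random.seed(0)
n_in, n_ctx = 4, 8
W_in = [[random.gauss(0, 0.5) for _ in range(n_in)] for _ in range(n_ctx)]
W_ctx = [[random.gauss(0, 0.5) for _ in range(n_ctx)] for _ in range(n_ctx)]

def step(h, x):
    """One update of the context layer from input x and previous state h."""
    return [math.tanh(sum(W_in[i][j] * x[j] for j in range(n_in)) +
                      sum(W_ctx[i][k] * h[k] for k in range(n_ctx)))
            for i in range(n_ctx)]

h = [0.0] * n_ctx
for _ in range(5):                      # a short input sequence
    x = [random.gauss(0, 1) for _ in range(n_in)]
    h = step(h, x)
print(len(h))  # the internal state stays 8-dimensional
```

    Training would shape these weights so that trajectories of h form the branching, cyclic, and fixed-point structures the abstract describes; the sketch only shows the state-update mechanics.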

  14. Lateralization of motor excitability during observation of bimanual signs.

    PubMed

    Möttönen, Riikka; Farmer, Harry; Watkins, Kate E

    2010-08-01

    Viewing another person's hand actions enhances excitability in an observer's left and right primary motor (M1) cortex. We aimed to determine whether viewing communicative hand actions alters this bilateral sensorimotor resonance. Using single-pulse transcranial magnetic stimulation (TMS), we measured excitability in the left and right M1 while right-handed non-signing participants observed bimanual communicative hand actions, i.e., meaningful signs in British Sign Language. TMS-induced motor evoked potentials were recorded from hand muscles during sign observation before and after teaching the participants to associate meanings with half of the signs. Before this teaching, when participants did not know that the presented hand actions were signs, excitability of left and right M1 was modulated equally. After learning the meanings of half the signs, excitability of the left, but not right, M1 was significantly enhanced. This left-lateralized enhancement of M1 excitability occurred during observation of signs with known and unknown meanings. The findings suggest that awareness of the communicative nature of another person's hand actions strengthens sensorimotor resonance in the left M1 cortex and alters hemispheric balance during action observation. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  15. Assessing language skills in adult key word signers with intellectual disabilities: Insights from sign linguistics.

    PubMed

    Grove, Nicola; Woll, Bencie

    2017-03-01

    Manual signing is one of the most widely used approaches to support the communication and language skills of children and adults who have intellectual or developmental disabilities, and problems with communication in spoken language. A recent series of papers reporting findings from this population raises critical issues for professionals in the assessment of multimodal language skills of key word signers. Approaches to assessment will differ depending on whether key word signing (KWS) is viewed as discrete from, or related to, natural sign languages. Two available assessments from these different perspectives are compared. Procedures appropriate to the assessment of sign language production are recommended as a valuable addition to the clinician's toolkit. Sign and speech need to be viewed as multimodal, complementary communicative endeavours, rather than as polarities. Whilst narrative has been shown to be a fruitful context for eliciting language samples, assessments for adult users should be designed to suit the strengths, needs and values of adult signers with intellectual disabilities, using materials that are compatible with their life course stage rather than those designed for young children. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. USSR Report, International Affairs

    DTIC Science & Technology

    1987-03-26

    international competitions for the best textbooks and reading books for learning foreign languages and belles lettres. It also recommends other...agreements on cooperation signed by Poland recently relate to the manufacture of several parts for motor vehicles produced in the Soviet Union, the...rigs over five years is envisaged. According to an agreement concluded with the Yerevan Motor -Vehicle Works, the Truck Plant imeni B. Berut in the

  17. A Psycho-Pragmatic Study of Self-Identity of Kurdish EFL Learners in Kurdistan Region

    ERIC Educational Resources Information Center

    Mahmood, Ayad Hameed; Hassan, Zana Mahmood

    2018-01-01

    This paper is an extract from a PhD dissertation on the impacts of learning English on the self-identity of Kurdish EFL learners. Language is a distinctive feature of human beings. Similarly, identity is a sign by which humans are recognized. So, scrutinizing the relationship between these two related components of human life is revealing.…

  18. The influence of the visual modality on language structure and conventionalization: insights from sign language and gesture.

    PubMed

    Perniss, Pamela; Özyürek, Asli; Morgan, Gary

    2015-01-01

    For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and (co-speech) gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles in this issue explore and document how gestures and sign languages are similar or different and how communicative expression in the visual modality can change from being gestural to grammatical in nature through processes of conventionalization. As such, this issue contributes to our understanding of how the visual modality shapes language and the emergence of linguistic structure in newly developing systems. Studying the relationship between signs and gestures provides a new window onto the human ability to recruit multiple levels of representation (e.g., categorical, gradient, iconic, abstract) in the service of using or creating conventionalized communicative systems. Copyright © 2015 Cognitive Science Society, Inc.

  19. Phonological Development in Hearing Learners of a Sign Language: The Influence of Phonological Parameters, Sign Complexity, and Iconicity

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    The present study administered a sign-repetition task at two points in time to hearing adult learners of British Sign Language and explored how each phonological parameter, sign complexity, and iconicity affected sign production over an 11-week (22-hour) instructional period. The results show that training improves articulation accuracy and that…

  20. Processing of Formational, Semantic, and Iconic Information in American Sign Language.

    ERIC Educational Resources Information Center

    Poizner, Howard; And Others

    1981-01-01

    Three experiments examined short-term encoding processes of deaf signers for different aspects of signs from American Sign Language. Results indicated that deaf signers code signs at one level in terms of linguistically significant formational parameters. The semantic and iconic information of signs, however, has little effect on short-term…

  1. Phonological Similarity in American Sign Language.

    ERIC Educational Resources Information Center

    Hildebrandt, Ursula; Corina, David

    2002-01-01

    Investigates deaf and hearing subjects' ratings of American Sign Language (ASL) signs to assess whether linguistic experience shapes judgments of sign similarity. Findings are consistent with linguistic theories that posit movement and location as core structural elements of syllable structure in ASL. (Author/VWL)

  2. Location, Location, Location

    ERIC Educational Resources Information Center

    Cates, Deborah; Gutiérrez, Eva; Hafer, Sarah; Barrett, Ryan; Corina, David

    2013-01-01

    This article presents an analysis of the relationship between sign structure and iconicity in American Sign Language. Historically, linguists have been pressured to downplay the role of form-meaning relationships (iconicity) in signed languages. However, recent inquiries into the role of traditional phonological parameters of signs (handshape,…

  3. Promotion in Times of Endangerment: The Sign Language Act in Finland

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    The development of sign language recognition legislation is a relatively recent phenomenon in the field of language policy. So far only few authors have documented signing communities' aspirations for recognition legislation, how they work with their governments to achieve legislation which most reflects these goals, and whether and why outcomes…

  4. Italian Sign Language (LIS) Poetry: Iconic Properties and Structural Regularities.

    ERIC Educational Resources Information Center

    Russo, Tommaso; Giuranna, Rosaria; Pizzuto, Elena

    2001-01-01

    Explores and describes from a crosslinguistic perspective, some of the major structural irregularities that characterize poetry in Italian Sign Language and distinguish poetic from nonpoetic texts. Reviews findings of previous studies of signed language poetry, and points out issues that need to be clarified to provide a more accurate description…

  5. Reading and American Sign Language: Strategies for Translation.

    ERIC Educational Resources Information Center

    Burkholder, Kim

    1999-01-01

    A hearing teacher for whom American Sign Language is a second language identifies nine strategies developed for reading and telling stories to deaf children. These include: ask obvious questions related to the story, portray written dialog as conversation, emphasize points by saying the same thing with different signs, and adapt the story to…

  6. Phonological Awareness for American Sign Language

    ERIC Educational Resources Information Center

    Corina, David P.; Hafer, Sarah; Welch, Kearnan

    2014-01-01

    This paper examines the concept of phonological awareness (PA) as it relates to the processing of American Sign Language (ASL). We present data from a recently developed test of PA for ASL and examine whether sign language experience impacts the use of metalinguistic routines necessary for completion of our task. Our data show that deaf signers…

  7. Technology to Support Sign Language for Students with Disabilities

    ERIC Educational Resources Information Center

    Donne, Vicki

    2013-01-01

    This systematic review of the literature provides a synthesis of research on the use of technology to support sign language. Background research on the use of sign language with students who are deaf/hard of hearing and students with low incidence disabilities, such as autism, intellectual disability, or communication disorders is provided. The…

  8. Lexical Properties of Slovene Sign Language: A Corpus-Based Study

    ERIC Educational Resources Information Center

    Vintar, Špela

    2015-01-01

    Slovene Sign Language (SZJ) has as yet received little attention from linguists. This article presents some basic facts about SZJ, its history, current status, and a description of the Slovene Sign Language Corpus and Pilot Grammar (SIGNOR) project, which compiled and annotated a representative corpus of SZJ. Finally, selected quantitative data…

  9. Proactive Interference & Language Change in Hearing Adult Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Kreske, Catherine M.

    1995-01-01

    Describes a study that found, contrary to previous reports, that a strong, symmetrical release from proactive interference (PI) is the normal outcome for switches between American Sign Language (ASL) signs and English words and with switches between Manual and English alphabet characters. Subjects were college students enrolled in their first ASL…

  10. The Bimodal Bilingual Brain: Effects of Sign Language Experience

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen

    2009-01-01

    Bimodal bilinguals are hearing individuals who know both a signed and a spoken language. Effects of bimodal bilingualism on behavior and brain organization are reviewed, and an fMRI investigation of the recognition of facial expressions by ASL-English bilinguals is reported. The fMRI results reveal separate effects of sign language and spoken…

  11. Understanding Communication among Deaf Students Who Sign and Speak: A Trivial Pursuit?

    ERIC Educational Resources Information Center

    Marschark, Marc; Convertino, Carol M.; Macias, Gayle; Monikowski, Christine M.; Sapere, Patricia; Seewagen, Rosemarie

    2007-01-01

    Classroom communication between deaf students was modeled using a question-and-answer game. Participants consisted of student pairs that relied on spoken language, pairs that relied on American Sign Language (ASL), and mixed pairs in which one student used spoken language and one signed. Although the task encouraged students to request…

  12. Neural systems underlying lexical retrieval for sign language.

    PubMed

    Emmorey, Karen; Grabowski, Thomas; McCullough, Stephen; Damasio, Hanna; Ponto, Laura L B; Hichwa, Richard D; Bellugi, Ursula

    2003-01-01

    Positron emission tomography was used to investigate whether signed languages exhibit the same neural organization for lexical retrieval within classical and non-classical language areas as has been described for spoken English. Ten deaf native American Sign Language (ASL) signers were shown pictures of unique entities (famous persons) and non-unique entities (animals) and were asked to name each stimulus with an overt signed response. Proper name signed responses to famous people were fingerspelled, and common noun responses to animals were both fingerspelled and signed with native ASL signs. In general, retrieving ASL signs activated neural sites similar to those activated by hearing subjects retrieving English words. Naming famous persons activated the left temporal pole (TP), whereas naming animals (whether fingerspelled or signed) activated left inferotemporal (IT) cortex. The retrieval of fingerspelled and native signs generally engaged the same cortical regions, but fingerspelled signs in addition activated a premotor region, perhaps due to the increased motor planning and sequencing demanded by fingerspelling. Native signs activated portions of the left supramarginal gyrus (SMG), an area previously implicated in the retrieval of phonological features of ASL signs. Overall, the findings indicate that similar neuroanatomical areas are involved in lexical retrieval for both signs and words. Copyright 2003 Elsevier Science Ltd.

  13. Deaf Mothers and Breastfeeding: Do Unique Features of Deaf Culture and Language Support Breastfeeding Success?

    PubMed Central

    Chin, Nancy P.; Cuculick, Jess; Starr, Matthew; Panko, Tiffany; Widanka, Holly; Dozier, Ann

    2014-01-01

    Background Deaf mothers who use American Sign Language (ASL) consider themselves a linguistic minority group, with specific cultural practices. Rarely has this group been engaged in infant-feeding research. Objectives To understand how ASL-using Deaf mothers learn about infant feeding and to identify their breastfeeding challenges. Methods Using a community-based participatory research (CBPR) approach we conducted four focus groups with Deaf mothers who had at least one child 0–5 years. A script was developed using a social ecological model (SEM) to capture multiple levels of influence. All groups were conducted in ASL, filmed, and transcribed into English. Deaf and hearing researchers analyzed data by coding themes within each SEM level. Results Fifteen mothers participated. All had initiated breastfeeding with their most recent child. Breastfeeding duration for eight of the mothers was three weeks to 12 months. Seven of the mothers were still breastfeeding, the longest for 19 months. Those mothers who breastfed longer described a supportive social environment and the ability to surmount challenges. Participants described characteristics of Deaf culture such as direct communication, sharing information, use of technologies, language access through interpreters and ASL-using providers, and strong self-advocacy skills. Finally, mothers used the sign ‘struggle’ to describe their breastfeeding experience. The sign implies a sustained effort over time which leads to success. Conclusions In a setting with a large population of Deaf women and ASL-using providers, we identified several aspects of Deaf culture and language which support BF mothers across institutional, community, and interpersonal levels of the SEM. PMID:23492762

  14. The gradual emergence of phonological form in a new language

    PubMed Central

    Aronoff, Mark; Meir, Irit; Padden, Carol

    2011-01-01

    The division of linguistic structure into a meaningless (phonological) level and a meaningful level of morphemes and words is considered a basic design feature of human language. Although established sign languages, like spoken languages, have been shown to be characterized by this bifurcation, no information has been available about the way in which such structure arises. We report here on a newly emerging sign language, Al-Sayyid Bedouin Sign Language, which functions as a full language but in which a phonological level of structure has not yet emerged. Early indications of formal regularities provide clues to the way in which phonological structure may develop over time. PMID:22223927

  15. The Signs B [Image Omitted] and B-Bent [Image Omitted] in Israeli Sign Language According to the Theory of Phonology as Human Behavior

    ERIC Educational Resources Information Center

    Fuks, Orit; Tobin, Yishai

    2008-01-01

    The purpose of the present research is to examine which of the two factors: (1) the iconic-semiotic factor; or (2) the human-phonetic factor is more relevant in explaining the appearance and distribution of the hand shape B-bent in Israeli Sign Language (ISL). The B-bent shape has been the subject of much attention in sign language research…

  16. Choosing Accommodations: Signed Language Interpreting and the Absence of Choice.

    PubMed

    Burke, Teresa Blankmeyer

    This paper carves out a topic space for discussion about the ethical question of whether input from signing Deaf consumers of interpreting services ought to be included in the provision of signed language interpreter accommodations. The first section provides background about disability accommodations and practices, including how signed language interpreting accommodations are similar and dissimilar to other kinds of disability accommodations. In the second section, I offer a personal narrative of my experience as a Deaf academic who has been excluded from the interpreter selection process, highlighting some of the harmful consequences of such exclusion. In the subsequent two sections, I describe and analyze the process of choosing interpreter accommodations, starting with the process of requesting signed language interpreters and the institutionalization of this process, followed by a brief overview of privacy and autonomy concerns from the standpoint of the signing Deaf consumer. The penultimate section considers some objections to the proposal of involving more consumer choice in signed language accommodations. I conclude the paper with some concrete suggestions for a more Deaf-centered, inclusive process for choosing interpreter accommodations.

  17. Three-dimensional grammar in the brain: Dissociating the neural correlates of natural sign language and manually coded spoken language.

    PubMed

    Jednoróg, Katarzyna; Bola, Łukasz; Mostowski, Piotr; Szwed, Marcin; Boguszewski, Paweł M; Marchewka, Artur; Rutkowski, Paweł

    2015-05-01

    In several countries, natural sign languages were considered inadequate for education. Instead, new sign-supported systems were created, based on the belief that spoken/written language is grammatically superior. One such system called SJM (system językowo-migowy) preserves the grammatical and lexical structure of spoken Polish and since the 1960s has been extensively employed in schools and on TV. Nevertheless, the Deaf community avoids using SJM for everyday communication, its preferred language being PJM (polski język migowy), a natural sign language, structurally and grammatically independent of spoken Polish and featuring classifier constructions (CCs). Here, for the first time, we use fMRI to compare the neural bases of natural vs. devised communication systems. Deaf signers were presented with three types of signed sentences (SJM and PJM with/without CCs). Consistent with previous findings, PJM with CCs compared to either SJM or PJM without CCs recruited the parietal lobes. The reverse comparison revealed activation in the anterior temporal lobes, suggesting increased semantic combinatory processes in lexical sign comprehension. Finally, PJM compared with SJM engaged the left posterior superior temporal gyrus and anterior temporal lobe, areas crucial for sentence-level speech comprehension. We suggest that activity in these two areas reflects greater processing efficiency for naturally evolved sign language. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Workplace Concepts in Sign and Text. A Computerized Sign Language Dictionary.

    ERIC Educational Resources Information Center

    Western Pennsylvania School for the Deaf, Pittsburgh.

    This document is a dictionary of essential vocabulary, signs, and illustrations of workplace activities to be used to train deaf or hearing-impaired adults. It contains more than 500 entries with workplace-relevant vocabulary, each including an illustration of the signed word or phrase in American Sign Language, a description of how to make the…

  19. The Verbal System of Catalan Sign Language (LSC)

    ERIC Educational Resources Information Center

    Morales-Lopez, Esperanza; Boldu-Menasanch, Rosa Maria; Alonso-Rodriguez, Jesus Amador; Gras-Ferrer, Victoria; Rodriguez-Gonzalez, Maria Angeles

    2005-01-01

    This article describes the predicative verbal system of Catalan Sign Language (LSC) as it is used by Deaf people in the province of Barcelona. We also present a historical perspective of the research on this topic, which provides insight into the changes that have taken place over the last few decades in sign language linguistics. The principal…

  20. Language and Literacy Acquisition through Parental Mediation in American Sign Language

    ERIC Educational Resources Information Center

    Bailes, Cynthia Neese; Erting, Lynne C.; Thumann-Prezioso, Carlene; Erting, Carol J.

    2009-01-01

    This longitudinal case study examined the language and literacy acquisition of a Deaf child as mediated by her signing Deaf parents during her first three years of life. Results indicate that the parents' interactions with their child were guided by linguistic and cultural knowledge that produced an intuitive use of child-directed signing (CDSi)…

  1. Identifying Movement Patterns and Severity of Associated Pain in Sign Language Interpreters

    ERIC Educational Resources Information Center

    Freeman, Julie K.; Rogers, Janet L.

    2010-01-01

    Our research sought to identify the most common movement patterns and postures performed by sign language interpreters and the frequency and severity of any pain that may be associated with the movements. A survey was developed and mailed to registered sign language interpreters throughout the state of Illinois. For each specific upper extremity…

  2. Historical Development of Hong Kong Sign Language

    ERIC Educational Resources Information Center

    Sze, Felix; Lo, Connie; Lo, Lisa; Chu, Kenny

    2013-01-01

    This article traces the origins of Hong Kong Sign Language (hereafter HKSL) and its subsequent development in relation to the establishment of Deaf education in Hong Kong after World War II. We begin with a detailed description of the history of Deaf education with a particular focus on the role of sign language in such development. We then…

  3. The Effect of New Technologies on Sign Language Research

    ERIC Educational Resources Information Center

    Lucas, Ceil; Mirus, Gene; Palmer, Jeffrey Levi; Roessler, Nicholas James; Frost, Adam

    2013-01-01

    This paper first reviews the fairly established ways of collecting sign language data. It then discusses the new technologies available and their impact on sign language research, both in terms of how data is collected and what new kinds of data are emerging as a result of technology. New data collection methods and new kinds of data are…

  4. Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability

    ERIC Educational Resources Information Center

    Lutalo-Kiingi, Sam; De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability" (Lutalo-Kiingi and De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck and P. V. Paul (Eds.) 2016). In this…

  5. Meemul Tziij: An Indigenous Sign Language Complex of Mesoamerica

    ERIC Educational Resources Information Center

    Tree, Erich Fox

    2009-01-01

    This article examines sign languages that belong to a complex of indigenous sign languages in Mesoamerica that K'iche'an Maya people of Guatemala refer to collectively as Meemul Tziij. It explains the relationship between the Meemul Tziij variety of the Yukatek Maya village of Chican (state of Yucatan, Mexico) and the hitherto undescribed Meemul…

  6. The Birth and Rebirth of "Sign Language Studies"

    ERIC Educational Resources Information Center

    Armstrong, David F.

    2012-01-01

    As most readers of this journal are aware, "Sign Language Studies" ("SLS") served for many years as effectively the only serious scholarly outlet for work in the nascent field of sign language linguistics. Now reaching its 40th anniversary, the journal was founded by William C. Stokoe and then edited by him for the first quarter century of its…

  7. American Sign Language Comprehension Test: A Tool for Sign Language Researchers

    ERIC Educational Resources Information Center

    Hauser, Peter C.; Paludneviciene, Raylene; Riddle, Wanda; Kurz, Kim B.; Emmorey, Karen; Contreras, Jessica

    2016-01-01

    The American Sign Language Comprehension Test (ASL-CT) is a 30-item multiple-choice test that measures ASL receptive skills and is administered through a website. This article describes the development and psychometric properties of the test based on a sample of 80 college students including deaf native signers, hearing native signers, deaf…

  8. Cross-Linguistic Differences in the Neural Representation of Human Language: Evidence from Users of Signed Languages

    PubMed Central

    Corina, David P.; Lawyer, Laurel A.; Cates, Deborah

    2013-01-01

    Studies of deaf individuals who are users of signed languages have provided profound insight into the neural representation of human language. Case studies of deaf signers who have incurred left- and right-hemisphere damage have shown that left-hemisphere resources are a necessary component of sign language processing. These data suggest that, despite frank differences in the input and output modality of language, core left perisylvian regions universally serve linguistic function. Neuroimaging studies of deaf signers have generally provided support for this claim. However, more fine-tuned studies of linguistic processing in deaf signers are beginning to show evidence of important differences in the representation of signed and spoken languages. In this paper, we provide a critical review of this literature and present compelling evidence for language-specific cortical representations in deaf signers. These data lend support to the claim that the neural representation of language may show substantive cross-linguistic differences. We discuss the theoretical implications of these findings with respect to an emerging understanding of the neurobiology of language. PMID:23293624

  9. Evaluating Effects of Language Recognition on Language Rights and the Vitality of New Zealand Sign Language

    ERIC Educational Resources Information Center

    McKee, Rachel Locker; Manning, Victoria

    2015-01-01

    Status planning through legislation made New Zealand Sign Language (NZSL) an official language in 2006. But this strong symbolic action did not create resources or mechanisms to further the aims of the act. In this article we discuss the extent to which legal recognition and ensuing language-planning activities by state and community have affected…

  10. Deaf Education Policy as Language Policy: A Comparative Analysis of Sweden and the United States

    ERIC Educational Resources Information Center

    Hult, Francis M.; Compton, Sarah E.

    2012-01-01

    The role of languages is a central issue in deaf education. The function of sign languages in education and deaf students' opportunities to develop linguistic abilities in both sign languages and the dominant language(s) of a society are key considerations (Hogan-Brun 2009; Reagan 2010, 53; Swanwick 2010a). Accordingly, what Kaplan and Baldauf…

  11. Semantic categorization: a comparison between deaf and hearing children.

    PubMed

    Ormel, Ellen A; Gijsel, Martine A R; Hermans, Daan; Bosman, Anna M T; Knoors, Harry; Verhoeven, Ludo

    2010-01-01

Learning to read is a major obstacle for children who are deaf. The otherwise significant role of phonology is often limited as a result of hearing loss. However, semantic knowledge may facilitate reading comprehension. One important aspect of semantic knowledge concerns semantic categorization. In the present study, the quality of the semantic categorization of both deaf and hearing children was examined for written words and pictures at two categorization levels. The deaf children performed better in the picture condition than in the written word condition, whereas the hearing children performed similarly on pictures and written words. The hearing children outperformed the deaf children, in particular for written words. In addition, the deaf children's results for the written words correlated with their sign vocabulary and sign language comprehension. The increase in semantic categorization was limited across elementary school grade levels. Readers will be able to: (1) understand several semantic categorization differences between groups of deaf and hearing children; (2) describe factors that may affect the development of semantic categorization, in particular the relationship between sign language skills and semantic categorization for deaf children.

  12. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    PubMed

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  13. "Thinking-for-Writing": A Prolegomenon on Writing Signed Languages.

    PubMed

    Rosen, Russell S; Hartman, Maria C; Wang, Ye

    2017-01-01

In his article in this American Annals of the Deaf special issue that also includes the present article, Grushkin argues that the writing difficulties of many deaf and hard of hearing children result primarily from the orthographic nature of the writing system; he proposes a new system based on features found in signed languages. In response, the present authors review the literature on D/HH children's writing difficulties, outline the main precepts of and assumptions about writing signed languages, discuss "thinking-for-writing" as a process in developing writing skills, offer research designs to test the effectiveness of writing signed language systems, and provide strategies for adopting "thinking-for-writing" in education. They conclude that until empirical studies show that writing signed languages effectively reflects writers' "thinking-for-writing," the alphabetic orthographic system of English should still be used, and ways should be found to teach D/HH children to use English writing to express their thoughts.

  14. Students who are deaf and hard of hearing and use sign language: considerations and strategies for developing spoken language and literacy skills.

    PubMed

    Nussbaum, Debra; Waddy-Smith, Bettie; Doyle, Jane

    2012-11-01

There is a core body of knowledge, experience, and skills integral to facilitating auditory, speech, and spoken language development when working with the general population of students who are deaf and hard of hearing. There are additional issues, strategies, and challenges inherent in speech habilitation/rehabilitation practices essential to the population of deaf and hard of hearing students who also use sign language. This article will highlight philosophical and practical considerations related to practices used to facilitate spoken language development and associated literacy skills for children and adolescents who sign. It will discuss considerations for planning and implementing practices that acknowledge and utilize a student's abilities in sign language, and address how to link these skills to developing and using spoken language. Included will be considerations for children from early childhood through high school with a broad range of auditory access, language, and communication characteristics.

  15. The psychotherapist and the sign language interpreter.

    PubMed

    de Bruin, Ed; Brugmans, Petra

    2006-01-01

Specialized psychotherapy for deaf people in the Dutch and Western European mental health systems is still a rather young specialism. A key policy principle in Dutch mental health care for the deaf is that they should receive treatment in the language most accessible to them, which is usually Dutch Sign Language (Nederlandse Gebarentaal or NGT). Although psychotherapists for the deaf are trained to use sign language, situations will always arise in which a sign language interpreter is needed. Most psychotherapists hold the opinion that working with a sign language interpreter in therapy sessions can be a valuable alternative but also see it as a second-best solution because of its impact on the therapeutic process. This paper describes our years of collaboration as a therapist and a sign language interpreter. If this collaboration is optimal, it can generate a certain "therapeutic power" in the therapy sessions. Achieving this depends largely on the interplay between the therapist and the interpreter, which in our case is the result of literature research and our experiences during the last 17 years. We analyze this special collaborative relationship, which has several dimensions and recurrent themes, such as the interpreter's conception of their role, situational interpreting, organizing the interpreting setting, and managing therapeutic phenomena during therapy sessions.

  16. Sign Vocabulary in Deaf Toddlers Exposed to Sign Language since Birth

    ERIC Educational Resources Information Center

    Rinaldi, Pasquale; Caselli, Maria Cristina; Di Renzo, Alessio; Gulli, Tiziana; Volterra, Virginia

    2014-01-01

Lexical comprehension and production are directly evaluated for the first time in deaf signing children below the age of 3 years. A Picture Naming Task was administered to 8 deaf signing toddlers (aged 2-3 years) who had been exposed to Sign Language from birth. Results were compared with data from hearing speaking controls. In both deaf and hearing…

  17. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices

    PubMed Central

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2015-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions. PMID:25713541

  18. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices.

    PubMed

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2014-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions.

  19. A dictionary of Astronomy for the French Sign Language (LSF)

    NASA Astrophysics Data System (ADS)

    Proust, Dominique; Abbou, Daniel; Chab, Nasro

    2011-06-01

For several years now, the French deaf community has had access to astronomy at the Paris-Meudon observatory through specific teaching adapted to French Sign Language (Langue des Signes Française, LSF), including direct observations with the observatory telescopes. From this experience, an encyclopedic dictionary of astronomy, The Hands in the Stars, is now available, containing more than 200 astronomical concepts. Many of these did not exist in Sign Language and can now be fully expressed and explained.

  20. Articulatory Suppression Effects on Short-Term Memory of Signed Digits and Lexical Items in Hearing Bimodal-Bilingual Adults

    ERIC Educational Resources Information Center

    Liu, Hsiu Tan; Squires, Bonita; Liu, Chun Jung

    2016-01-01

    We can gain a better understanding of short-term memory processes by studying different language codes and modalities. Three experiments were conducted to investigate: (a) Taiwanese Sign Language (TSL) digit spans in Chinese/TSL hearing bilinguals (n = 32); (b) American Sign Language (ASL) digit spans in English/ASL hearing bilinguals (n = 15);…

  1. The Link between Form and Meaning in British Sign Language: Effects of Iconicity for Phonological Decisions

    ERIC Educational Resources Information Center

    Thompson, Robin L.; Vinson, David P.; Vigliocco, Gabriella

    2010-01-01

    Signed languages exploit the visual/gestural modality to create iconic expression across a wide range of basic conceptual structures in which the phonetic resources of the language are built up into an analogue of a mental image (Taub, 2001). Previously, we demonstrated a processing advantage when iconic properties of signs were made salient in a…

  2. Evidence for Website Claims about the Benefits of Teaching Sign Language to Infants and Toddlers with Normal Hearing

    ERIC Educational Resources Information Center

    Nelson, Lauri H.; White, Karl R.; Grewe, Jennifer

    2012-01-01

    The development of proficient communication skills in infants and toddlers is an important component to child development. A popular trend gaining national media attention is teaching sign language to babies with normal hearing whose parents also have normal hearing. Thirty-three websites were identified that advocate sign language for hearing…

  3. The Effect of Sign Language Rehearsal on Deaf Subjects' Immediate and Delayed Recall of English Word Lists.

    ERIC Educational Resources Information Center

    Bonvillian, John D.; And Others

    1987-01-01

    The relationship between sign language rehearsal and written free recall was examined by having deaf college students rehearse the sign language equivalents of printed English words. Studies of both immediate and delayed memory suggested that word recall increased as a function of total rehearsal frequency and frequency of appearance in rehearsal…

  4. Constructing an Online Test Framework, Using the Example of a Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias; Herman, Rosalind; Woll, Bencie

    2015-01-01

    This paper presents the features of an online test framework for a receptive skills test that has been adapted, based on a British template, into different sign languages. The online test includes features that meet the needs of the different sign language versions. Features such as usability of the test, automatic saving of scores, and score…

  5. Gesture in Multiparty Interaction: A Study of Embodied Discourse in Spoken English and American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily P.

    2013-01-01

    This dissertation is an examination of gesture in two game nights: one in spoken English between four hearing friends and another in American Sign Language between four Deaf friends. Analyses of gesture have shown there exists a complex integration of manual gestures with speech. Analyses of sign language have implicated the body as a medium…

  6. The British Sign Language Variant of Stokoe Notation: Report on a Type-Design Project.

    ERIC Educational Resources Information Center

    Thoutenhoofd, Ernst

    2003-01-01

    Explores the outcome of a publicly-funded research project titled "Redesign of the British Sign Language (BSL) Notation System with a New Font for Use in ICT." The aim of the project was to redesign the British Sign Language variant of Stokoe notation for practical use in information technology systems and software, such as lexical…

  7. Deaf Students' Receptive and Expressive American Sign Language Skills: Comparisons and Relations

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.

    2014-01-01

    This article presents receptive and expressive American Sign Language skills of 85 students, 6 through 22 years of age at a residential school for the deaf using the American Sign Language Receptive Skills Test and the Ozcaliskan Motion Stimuli. Results are presented by ages and indicate that students' receptive skills increased with age and…

  8. The physiognomic unity of sign, word, and gesture.

    PubMed

    Cornejo, Carlos; Musa, Roberto

    2017-01-01

    Goldin-Meadow & Brentari (G-M&B) are implicitly going against the dominant paradigm in language research, namely, the "speech as written language" metaphor that portrays vocal sounds and bodily signs as means of delivering stable word meanings. We argue that Heinz Werner's classical research on the physiognomic properties of language supports and complements their view of sign and gesture as a unified system.

  9. Bilingual Word Recognition in Deaf and Hearing Signers: Effects of Proficiency and Language Dominance on Cross-Language Activation

    ERIC Educational Resources Information Center

    Morford, Jill P.; Kroll, Judith F.; Piñar, Pilar; Wilkinson, Erin

    2014-01-01

    Recent evidence demonstrates that American Sign Language (ASL) signs are active during print word recognition in deaf bilinguals who are highly proficient in both ASL and English. In the present study, we investigate whether signs are active during print word recognition in two groups of unbalanced bilinguals: deaf ASL-dominant and hearing…

  10. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation

    PubMed Central

    Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients. PMID:28947881

  11. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation.

    PubMed

    Rajendra Santosh, Arvind Babu; Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients.

  12. On the linguistic status of ‘agreement’ in sign languages

    PubMed Central

    LILLO-MARTIN, DIANE; MEIER, RICHARD P.

    2013-01-01

    In signed languages, the arguments of verbs can be marked by a system of verbal modification that has been termed “agreement” (more neutrally, “directionality”). Fundamental issues regarding directionality remain unresolved and the phenomenon has characteristics that call into question its analysis as agreement. We conclude that directionality marks person in American Sign Language, and the ways person marking interacts with syntactic phenomena are largely analogous to morpho-syntactic properties of familiar agreement systems. Overall, signed languages provide a crucial test for how gestural and linguistic mechanisms can jointly contribute to the satisfaction of fundamental aspects of linguistic structure. PMID:23495262

  13. Order of the major constituents in sign languages: implications for all language

    PubMed Central

    Napoli, Donna Jo; Sutton-Spence, Rachel

    2014-01-01

    A survey of reports of sign order from 42 sign languages leads to a handful of generalizations. Two accounts emerge, one amodal and the other modal. We argue that universal pressures are at work with respect to some generalizations, but that pressure from the visual modality is at work with respect to others. Together, these pressures conspire to make all sign languages order their major constituents SOV or SVO. This study leads us to the conclusion that the order of S with regard to verb phrase (VP) may be driven by sensorimotor system concerns that feed universal grammar. PMID:24860523

  14. Iconicity as a General Property of Language: Evidence from Spoken and Signed Languages

    PubMed Central

    Perniss, Pamela; Thompson, Robin L.; Vigliocco, Gabriella

    2010-01-01

Current views about language are dominated by the idea of arbitrary connections between linguistic form and meaning. However, if we look beyond the more familiar Indo-European languages and also include both spoken and signed language modalities, we find that motivated, iconic form-meaning mappings are, in fact, pervasive in language. In this paper, we review the different types of iconic mappings that characterize languages in both modalities, including the predominantly visually iconic mappings found in signed languages. Having shown that iconic mappings are present across languages, we then proceed to review evidence showing that language users (signers and speakers) exploit iconicity in language processing and language acquisition. While not discounting the presence and importance of arbitrariness in language, we put forward the idea that iconicity also needs to be recognized as a general property of language, which may serve the function of reducing the gap between linguistic form and conceptual representation to allow the language system to "hook up" to motor, perceptual, and affective experience. PMID:21833282

  15. Input Processing at First Exposure to a Sign Language

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    There is growing interest in learners' cognitive capacities to process a second language (L2) at first exposure to the target language. Evidence suggests that L2 learners are capable of processing novel words by exploiting phonological information from their first language (L1). Hearing adult learners of a sign language, however, cannot fall back…

  16. A Case of Specific Language Impairment in a Deaf Signer of American Sign Language

    ERIC Educational Resources Information Center

    Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C.

    2017-01-01

    This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…

  17. Bilingual Education for Deaf Children in Sweden

    ERIC Educational Resources Information Center

    Svartholm, Kristina

    2010-01-01

    In 1981, Swedish Sign Language gained recognition by the Swedish Parliament as the language of deaf people, a decision that made Sweden the first country in the world to give a sign language the status of a language. Swedish was designated as a second language for deaf people, and the need for bilingualism among them was officially asserted. This…

  18. Functional changes in people with different hearing status and experiences of using Chinese sign language: an fMRI study.

    PubMed

    Li, Qiang; Xia, Shuang; Zhao, Fei; Qi, Ji

    2014-01-01

The purpose of this study was to assess functional changes in the cerebral cortex in people with different sign language experience and hearing status whilst observing and imitating Chinese Sign Language (CSL) using functional magnetic resonance imaging (fMRI). 50 participants took part in the study and were divided into four groups according to their hearing status and experience of using sign language: a prelingual deafness signer group (PDS), a normal-hearing non-signer group (HnS), a native signer group with normal hearing (HNS), and an acquired signer group with normal hearing (HLS). fMRI images were scanned from all subjects while they performed block-designed tasks that involved observing and imitating sign language stimuli. Nine activation areas were found in response to undertaking either the observation or the imitation CSL task, and three activated areas were found only during the imitation task. Of those, the PDS group had significantly greater activation areas, in terms of the cluster size of the activated voxels, in the bilateral superior parietal lobule, cuneate lobe, and lingual gyrus in response to undertaking either the observation or the imitation CSL task than the HnS, HNS, and HLS groups. The PDS group also showed significantly greater activation in the bilateral inferior frontal gyrus, which was also found in the HNS and HLS groups but not in the HnS group. This indicates that deaf signers have better sign language proficiency, because they engage more actively with the phonetic and semantic elements. In addition, activation of the bilateral superior temporal gyrus and inferior parietal lobule was found only in the PDS and HNS groups, and not in the other two groups, which indicates that the area for sign language processing appears to be sensitive to the age of language acquisition. After reading this article, readers will be able to discuss the relationship between sign language and its neural mechanisms.

  19. Static sign language recognition using 1D descriptors and neural networks

    NASA Astrophysics Data System (ADS)

    Solís, José F.; Toxqui, Carina; Padilla, Alfonso; Santiago, César

    2012-10-01

A framework for static sign language recognition using descriptors that represent 2D images as 1D data, together with artificial neural networks, is presented in this work. The 1D descriptors were computed by two methods: the first consists of a correlation rotational operator, and the second is based on contour analysis of the hand shape. One of the main problems in sign language recognition is segmentation; most papers report a special color for gloves or background to enable hand-shape analysis. In order to avoid the use of gloves or special clothing, a thermal imaging camera was used to capture the images. The static signs were the digits 1 to 9 of American Sign Language; a multilayer perceptron reached 100% recognition with cross-validation.
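    The pipeline this record describes (segmented binary hand image → 1D contour descriptor → neural-network classifier) can be sketched roughly as follows. Everything below is an illustrative assumption rather than the authors' implementation: synthetic circle/square masks stand in for segmented thermal hand images, the descriptor is a centroid-to-contour distance signature sampled over angle, and the classifier is a small hand-rolled multilayer perceptron.

    ```python
    import numpy as np

    def shape_image(kind, size=64, r=20, jitter=0.0, rng=None):
        """Synthetic binary mask standing in for a segmented hand shape."""
        rng = rng if rng is not None else np.random.default_rng(0)
        yy, xx = np.mgrid[0:size, 0:size]
        cy = size / 2 + rng.uniform(-jitter, jitter)
        cx = size / 2 + rng.uniform(-jitter, jitter)
        if kind == "circle":
            return ((yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2).astype(np.uint8)
        return ((np.abs(yy - cy) <= r) & (np.abs(xx - cx) <= r)).astype(np.uint8)

    def contour_signature(img, n=32):
        """1D descriptor: centroid-to-contour distance resampled over angle."""
        ys, xs = np.nonzero(img)
        cy, cx = ys.mean(), xs.mean()
        pad = np.pad(img, 1)
        # interior pixels have all four 4-neighbours set; the rest is the contour
        interior = pad[:-2, 1:-1] & pad[2:, 1:-1] & pad[1:-1, :-2] & pad[1:-1, 2:]
        by, bx = np.nonzero(img & (1 - interior))
        ang = np.arctan2(by - cy, bx - cx)
        dist = np.hypot(by - cy, bx - cx)
        order = np.argsort(ang)
        grid = np.linspace(-np.pi, np.pi, n, endpoint=False)
        sig = np.interp(grid, ang[order], dist[order], period=2 * np.pi)
        return sig / sig.max()  # normalize for scale invariance

    def train_mlp(X, y, hidden=8, lr=0.5, epochs=2000, seed=0):
        """One-hidden-layer sigmoid MLP trained with batch gradient descent."""
        rng = np.random.default_rng(seed)
        W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
        W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
        sig = lambda z: 1.0 / (1.0 + np.exp(-z))
        for _ in range(epochs):
            H = sig(X @ W1 + b1)
            p = sig(H @ W2 + b2).ravel()
            g2 = (p - y)[:, None] / len(y)   # logistic-loss gradient at output
            gH = (g2 @ W2.T) * H * (1 - H)   # backprop through hidden layer
            W2 -= lr * (H.T @ g2); b2 -= lr * g2.sum(0)
            W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)
        return lambda Q: (sig(sig(Q @ W1 + b1) @ W2 + b2).ravel() > 0.5).astype(int)

    # Train on jittered shapes, then check accuracy on the training set.
    rng = np.random.default_rng(1)
    X = np.array([contour_signature(shape_image(k, jitter=3.0, rng=rng))
                  for k in ["circle"] * 10 + ["square"] * 10])
    y = np.array([0] * 10 + [1] * 10)
    predict = train_mlp(X, y)
    print("train accuracy:", (predict(X) == y).mean())
    ```

    The centroid-distance signature is nearly flat for a circle and periodic for a square, so even this tiny network separates the two classes; for real hand shapes the descriptor length and network size would need tuning.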

  20. Bilateral Cochlear Implants: Maximizing Expected Outcomes.

    PubMed

    Wallis, Kate E; Blum, Nathan J; Waryasz, Stephanie A; Augustyn, Marilyn

Sonia is a 4-year 1-month-old girl with Waardenburg syndrome and bilateral sensorineural hearing loss who received bilateral cochlear implants at 2 years 7 months of age. She is referred to Developmental-Behavioral Pediatrics by her speech/language pathologist because of concerns that her language skills are not progressing as expected after the cochlear implants. At the time of the implants, she communicated using approximately 20 signs and 1 spoken word (mama). At the time of the evaluation (18 months after implantation) she had approximately 70 spoken words (English and Spanish) and innumerable signs that she used to communicate. She could follow 1-step directions in English but had more difficulty with 2-step directions. Sonia was born in Puerto Rico at 40 weeks gestation after an uncomplicated pregnancy. She failed her newborn hearing test and was given hearing aids that did not seem to help. At age 2 years, Sonia, her mother, and her younger sister moved to the United States, where she was diagnosed with bilateral severe-to-profound hearing loss. Genetic testing led to a diagnosis of Waardenburg syndrome (a group of genetic conditions that can cause hearing loss and changes in the coloring [pigmentation] of the hair, skin, and eyes). She received bilateral cochlear implants 6 months later. Sonia's mother is primarily Spanish-speaking and mostly communicates with her in Spanish or with gestures but has recently begun to learn American Sign Language (ASL). In a preschool program at a specialized school for the deaf, Sonia is learning both English and ASL. Sonia seems to prefer to use ASL to communicate. Sonia receives speech and language therapy (SLT) 3 times per week (90 minutes total) individually in school and once per week within a group. She is also receiving outpatient SLT once per week. Therapy sessions are conducted in English, with the aid of an ASL interpreter. Sonia's language scores remain low, with her receptive skills in the first percentile and her expressive skills in the fifth percentile. During her evaluation in Developmental and Behavioral Pediatrics, an ASL interpreter was present, and the examiner was a fluent Spanish speaker. Testing was completed through a combination of English, Spanish, and ASL. Sonia seemed to prefer ASL to communicate, although she used some English words with errors of pronunciation. On the Beery Visual-Motor Integration Test, she obtained a standard score of 95. Parent and teacher rating scales were not significant for symptoms of attention-deficit/hyperactivity disorder. What factors are contributing to her slow language acquisition, and how would you modify her treatment plan?

  1. Australian Aboriginal Deaf People and Aboriginal Sign Language

    ERIC Educational Resources Information Center

    Power, Des

    2013-01-01

    Many Australian Aboriginal people use a sign language ("hand talk") that mirrors their local spoken language and is used both in culturally appropriate settings when speech is taboo or contraindicated and for community communication. The characteristics of these languages are described, and early European settlers' reports of deaf…

  2. Sign Language and Pantomime Production Differentially Engage Frontal and Parietal Cortices

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Ponto, Laura L. B.; Grabowski, Thomas J.

    2011-01-01

    We investigated the functional organisation of neural systems supporting language production when the primary language articulators are also used for meaningful, but nonlinguistic, expression such as pantomime. Fourteen hearing nonsigners and 10 deaf native users of American Sign Language (ASL) participated in an H[subscript 2][superscript…

  3. Uncovering Translingual Practices in Teaching Parents Classical ASL Varieties

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2017-01-01

    The view of sign languages as bounded systems is often important for deaf community empowerment and for pedagogical practice in terms of supporting deaf children's language acquisition and second language learners' communicative competence. Conversely, the notion of translanguaging in the American Sign Language (ASL) community highlights a number…

  4. The neural correlates of highly iconic structures and topographic discourse in French Sign Language as observed in six hearing native signers.

    PubMed

    Courtin, C; Hervé, P-Y; Petit, L; Zago, L; Vigneau, M; Beaucousin, V; Jobard, G; Mazoyer, B; Mellet, E; Tzourio-Mazoyer, N

    2010-09-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language-specific manner via the use of signing-space and spatial-classifier signs. We used functional magnetic resonance imaging (fMRI) to compare the neural correlates of topographic discourse and highly iconic structures in French Sign Language (LSF) in six hearing native signers, children of deaf adults (CODAs), and six LSF-naïve monolinguals. LSF materials consisted of videos of a lecture excerpt signed without spatially organized discourse or highly iconic structures (Lect LSF), a tale signed using highly iconic structures (Tale LSF), and a topographical description using a diagrammatic format and spatial-classifier signs (Topo LSF). We also presented texts in spoken French (Lect French, Tale French, Topo French) to all participants. With both languages, the Topo texts activated several different regions that are involved in mental navigation and spatial working memory. No specific correlate of LSF spatial discourse was evidenced. The same regions were more activated during Tale LSF than Lect LSF in CODAs, but not in monolinguals, in line with the presence of signing-space structure in both conditions. Motion processing areas and parts of the fusiform gyrus and precuneus were more active during Tale LSF in CODAs; no such effect was observed with French or in LSF-naïve monolinguals. These effects may be associated with perspective-taking and acting during personal transfers. 2010 Elsevier Inc. All rights reserved.

  5. A New Kind of Heterogeneity: What We Can Learn From d/Deaf and Hard of Hearing Multilingual Learners.

    PubMed

    Cannon, Joanna E; Guardino, Caroline; Gallimore, Erin

    2016-01-01

    The present article introduces a special issue of the American Annals of the Deaf. Students who are d/Deaf or hard of hearing and come from homes where a language other than English or American Sign Language is used constitute 19.4%-35.0% of the U.S. d/Dhh population (Gallaudet Research Institute, 2013). The authors propose moving beyond the standardized use of the designation English Language Learners to embrace terminology encompassing these learners as diverse and rich in language: d/Dhh Multilingual Learners (DMLs). The authors present (a) a discussion of terminology, (b) an overview of available demographic data, (c) a synopsis of the special issue, (d) themes across three case study vignettes, and (e) overall recommendations to advance curriculum design and pedagogy for DMLs. Questions are posed challenging researchers and practitioners to investigate theory, research, and pedagogy that can enhance practice with DMLs and their families.

  6. Early Sign Language Experience Goes along with an Increased Cross-Modal Gain for Affective Prosodic Recognition in Congenitally Deaf CI Users

    ERIC Educational Resources Information Center

    Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte

    2018-01-01

    It is yet unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history in sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and…

  7. Visual Iconicity Across Sign Languages: Large-Scale Automated Video Analysis of Iconic Articulators and Locations

    PubMed Central

    Östling, Robert; Börstell, Carl; Courtaux, Servane

    2018-01-01

    We use automatic processing of 120,000 sign videos in 31 different sign languages to show a cross-linguistic pattern for two types of iconic form–meaning relationships in the visual modality. First, we demonstrate that the degree of inherent plurality of concepts, based on individual ratings by non-signers, strongly correlates with the number of hands used in the sign forms encoding the same concepts across sign languages. Second, we show that certain concepts are iconically articulated around specific parts of the body, as predicted by the associational intuitions of non-signers. The implications of our results are both theoretical and methodological. With regard to theoretical implications, we corroborate previous research by demonstrating and quantifying, using a much larger body of material than previously available, the iconic nature of languages in the visual modality. As for the methodological implications, we show how automatic methods are, in fact, useful for performing large-scale analysis of sign language data, to a high level of accuracy, as indicated by our manual error analysis. PMID:29867684
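    The correlation claim in this record can be made concrete. The sketch below computes Pearson's r between per-concept plurality ratings and mean number of hands, mirroring the kind of analysis described; the data values and the function name are invented for illustration and are not taken from the study.

    ```python
    # Pearson's r between two equal-length numeric sequences (pure stdlib).
    from math import sqrt

    def pearson_r(xs, ys):
        """Pearson product-moment correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sqrt(sum((x - mx) ** 2 for x in xs))
        sy = sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical data: non-signer plurality ratings per concept, and the
    # mean number of hands used for that concept across sign languages.
    plurality_rating = [1.2, 2.0, 3.5, 4.1, 4.8]
    mean_hands       = [1.0, 1.1, 1.6, 1.8, 2.0]
    ```

    On the hypothetical data above, `pearson_r(plurality_rating, mean_hands)` yields a strong positive correlation, the pattern the study reports at a far larger scale.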

  8. When does a system become phonological? Handshape production in gesturers, signers, and homesigners

    PubMed Central

    Coppola, Marie; Mazzoni, Laura; Goldin-Meadow, Susan

    2013-01-01

    Sign languages display remarkable crosslinguistic consistencies in the use of handshapes. In particular, handshapes used in classifier predicates display a consistent pattern in finger complexity: classifier handshapes representing objects display more finger complexity than those representing how objects are handled. Here we explore the conditions under which this morphophonological phenomenon arises. In Study 1, we ask whether hearing individuals in Italy and the United States, asked to communicate using only their hands, show the same pattern of finger complexity found in the classifier handshapes of two sign languages: Italian Sign Language (LIS) and American Sign Language (ASL). We find that they do not: gesturers display more finger complexity in handling handshapes than in object handshapes. The morphophonological pattern found in conventional sign languages is therefore not a codified version of the pattern invented by hearing individuals on the spot. In Study 2, we ask whether continued use of gesture as a primary communication system results in a pattern that is more similar to the morphophonological pattern found in conventional sign languages or to the pattern found in gesturers. Homesigners have not acquired a signed or spoken language and instead use a self-generated gesture system to communicate with their hearing family members and friends. We find that homesigners pattern more like signers than like gesturers: their finger complexity in object handshapes is higher than that of gesturers (indeed as high as signers); and their finger complexity in handling handshapes is lower than that of gesturers (but not quite as low as signers). Generally, our findings indicate two markers of the phonologization of handshape in sign languages: increasing finger complexity in object handshapes, and decreasing finger complexity in handling handshapes. 
These first indicators of phonology appear to be present in individuals developing a gesture system without benefit of a linguistic community. Finally, we propose that iconicity, morphology and phonology each play an important role in the system of sign language classifiers to create the earliest markers of phonology at the morphophonological interface. PMID:23723534

  9. Syntactic priming in American Sign Language.

    PubMed

    Hall, Matthew L; Ferreira, Victor S; Mayberry, Rachel I

    2015-01-01

    Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL). Experiment 1 shows that second language (L2) signers with normal hearing exhibit syntactic priming in ASL and that priming is stronger when the head noun is repeated between prime and target (the lexical boost effect). Experiment 2 shows that syntactic priming is equally strong among deaf native L1 signers, deaf late L1 learners, and hearing L2 signers. Experiment 2 also tested for, but did not find evidence of, phonological or semantic boosts to syntactic priming in ASL. These results show that despite the profound differences between spoken and signed languages in terms of how they are produced and perceived, the psychological representation of sentence structure (as assessed by syntactic priming) operates similarly in sign and speech.

  10. "We Communicated That Way for a Reason": Language Practices and Language Ideologies among Hearing Adults Whose Parents Are Deaf

    ERIC Educational Resources Information Center

    Pizer, Ginger; Walters, Keith; Meier, Richard P.

    2013-01-01

    Families with deaf parents and hearing children are often bilingual and bimodal, with both a spoken language and a signed one in regular use among family members. When interviewed, 13 American hearing adults with deaf parents reported widely varying language practices, sign language abilities, and social affiliations with Deaf and Hearing…

  11. Psychometric properties of a sign language version of the Mini International Neuropsychiatric Interview (MINI).

    PubMed

    Øhre, Beate; Saltnes, Hege; von Tetzchner, Stephen; Falkum, Erik

    2014-05-22

    There is a need for psychiatric assessment instruments that enable reliable diagnoses in persons with hearing loss who have sign language as their primary language. The objective of this study was to assess the validity of the Norwegian Sign Language (NSL) version of the Mini International Neuropsychiatric Interview (MINI). The MINI was translated into NSL. Forty-one signing patients consecutively referred to two specialised psychiatric units were assessed with a diagnostic interview by clinical experts and with the MINI. Inter-rater reliability was assessed with Cohen's kappa and "observed agreement". There was 65% agreement between MINI diagnoses and clinical expert diagnoses. Kappa values indicated fair to moderate agreement, and observed agreement was above 76% for all diagnoses. The MINI diagnosed more co-morbid conditions than did the clinical expert interview (mean diagnoses: 1.9 versus 1.2). Kappa values indicated moderate to substantial agreement, and "observed agreement" was above 88%. The NSL version performs similarly to other MINI versions and demonstrates adequate reliability and validity as a diagnostic instrument for assessing mental disorders in persons who have sign language as their primary and preferred language.

  12. Cerebral organization of oral and signed language responses: case study evidence from amytal and cortical stimulation studies.

    PubMed

    Mateer, C A; Rapport, R L; Kettrick, C

    1984-01-01

    A normally hearing left-handed patient familiar with American Sign Language (ASL) was assessed under sodium amytal conditions and with left cortical stimulation in both oral speech and signed English. Lateralization was mixed but complementary in each language mode: the right hemisphere perfusion severely disrupted motoric aspects of both types of language expression, the left hemisphere perfusion specifically disrupted features of grammatical and semantic usage in each mode of expression. Both semantic and syntactic aspects of oral and signed responses were altered during left posterior temporal-parietal stimulation. Findings are discussed in terms of the neurological organization of ASL and linguistic organization in cases of early left hemisphere damage.

  13. Hierarchically Structured Non-Intrusive Sign Language Recognition. Chapter 2

    NASA Technical Reports Server (NTRS)

    Zieren, Jorg; Kraiss, Karl-Friedrich

    2007-01-01

    This work presents a hierarchically structured approach to the non-intrusive recognition of sign language from a monocular frontal view. Robustness is achieved through sophisticated localization and tracking methods, including a combined EM/CAMSHIFT overlap-resolution procedure and the parallel pursuit of multiple hypotheses about hand position and movement, which allows ambiguities to be handled and tracking errors to be corrected automatically. High-level knowledge is represented by a biomechanical skeleton model and dynamic motion prediction using Kalman filters. Classification is performed by Hidden Markov Models. 152 signs from German Sign Language were recognized with an accuracy of 97.6%.
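    The pipeline in this record combines Kalman-filter motion prediction with HMM classification. As a minimal sketch of just the predict/update cycle, the following one-dimensional constant-velocity Kalman filter tracks a single coordinate of a hand; the scalar simplification and the noise parameters (`q`, `r`) are assumptions for illustration, not the authors' implementation.

    ```python
    # Minimal 1-D constant-velocity Kalman filter (pure stdlib).
    # State x = [position, velocity]; covariance P is a 2x2 list of lists.
    def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
        """Filter noisy 1-D position measurements; returns smoothed positions."""
        x = [measurements[0], 0.0]          # initialize at first measurement
        P = [[1.0, 0.0], [0.0, 1.0]]
        out = []
        for z in measurements:
            # Predict: x' = F x with F = [[1, dt], [0, 1]]
            xp = [x[0] + dt * x[1], x[1]]
            # P' = F P F^T + Q, with Q = q * I as a crude process-noise model
            p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
            p01 = P[0][1] + dt * P[1][1]
            p10 = P[1][0] + dt * P[1][1]
            p11 = P[1][1] + q
            # Update with measurement z (observation H = [1, 0])
            s = p00 + r                     # innovation covariance
            k0, k1 = p00 / s, p10 / s       # Kalman gain
            y = z - xp[0]                   # innovation
            x = [xp[0] + k0 * y, xp[1] + k1 * y]
            P = [[(1 - k0) * p00, (1 - k0) * p01],
                 [p10 - k1 * p00, p11 - k1 * p01]]
            out.append(x[0])
        return out
    ```

    Fed a hand coordinate moving at constant velocity, the filter locks on within a few frames; in the paper's setting, the predicted state is what lets the tracker arbitrate between competing hand hypotheses.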

  14. The history of sign language and deaf education in Turkey.

    PubMed

    Kemaloğlu, Yusuf Kemal; Kemaloğlu, Pınar Yaprak

    2012-01-01

    Sign language is the natural language of prelingually deaf people, particularly those without hearing-speech rehabilitation. Otorhinolaryngologists, while defining health as complete physical, mental and psychosocial well-being, aim at hearing by diagnosing deafness as a deviance from normality; yet an approach that neglects the mental and social well-being of the individual contradicts that very definition. This article investigates, through statistical data, scientific publications and historical documents, the effects on the Turkish population of a hearing-speech target that ignores sign language, examines its consistency with history, and supports a critical perspective on the issue. The data show that at most 50% of deaf people benefited from hearing-speech programs during the 60 years before hearing screening programs were introduced, yet education systems that included sign language were never established. In the light of these data, it is clear that an approach that ignores sign language, particularly before the development of screening programs, is not reasonable. Moreover, considering that sign language has been part of Anatolian history from the Hittites to the Ottomans, it remains to be answered why evaluation, habilitation and education systems that exclude sign language are still the only choice for deaf individuals in Turkey. Despite legislative amendments in the last 6-7 years, their failure to come into force probably stems from inadequate understanding of the content and importance of the issue, as well as limited effort by academics and politicians to offer solutions. Within this context, this paper offers a review for medical staff, particularly otorhinolaryngologists and audiologists, in the hope of making a positive impact on this issue.

  15. Language Justice for Sign Language Peoples: The UN Convention on the Rights of Persons with Disabilities

    ERIC Educational Resources Information Center

    Batterbury, Sarah C. E.

    2012-01-01

    Sign Language Peoples (SLPs) across the world have developed their own languages and visuo-gestural-tactile cultures embodying their collective sense of Deafhood (Ladd 2003). Despite this, most nation-states treat their respective SLPs as disabled individuals, favoring disability benefits, cochlear implants, and mainstream education over language…

  16. Visual Sonority Modulates Infants' Attraction to Sign Language

    ERIC Educational Resources Information Center

    Stone, Adam; Petitto, Laura-Ann; Bosworth, Rain

    2018-01-01

    The infant brain may be predisposed to identify perceptually salient cues that are common to both signed and spoken languages. Recent theory based on spoken languages has advanced sonority as one of these potential language acquisition cues. Using a preferential looking paradigm with an infrared eye tracker, we explored visual attention of hearing…

  17. Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production

    PubMed Central

    Gutierrez-Sigut, Eva; Daws, Richard; Payne, Heather; Blott, Jonathan; Marshall, Chloë; MacSweeney, Mairéad

    2016-01-01

    Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL). Participants exhibited stronger lateralization during BSL than English production. In Experiment 2 we investigated whether this increased lateralization index could be due exclusively to the higher motoric demands of sign production. Sign naïve participants performed a phonological fluency task in English and a non-sign repetition task. Participants were left lateralized in the phonological fluency task but there was no consistent pattern of lateralization for the non-sign repetition in these hearing non-signers. The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands. PMID:26605960

  18. The Use of Sign Language Pronouns by Native-Signing Children with Autism

    ERIC Educational Resources Information Center

    Shield, Aaron; Meier, Richard P.; Tager-Flusberg, Helen

    2015-01-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are…

  19. The role of language skills and internationalization in nursing degree programmes: A literature review.

    PubMed

    Garone, Anja; Van de Craen, Piet

    2017-02-01

    Globalization and internationalization have had major influences on higher education, including nursing education. Since the signing of the Bologna declaration, many institutions in Europe have adopted English as the "scientific lingua franca" and have instituted courses and entire degree programmes taught in English. Several countries in the European Union also offer nursing degree programmes in English. With the rise of multilingualism in Europe, new challenges have become apparent in multilingual education. The Content and Language Integrated Learning (CLIL) approach has emerged as a new, innovative way to learn languages. The approach has become mainstream in primary and secondary education with proven success, and has also spread to higher education. Nurses are required to develop their linguistic skills such that they can communicate well with their patients and colleagues. Due to globalization, nurses are faced with increasingly diverse patients, presenting new challenges in nursing education concerning linguistic and transcultural preparation of students. Although CLIL is becoming more widely accepted in many academic faculties, it has not yet been studied sufficiently in the nursing education context. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Signing Science! Andy And Tonya Are Just Like Me! They Wear Hearing Aids And Know My Language!?

    ERIC Educational Resources Information Center

    Vesel, Judy

    2005-01-01

    Are these students talking about their classmates? No, they are describing the Signing Avatar characters--3-D figures who appear on the EnViSci Network Web site and sign the resources and activities in American Sign Language (ASL) or Signed English (SE). During the 2003-04 school year, students in schools for the deaf and hard of hearing…

  1. Signs in Which Handshape and Hand Orientation Are either Not Visible or Are Only Partially Visible: What Is the Consequence for Lexical Recognition?

    ERIC Educational Resources Information Center

    ten Holt, G. A.; van Doorn, A. J.; de Ridder, H.; Reinders, M. J. T.; Hendriks, E. A.

    2009-01-01

    We present the results of an experiment on lexical recognition of human sign language signs in which the available perceptual information about handshape and hand orientation was manipulated. Stimuli were videos of signs from Sign Language of the Netherlands (SLN). The videos were processed to create four conditions: (1) one in which neither…

  2. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder

    PubMed Central

    Shield, Aaron; Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose: We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method: Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Results: Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Conclusions: Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD. PMID:28586822

  3. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder.

    PubMed

    Shield, Aaron; Cooley, Frances; Meier, Richard P

    2017-06-10

    We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD.

  4. Aphasia in a prelingually deaf woman.

    PubMed

    Chiarello, C; Knight, R; Mandel, M

    1982-03-01

    A left parietal infarct in a prelingually deaf person resulted in an aphasia for both American Sign Language (ASL) and written and finger-spelled English. Originally the patient had a nearly global aphasia affecting all language systems. By five to seven weeks post-onset her symptoms resembled those of hearing aphasics with posterior lesions: fluent but paraphasic signing, anomia, impaired comprehension and repetition, alexia, and agraphia with elements of neologistic jargon. In addition, there was a pronounced sequential movement copying disorder, reduced short-term verbal memory and acalculia. In general, the patient's sign errors showed a consistent disruption in the structure of ASL signs which parallels the speech errors of oral aphasic patients. We conclude that most aphasic symptoms are not modality-dependent, but rather reflect a disruption of linguistic processes common to all human languages. This case confirms the importance of the left hemisphere in the processing of sign language. Furthermore, the results indicate that the left supramarginal and angular gyri are necessary substrates for the comprehension of visual/gestural languages.

  5. Sign Language Acquisition and Use by Single-Generation Deaf Adults in Australia Who Attended Specific Educational Settings for Deaf and Hard of Hearing Children

    ERIC Educational Resources Information Center

    Winn, Stephen

    2007-01-01

    This article examines the acquisition and use of Australian Sign Language (Auslan) by 53 profoundly deaf adults (31 male, 22 female) who attended educational units for deaf and hard of hearing children. The results indicate that, regardless of age, the acquisition of sign language, particularly Auslan, by deaf people occurred primarily through…

  6. Content Questions In American Sign Language: An RRG Analysis

    DTIC Science & Technology

    2004-12-08

    a temporal framework, someone might sign (29) DURING FIVE YEAR YONDER GALLAUDET … During my five years at Gallaudet …. Until a new topic is...Language: A Teacher’s Resource on Grammar and Culture. Washington, D.C.: Gallaudet University Press. BATTISON, ROBBIN. 1978. Loan Signs from...Typology and Syntactic Description, ed. by Timothy Shopen. Cambridge, MA: Cambridge University Press. —. In press b. Clause Types. Language Typology

  7. American Sign Language Syntactic and Narrative Comprehension in Skilled and Less Skilled Readers: Bilingual and Bimodal Evidence for the Linguistic Basis of Reading

    ERIC Educational Resources Information Center

    Chamberlain, Charlene; Mayberry, Rachel I.

    2008-01-01

    We tested the hypothesis that syntactic and narrative comprehension of a natural sign language can serve as the linguistic basis for skilled reading. Thirty-one adults who were deaf from birth and used American Sign Language (ASL) were classified as skilled or less skilled readers using an eighth-grade criterion. Proficiency with ASL syntax, and…

  8. Kinematic Parameters of Signed Verbs

    ERIC Educational Resources Information Center

    Malaia, Evie; Wilbur, Ronnie B.; Milkovic, Marina

    2013-01-01

    Purpose: Sign language users recruit physical properties of visual motion to convey linguistic information. Research on American Sign Language (ASL) indicates that signers systematically use kinematic features (e.g., velocity, deceleration) of dominant hand motion for distinguishing specific semantic properties of verb classes in production…

  9. Language and literacy development of deaf and hard-of-hearing children: successes and challenges.

    PubMed

    Lederberg, Amy R; Schick, Brenda; Spencer, Patricia E

    2013-01-01

    Childhood hearing loss presents challenges to language development, especially spoken language. In this article, we review existing literature on deaf and hard-of-hearing (DHH) children's patterns and trajectories of language as well as development of theory of mind and literacy. Individual trajectories vary significantly, reflecting access to early identification/intervention, advanced technologies (e.g., cochlear implants), and perceptually accessible language models. DHH children develop sign language in a similar manner as hearing children develop spoken language, provided they are in a language-rich environment. This occurs naturally for DHH children of deaf parents, who constitute 5% of the deaf population. For DHH children of hearing parents, sign language development depends on the age that they are exposed to a perceptually accessible 1st language as well as the richness of input. Most DHH children are born to hearing families who have spoken language as a goal, and such development is now feasible for many children. Some DHH children develop spoken language in bilingual (sign-spoken language) contexts. For the majority of DHH children, spoken language development occurs in either auditory-only contexts or with sign supports. Although developmental trajectories of DHH children with hearing parents have improved with early identification and appropriate interventions, the majority of children are still delayed compared with hearing children. These DHH children show particular weaknesses in the development of grammar. Language deficits and differences have cascading effects in language-related areas of development, such as theory of mind and literacy development.

  10. Language-learning impairments: a 30-year follow-up of language-impaired children with and without psychiatric, neurological and cognitive difficulties.

    PubMed

    Elbro, Carsten; Dalby, Mogens; Maarbjerg, Stine

    2011-01-01

    This study investigated the long-term consequences of language impairments for academic, educational and socio-economic outcomes. It also assessed the unique contributions of childhood measures of speech and language, non-verbal IQ, and psychiatric and neurological problems. The study was a 30-year follow-up of 198 participants originally diagnosed with language impairments at 3-9 years. Childhood diagnoses were based on language and cognitive abilities, social maturity, motor development, and psychiatric and neurological signs. At follow-up the participants responded to a questionnaire about literacy, education, employment, economic independence and family status. The response rate was 42% (198/470). At follow-up a majority of the participants reported literacy difficulties, unemployment and low socio-economic status, at rates significantly higher than in the general population. Participants diagnosed as children with specific language impairments had significantly better outcomes than those with additional diagnoses, even when non-verbal IQ was normal or statistically controlled. Childhood measures accounted for up to 52% of the variance in adult outcomes. Psychiatric and neurological comorbidity is relevant for adult outcomes of language impairments even when non-verbal IQ is normal. © 2011 Royal College of Speech & Language Therapists.

  11. Graph theoretical analysis of functional network for comprehension of sign language.

    PubMed

    Liu, Lanfang; Yan, Xin; Liu, Jin; Xia, Mingrui; Lu, Chunming; Emmorey, Karen; Chu, Mingyuan; Ding, Guosheng

    2017-09-15

    Signed languages are natural human languages using the visual-motor modality. Previous neuroimaging studies based on univariate activation analysis show that a widely overlapping cortical network is recruited regardless of whether the sign language is comprehended (for signers) or not (for non-signers). Here we move beyond previous studies by examining whether the functional connectivity profiles and the underlying organizational structure of the overlapping neural network differ between signers and non-signers when watching sign language. Using graph theoretical analysis (GTA) and fMRI, we compared the large-scale functional network organization of hearing signers and non-signers during the observation of sentences in Chinese Sign Language. We found that signed sentences elicited highly similar cortical activations in the two groups of participants, with slightly larger responses within the left frontal and left temporal gyrus in signers than in non-signers. Crucially, further GTA revealed substantial group differences in the topologies of this activation network. Globally, the network engaged by signers showed higher local efficiency (t(24) = 2.379, p = 0.026), small-worldness (t(24) = 2.604, p = 0.016) and modularity (t(24) = 3.513, p = 0.002), and exhibited different modular structures, compared to the network engaged by non-signers. Locally, the left ventral pars opercularis served as a network hub in the signer group but not in the non-signer group. These findings suggest that, despite overlap in cortical activation, the neural substrates underlying sign language comprehension are distinguishable at the network level from those for the processing of gestural action. Copyright © 2017 Elsevier B.V. All rights reserved.
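    The global metrics compared in this abstract are standard graph measures. A minimal sketch of two of them (local efficiency and modularity of a greedy partition) using NetworkX on a toy graph; the authors' actual pipeline builds networks from fMRI-derived connectivity and also tests small-worldness, none of which is reproduced here:

```python
import networkx as nx
from networkx.algorithms import community

def global_topology(G):
    """Local efficiency and modularity of a greedy community partition.
    (Small-worldness can be estimated with nx.sigma, but it is costly
    because it builds many rewired random reference graphs.)"""
    eff = nx.local_efficiency(G)
    parts = community.greedy_modularity_communities(G)
    Q = community.modularity(G, parts)
    return eff, Q

# Toy stand-in for a thresholded connectivity network:
# two densely connected modules joined by a single edge.
G = nx.barbell_graph(6, 0)
eff, Q = global_topology(G)
```

A highly modular toy graph like this yields a large modularity Q; group comparisons as in the study would then be t-tests over per-participant metric values.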

  12. Longitudinal Risk and Resilience Factors Predicting Psychiatric Disruption, Mental Health Service Utilization & Military Retention in OIF National Guard Troops

    DTIC Science & Technology

    2008-04-01

    learning disabilities and/or emotional difficulties • Wrote integrated assessment reports • Attended Individual Education Plan (IEP) meetings where...worked with a wide range of disabilities from autism and cerebral palsy to oppositional defiant disorder and bipolar disorder; developed leadership...therapy program for a nine-year-old boy with autism to foster the development of social skills; utilized sign language to increase communication

  13. The Impact of Input Quality on Early Sign Development in Native and Non-Native Language Learners

    ERIC Educational Resources Information Center

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-01-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the…

  14. Identifying Specific Language Impairment in Deaf Children Acquiring British Sign Language: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Mason, Kathryn; Rowley, Katherine; Marshall, Chloe R.; Atkinson, Joanna R.; Herman, Rosalind; Woll, Bencie; Morgan, Gary

    2010-01-01

    This paper presents the first ever group study of specific language impairment (SLI) in users of sign language. A group of 50 children were referred to the study by teachers and speech and language therapists. Individuals who fitted pre-determined criteria for SLI were then systematically assessed. Here, we describe in detail the performance of 13…

  15. Extricating Manual and Non-Manual Features for Subunit Level Medical Sign Modelling in Automatic Sign Language Classification and Recognition.

    PubMed

    R, Elakkiya; K, Selvamani

    2017-09-22

    Subunit segmentation and modelling in medical sign language is one of the important problems in linguistically oriented and vision-based Sign Language Recognition (SLR). Many previous efforts focused on functional subunits from the perspective of linguistic syllables, but implementing such syllable-based subunit extraction is not feasible with real-world computer vision techniques. Moreover, present recognition systems are designed to detect signer-dependent actions only under restricted, laboratory conditions. This paper addresses two important issues in visual sign language recognition: (1) subunit extraction and (2) signer independence. Subunit extraction involves the sequential and parallel breakdown of sign gestures without any prior knowledge of syllables or the number of subunits. A novel Bayesian Parallel Hidden Markov Model (BPaHMM) is introduced for subunit extraction, combining manual and non-manual feature parameters to yield better results in classification and recognition of signs. Signer independence is addressed by using a single web camera to capture different signers' behaviour patterns and by cross-signer validation. Experimental results show that the proposed signer-independent, subunit-level modelling for sign language classification and recognition improves on other existing work.

  16. Segmentation of British Sign Language (BSL): mind the gap!

    PubMed

    Orfanidou, Eleni; McQueen, James M; Adam, Robert; Morgan, Gary

    2015-01-01

    This study asks how users of British Sign Language (BSL) recognize individual signs in connected sign sequences. We examined whether this is achieved through modality-specific or modality-general segmentation procedures. A modality-specific feature of signed languages is that, during continuous signing, there are salient transitions between sign locations. We used the sign-spotting task to ask if and how BSL signers use these transitions in segmentation. A total of 96 real BSL signs were preceded by nonsense signs which were produced in either the target location or another location (with a small or large transition). Half of the transitions were within the same major body area (e.g., head) and half were across body areas (e.g., chest to hand). Deaf adult BSL users (a group of natives and early learners, and a group of late learners) spotted target signs best when there was a minimal transition and worst when there was a large transition. When location changes were present, both groups performed better when transitions were to a different body area than when they were within the same area. These findings suggest that transitions do not provide explicit sign-boundary cues in a modality-specific fashion. Instead, we argue that smaller transitions help recognition in a modality-general way by limiting lexical search to signs within location neighbourhoods, and that transitions across body areas also aid segmentation in a modality-general way, by providing a phonotactic cue to a sign boundary. We propose that sign segmentation is based on modality-general procedures which are core language-processing mechanisms.

  17. Evaluation Study of Pedagogical Methods and E-Learning Material via Web 2.0 for Hearing Impaired People

    NASA Astrophysics Data System (ADS)

    Vrettaros, John; Argiri, Katerina; Stavrou, Pilios; Hrissagis, Kostas; Drigas, Athanasios

    The primary goal of this paper is to study whether Web 2.0 tools such as blogs, wikis, social networks and typical hypermedia, as well as techniques such as lip-reading, sign language video and learning activities, are appropriate for learning purposes for deaf and hard-of-hearing people. In order to check the extent to which the choices mentioned above are compatible with the features of this specific group and maximize learning results, we designed an empirical study, presented below. The study was conducted in the context of SYNERGIA, a Leonardo da Vinci project of the Lifelong Learning Programme (Multilateral Projects, Transfer of Innovation). The evaluation was conducted on data gathered through questionnaire analysis.

  18. Classroom Interpreting and Visual Information Processing in Mainstream Education for Deaf Students: Live or Memorex®?

    PubMed Central

    Marschark, Marc; Pelz, Jeff B.; Convertino, Carol; Sapere, Patricia; Arndt, Mary Ellen; Seewagen, Rosemarie

    2006-01-01

    This study examined visual information processing and learning in classrooms including both deaf and hearing students. Of particular interest were the effects on deaf students’ learning of live (three-dimensional) versus video-recorded (two-dimensional) sign language interpreting and the visual attention strategies of more and less experienced deaf signers exposed to simultaneous, multiple sources of visual information. Results from three experiments consistently indicated no differences in learning between three-dimensional and two-dimensional presentations among hearing or deaf students. Analyses of students’ allocation of visual attention and the influence of various demographic and experimental variables suggested considerable flexibility in deaf students’ receptive communication skills. Nevertheless, the findings also revealed a robust advantage in learning in favor of hearing students. PMID:16628250

  19. Kinematic parameters of signed verbs.

    PubMed

    Malaia, Evie; Wilbur, Ronnie B; Milkovic, Marina

    2013-10-01

    Sign language users recruit physical properties of visual motion to convey linguistic information. Research on American Sign Language (ASL) indicates that signers systematically use kinematic features (e.g., velocity, deceleration) of dominant hand motion to distinguish specific semantic properties of verb classes in production (Malaia & Wilbur, 2012a) and process these distinctions as part of the phonological structure of these verb classes in comprehension (Malaia, Ranaweera, Wilbur, & Talavage, 2012). These studies are driven by the event visibility hypothesis of Wilbur (2003), who proposed that such use of kinematic features should be universal across sign languages (SLs), owing to the grammaticalization of physics and geometry for linguistic purposes. In a prior motion capture study, Malaia and Wilbur (2012a) lent support to the event visibility hypothesis in ASL, but there have been no quantitative data from other SLs to test the generalization to other languages. The authors investigated the kinematic parameters of predicates in Croatian Sign Language (Hrvatskom Znakovnom Jeziku [HZJ]). Kinematic features of verb signs were affected both by event structure of the predicate (semantics) and phrase position within the sentence (prosody). The data demonstrate that kinematic features of motion in HZJ verb signs are recruited to convey morphological and prosodic information. This is the first crosslinguistic motion capture confirmation that specific kinematic properties of articulator motion are grammaticalized in other SLs to express linguistic features.
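    Kinematic features of the kind this abstract discusses (velocity, deceleration) are simple finite-difference quantities over sampled 3-D trajectories. A minimal, hypothetical sketch, assuming millimetre positions at a fixed sampling rate; this is not the authors' actual motion-capture extraction pipeline:

```python
import numpy as np

def kinematic_features(traj, fs):
    """traj: (N, 3) array of marker positions (assumed mm); fs: sampling rate (Hz).
    Returns peak speed (mm/s), peak deceleration (mm/s^2), and path length (mm)."""
    dt = 1.0 / fs
    vel = np.diff(traj, axis=0) / dt          # per-sample velocity vectors
    speed = np.linalg.norm(vel, axis=1)       # scalar speed profile
    accel = np.diff(speed) / dt               # rate of change of speed
    peak_decel = -accel.min() if accel.min() < 0 else 0.0
    path_len = (speed * dt).sum()
    return speed.max(), peak_decel, path_len

# Example: a movement at constant speed along x that stops abruptly.
traj = np.zeros((101, 3))
traj[:, 0] = np.concatenate([np.arange(51.0), np.full(50, 50.0)])
smax, dmax, plen = kinematic_features(traj, fs=100)
```

The per-verb-class comparisons in the study would then be statistics over such features, computed sign by sign.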

  20. Dissociating linguistic and non-linguistic gesture processing: electrophysiological evidence from American Sign Language.

    PubMed

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-04-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a "frame" (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a "last item" belonging to one of four categories: a high-cloze-probability sign (a "semantically reasonable" completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a "semantically odd" completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Generation of Signs Within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language.

    PubMed

    Beal-Alvarez, Jennifer S; Figueroa, Daileen M

    2017-04-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks, including animals, foods, and specific handshapes. We investigated the effects of chronological age, age of sign language acquisition/years at school site, gender, presence of a disability, and geographical location (i.e., USA and Puerto Rico) on participants' performance and relations among tasks. In general, the phonological task appeared more difficult than the semantic tasks; students generated more animals than foods; age and semantic performance correlated for the larger sample of U.S. students; and geographical variation included use of fingerspelling and specific signs. Compared to their peers, deaf students with disabilities generated fewer semantic items. These results provide an initial snapshot of students' semantic and phonological sign generation. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Dissociating linguistic and non-linguistic gesture processing: Electrophysiological evidence from American Sign Language

    PubMed Central

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-01-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a “frame” (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a “last item” belonging to one of four categories: a high-cloze-probability sign (a “semantically reasonable” completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a “semantically odd” completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. PMID:22341555

  3. Sign Language Studies with Chimpanzees and Children.

    ERIC Educational Resources Information Center

    Van Cantfort, Thomas E.; Rimpau, James B.

    1982-01-01

    Reviews methodologies of sign language studies with chimpanzees and compares major findings of those studies with studies of human children. Considers relevance of input conditions for language acquisition, evidence used to demonstrate linguistic achievements, and application of rigorous testing procedures in developmental psycholinguistics.…

  4. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  5. Language Policies in Uruguay and Uruguayan Sign Language (LSU)

    ERIC Educational Resources Information Center

    Behares, Luis Ernesto; Brovetto, Claudia; Crespi, Leonardo Peluso

    2012-01-01

    In the first part of this article the authors consider the policies that apply to Uruguayan Sign Language (Lengua de Senas Uruguaya; hereafter LSU) and the Uruguayan Deaf community within the general framework of language policies in Uruguay. By analyzing them succinctly and as a whole, the authors then explain twenty-first-century innovations.…

  6. How Deaf American Sign Language/English Bilingual Children Become Proficient Readers: An Emic Perspective

    ERIC Educational Resources Information Center

    Mounty, Judith L.; Pucci, Concetta T.; Harmon, Kristen C.

    2014-01-01

    A primary tenet underlying American Sign Language/English bilingual education for deaf students is that early access to a visual language, developed in conjunction with language planning principles, provides a foundation for literacy in English. The goal of this study is to obtain an emic perspective on bilingual deaf readers transitioning from…

  7. The Language Development of a Deaf Child with a Cochlear Implant

    ERIC Educational Resources Information Center

    Mouvet, Kimberley; Matthijs, Liesbeth; Loots, Gerrit; Taverniers, Miriam; Van Herreweghe, Mieke

    2013-01-01

    Hearing parents of deaf or partially deaf infants are confronted with the complex question of communication with their child. This question is complicated further by conflicting advice on how to address the child: in spoken language only, in spoken language supported by signs, or in signed language. This paper studies the linguistic environment…

  8. Second Language Acquisition across Modalities: Production Variability in Adult L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Hilger, Allison I.; Loucks, Torrey M. J.; Quinto-Pozos, David; Dye, Matthew W. G.

    2015-01-01

    A study was conducted to examine production variability in American Sign Language (ASL) in order to gain insight into the development of motor control in a language produced in another modality. Production variability was characterized through the spatiotemporal index (STI), which represents production stability in whole utterances and is a…

  9. Observations on Word Order in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Sprenger, Kristen; Mathur, Gaurav

    2012-01-01

    This article focuses on the syntactic level of the grammar of Saudi Arabian Sign Language by exploring some word orders that occur in personal narratives in the language. Word order is one of the main ways in which languages indicate the main syntactic roles of subjects, verbs, and objects; others are verbal agreement and nominal case morphology.…

  10. Deaf, Hard-of-Hearing, and Hearing Signing Undergraduates' Attitudes toward Science in Inquiry-Based Biology Laboratory Classes.

    PubMed

    Gormally, Cara

    2017-01-01

    For science learning to be successful, students must develop attitudes that support future engagement with challenging social issues related to science. This is especially important for increasing participation of students from underrepresented populations. This study investigated how participation in inquiry-based biology laboratory classes affected students' attitudes toward science, focusing on deaf, hard-of-hearing, and hearing signing students in bilingual learning environments (i.e., taught in American Sign Language and English). Analysis of reflection assignments and interviews revealed that the majority of students developed positive attitudes toward science and scientific attitudes after participating in inquiry-based biology laboratory classes. Attitudinal growth appears to be driven by student value of laboratory activities, repeated direct engagement with scientific inquiry, and peer collaboration. Students perceived that hands-on experimentation involving peer collaboration and a positive, welcoming learning environment were key features of inquiry-based laboratories, affording attitudinal growth. Students who did not perceive biology as useful for their majors, careers, or lives did not develop positive attitudes. Students highlighted the importance of the climate of the learning environment for encouraging student contribution and noted both the benefits and pitfalls of teamwork. Informed by students' characterizations of their learning experiences, recommendations are made for inquiry-based learning in college biology. © 2017 C. Gormally. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  11. Sign language processing and the mirror neuron system.

    PubMed

    Corina, David P; Knapp, Heather

    2006-05-01

    In this paper we review evidence for frontal and parietal lobe involvement in sign language comprehension and production, and evaluate the extent to which these data can be interpreted within the context of a mirror neuron system for human action observation and execution. We present data from three literatures: aphasia, cortical stimulation, and functional neuroimaging. Generally, we find support for the idea that sign language comprehension and production can be viewed in the context of a broadly construed frontal-parietal human action observation/execution system. However, sign language data cannot be fully accounted for under a strict interpretation of the mirror neuron system. Additionally, we raise a number of issues concerning the lack of specificity in current accounts of the human action observation/execution system.

  12. Static hand gesture recognition from a video

    NASA Astrophysics Data System (ADS)

    Rokade, Rajeshree S.; Doye, Dharmpal

    2011-10-01

    A sign language (also signed language) is a language which, instead of acoustically conveyed sound patterns, uses visually transmitted sign patterns to convey meaning, "simultaneously combining hand shapes, orientation and movement of the hands". Sign languages commonly develop in deaf communities, which can include interpreters, friends and families of deaf people as well as people who are deaf or hard of hearing themselves. In this paper, we propose a novel system for recognition of static hand gestures from a video, based on a Kohonen neural network. We propose an algorithm to separate out key frames, which contain correct gestures, from a video sequence. We segment hand images from complex and non-uniform backgrounds. Features are extracted by applying the Kohonen network to key frames, and recognition is performed.
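    The recognizer described here is built on a Kohonen network (self-organizing map). Below is a minimal SOM trainer in NumPy with an illustrative grid size, learning schedule, and synthetic feature data; the paper's key-frame selection, hand segmentation, and actual feature vectors are not reproduced:

```python
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 2-D Kohonen self-organizing map.
    Each iteration pulls the best-matching unit (BMU) and its grid
    neighbours toward a randomly chosen input vector, with learning
    rate and neighbourhood radius decaying linearly."""
    rng = np.random.default_rng(seed)
    h, w = grid
    W = rng.random((h * w, data.shape[1]))               # unit weight vectors
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))       # best matching unit
        frac = t / iters
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 1e-3
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        nb = np.exp(-dist2 / (2 * sigma ** 2))            # neighbourhood weights
        W += lr * nb[:, None] * (x - W)
    return W

# Synthetic stand-in for gesture feature vectors: two tight clusters.
rng = np.random.default_rng(1)
data = np.vstack([0.1 + 0.05 * rng.standard_normal((60, 4)),
                  0.9 + 0.05 * rng.standard_normal((60, 4))])
W = train_som(data)
# Mean quantization error: average distance from each input to its BMU.
qe = np.mean([np.min(np.linalg.norm(W - x, axis=1)) for x in data])
```

Recognition then amounts to mapping each key frame's feature vector to its BMU and labelling BMUs by gesture class.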

  13. Deaf-And-Mute Sign Language Generation System

    NASA Astrophysics Data System (ADS)

    Kawai, Hideo; Tamura, Shinichi

    1984-08-01

    We have developed a system which can recognize speech and generate the corresponding animation-like sign language sequence. The system is implemented on a popular personal computer. It has three video-RAMs and a voice recognition board which can recognize only the registered voice of a specific speaker. Presently, forty sign language patterns and fifty finger spellings are stored on two floppy disks. Each sign pattern is composed of one to four sub-patterns. That is, if the pattern is composed of one sub-pattern, it is displayed as a still pattern. If not, it is displayed as a motion pattern. This system will help communication between deaf people and hearing people. In order to display at high speed, most programs are written in machine language.

  14. Language abnormality in deaf people with schizophrenia: a problem with classifiers.

    PubMed

    Chatzidamianos, G; McCarthy, R A; Du Feu, M; Rosselló, J; McKenna, P J

    2018-06-05

    Although there is evidence for language abnormality in schizophrenia, few studies have examined sign language in deaf patients with the disorder. This is of potential interest because a hallmark of sign languages is their use of classifiers (semantic or entity classifiers), a reference-tracking device with few if any parallels in spoken languages. This study aimed to examine classifier production and comprehension in deaf signing adults with schizophrenia. Fourteen profoundly deaf signing adults with schizophrenia and 35 age- and IQ-matched deaf healthy controls completed a battery of tests assessing classifier and noun comprehension and production. The patients showed poorer performance than the healthy controls on comprehension and production of both nouns and entity classifiers, with the deficit being most marked in the production of classifiers. Classifier production errors affected handshape rather than other parameters such as movement and location. The findings suggest that schizophrenia affects language production in deaf patients with schizophrenia in a unique way not seen in hearing patients.

  15. Neural organization of linguistic short-term memory is sensory modality-dependent: evidence from signed and spoken language.

    PubMed

    Pa, Judy; Wilson, Stephen M; Pickell, Herbert; Bellugi, Ursula; Hickok, Gregory

    2008-12-01

    Despite decades of research, there is still disagreement regarding the nature of the information that is maintained in linguistic short-term memory (STM). Some authors argue for abstract phonological codes, whereas others argue for more general sensory traces. We assess these possibilities by investigating linguistic STM in two distinct sensory-motor modalities, spoken and signed language. Hearing bilingual participants (native in English and American Sign Language) performed equivalent STM tasks in both languages during functional magnetic resonance imaging. Distinct, sensory-specific activations were seen during the maintenance phase of the task for spoken versus signed language. These regions have been previously shown to respond to nonlinguistic sensory stimulation, suggesting that linguistic STM tasks recruit sensory-specific networks. However, maintenance-phase activations common to the two languages were also observed, implying some form of common process. We conclude that linguistic STM involves sensory-dependent neural networks, but suggest that sensory-independent neural networks may also exist.

  16. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic Mexican Sign Language and digit recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera, with four LED reflectors and a green background, in order to reduce computational cost and avoid the use of special gloves. 42 normalized central moments are computed per frame and fed to a multi-layer perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican Sign Language and digits, respectively.
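    Normalized central moments, the shape descriptor named in this abstract, are a standard translation- and scale-invariant feature: eta_pq = mu_pq / mu_00^(1 + (p+q)/2). A minimal sketch computing the order-2 and order-3 set on a binary mask; the example mask is illustrative only, and how the paper arrives at 42 features per frame is not specified here:

```python
import numpy as np

def normalized_central_moments(img, max_order=3):
    """Normalized central moments eta_pq = mu_pq / mu_00**(1 + (p+q)/2)
    for orders 2..max_order, invariant to translation and scale."""
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xbar = (xs * img).sum() / m00        # centroid coordinates
    ybar = (ys * img).sum() / m00
    feats = {}
    for p in range(max_order + 1):
        for q in range(max_order + 1):
            if p + q < 2 or p + q > max_order:
                continue
            mu = (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()
            feats[(p, q)] = mu / m00 ** (1 + (p + q) / 2)
    return feats

# Illustrative binary "hand mask": a filled rectangle.
img = np.zeros((20, 20))
img[5:15, 4:16] = 1.0
feats = normalized_central_moments(img)
```

Because the moments are normalized by the centroid and by mu_00, the feature vector barely changes when the mask is shifted or uniformly rescaled, which is what makes it usable as classifier input.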

  17. Examining the contribution of motor movement and language dominance to increased left lateralization during sign generation in native signers.

    PubMed

    Gutierrez-Sigut, Eva; Payne, Heather; MacSweeney, Mairéad

    2016-08-01

    The neural systems supporting speech and sign processing are very similar, although not identical. In a previous fTCD study of hearing native signers (Gutierrez-Sigut, Daws, et al., 2015) we found stronger left lateralization for sign than speech. Given that this increased lateralization could not be explained by hand movement alone, the contribution of motor movement versus 'linguistic' processes to the strength of hemispheric lateralization during sign production remains unclear. Here we directly contrast lateralization strength of covert versus overt signing during phonological and semantic fluency tasks. To address the possibility that hearing native signers' elevated lateralization indices (LIs) were due to performing a task in their less dominant language, here we test deaf native signers, whose dominant language is British Sign Language (BSL). Signers were more strongly left lateralized for overt than covert sign generation. However, the strength of lateralization was not correlated with the amount of time producing movements of the right hand. Comparisons with previous data from hearing native English speakers suggest stronger laterality indices for sign than speech in both covert and overt tasks. This increased left lateralization may be driven by specific properties of sign production such as the increased use of self-monitoring mechanisms or the nature of phonological encoding of signs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  18. The Signer and the Sign: Cortical Correlates of Person Identity and Language Processing from Point-Light Displays

    ERIC Educational Resources Information Center

    Campbell, Ruth; Capek, Cheryl M.; Gazarian, Karine; MacSweeney, Mairead; Woll, Bencie; David, Anthony S.; McGuire, Philip K.; Brammer, Michael J.

    2011-01-01

    In this study, the first to explore the cortical correlates of signed language (SL) processing under point-light display conditions, the observer identified either a signer or a lexical sign from a display in which different signers were seen producing a number of different individual signs. Many of the regions activated by point-light under these…

  19. Methodological Note: Analyzing Signs for Recognition & Feature Salience.

    ERIC Educational Resources Information Center

    Shyan, Melissa R.

    1985-01-01

    Presents a method to determine how signs in American Sign Language are recognized by signers. The method uses natural settings and avoids common artificialities found in prior work. A pilot study is described involving language research with Atlantic Bottlenose Dolphins in which the method was successfully used. (SED)

  20. The influence of visual feedback and register changes on sign language production: A kinematic study with deaf signers

    PubMed Central

    EMMOREY, KAREN; GERTSBERG, NELLY; KORPICS, FRANCO; WRIGHT, CHARLES E.

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign Language (ASL) signs within a carrier phrase under five conditions: blindfolded, wearing tunnel-vision goggles, normal (citation) signing, shouting, and informal signing. Three-dimensional movement trajectories were obtained using an Optotrak Certus system. Informally produced signs were shorter with less vertical movement. Shouted signs were displaced forward and to the right and were produced within a larger volume of signing space, with greater velocity, greater distance traveled, and a longer duration. Tunnel vision caused signers to produce less movement within the vertical dimension of signing space, but blind and citation signing did not differ significantly on any measure, except duration. Thus, signers do not “sign louder” when they cannot see themselves, but they do alter their sign production when vision is restricted. We hypothesize that visual feedback serves primarily to fine-tune the size of signing space rather than as input to a comprehension-based monitor. PMID:20046943
