Sample records for sign language machine

  1. SignMT: An Alternative Language Learning Tool

    ERIC Educational Resources Information Center

    Ditcharoen, Nadh; Naruedomkul, Kanlaya; Cercone, Nick

    2010-01-01

    Learning a second language is very difficult, especially for the disabled; the disability may be a barrier to learning and to utilizing information written in text form. We present SignMT, a Thai sign language to Thai machine translation system, which is able to translate Thai sign language into Thai text. In the translation process, SignMT takes into…

  2. Deaf-And-Mute Sign Language Generation System

    NASA Astrophysics Data System (ADS)

    Kawai, Hideo; Tamura, Shinichi

    1984-08-01

    We have developed a system which can recognize speech and generate the corresponding animation-like sign language sequence. The system is implemented on a popular personal computer with three video RAMs and a voice recognition board that can recognize only the registered voice of a specific speaker. At present, forty sign language patterns and fifty finger spellings are stored on two floppy disks. Each sign pattern is composed of one to four sub-patterns: a pattern composed of a single sub-pattern is displayed as a still image, while the others are displayed as motion patterns. This system will help communication between deaf-and-mute persons and hearing persons. To achieve high display speed, most of the programs are written in machine language.

  3. Parametric Representation of the Speaker's Lips for Multimodal Sign Language and Speech Recognition

    NASA Astrophysics Data System (ADS)

    Ryumin, D.; Karpov, A. A.

    2017-05-01

    In this article, we propose a new method for parametric representation of the human lip region. The functional diagram of the method is described, and implementation details, with an explanation of its key stages and features, are given. The results of automatic detection of the regions of interest are illustrated, and the speed of the method on several computers of differing performance is reported. This universal method allows the parametric representation of the speaker's lips to be applied to tasks in biometrics, computer vision, machine learning, and automatic recognition of faces, elements of sign languages, and audio-visual speech, including lip-reading.

  4. Mechanically Compliant Electronic Materials for Wearable Photovoltaics and Human-Machine Interfaces

    NASA Astrophysics Data System (ADS)

    O'Connor, Timothy Francis, III

    Applications of stretchable electronic materials for human-machine interfaces are described herein. Intrinsically stretchable organic conjugated polymers and stretchable electronic composites were used to develop stretchable organic photovoltaics (OPVs), mechanically robust wearable OPVs, and human-machine interfaces for gesture recognition, American Sign Language translation, haptic control of robots, and touch emulation for virtual reality, augmented reality, and the transmission of touch. The stretchable and wearable OPVs comprise active layers of poly-3-alkylthiophene:phenyl-C61-butyric acid methyl ester (P3AT:PCBM) and transparent conductive electrodes of poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS); these devices could only be fabricated through a deep understanding of the connection between molecular structure and the co-engineering of electronic performance with mechanical resilience. The talk concludes with the use of composite piezoresistive sensors in two smart glove prototypes. The first integrates stretchable strain sensors comprising a carbon-elastomer composite, a wearable microcontroller, low-energy Bluetooth, and a 6-axis accelerometer/gyroscope to construct a fully functional gesture recognition glove capable of wirelessly translating American Sign Language to text on a cell phone screen. The second creates a system for the haptic control of a 3D-printed robot arm, as well as the transmission of touch and temperature information.

  5. On the Quantitative Evaluation of the Terminology of a Vocabulary.

    ERIC Educational Resources Information Center

    Kravets, L. G.

    The creation of an industrial system of machine translation with automatic indexing of the translated materials presumes the development of dictionaries which provide for the identification of key words and word combinations, followed by their translation into the descriptors of the search language. Three signs which show that a given word is a…

  6. New generation of human machine interfaces for controlling UAV through depth-based gesture recognition

    NASA Astrophysics Data System (ADS)

    Mantecón, Tomás; del Blanco, Carlos Roberto; Jaureguizar, Fernando; García, Narciso

    2014-06-01

    New forms of natural interaction between human operators and UAVs (unmanned aerial vehicles) are demanded by the military industry to achieve a better balance between UAV control and the burden on the human operator. In this work, a human-machine interface (HMI) based on a novel gesture recognition system using depth imagery is proposed for the control of UAVs. Hand gesture recognition based on depth imagery is a promising approach for HMIs because it is more intuitive, natural, and non-intrusive than alternatives using complex controllers. The proposed system is based on a Support Vector Machine (SVM) classifier that uses spatio-temporal depth descriptors as input features. The designed descriptor is based on a variation of the Local Binary Pattern (LBP) technique adapted to work efficiently with depth video sequences. Another major consideration is the specific hand sign language used for UAV control: a tradeoff between the use of natural hand signs and the minimization of inter-sign interference has been established. Promising results have been achieved on a depth-based database of hand gestures developed especially for the validation of the proposed system.
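The descriptor-plus-classifier pipeline described in this abstract can be illustrated in a few lines. The following is a minimal sketch, not the authors' implementation: it computes a plain single-frame LBP histogram over a depth frame (rather than the paper's spatio-temporal descriptor) and substitutes a nearest-centroid classifier for the SVM; the names `lbp_histogram` and `NearestCentroid` are illustrative assumptions.

```python
import numpy as np

def lbp_histogram(depth, bins=256):
    """8-neighbour Local Binary Pattern over a 2-D depth frame, returned
    as a normalised histogram (a simplified, single-frame stand-in for
    the paper's spatio-temporal depth descriptor)."""
    d = depth.astype(np.int32)
    h, w = d.shape
    center = d[1:-1, 1:-1]
    code = np.zeros_like(center)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = d[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neighbour >= center).astype(np.int32) << bit
    hist = np.bincount(code.ravel(), minlength=bins).astype(float)
    return hist / hist.sum()

class NearestCentroid:
    """Tiny stand-in for the SVM classifier used in the paper."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {l: np.mean([x for x, t in zip(X, y) if t == l], axis=0)
                          for l in self.labels}
        return self
    def predict(self, x):
        return min(self.labels,
                   key=lambda l: np.linalg.norm(x - self.centroids[l]))

# toy data: flat depth frames ("open_hand") vs noisy ones ("fist")
rng = np.random.default_rng(0)
flat = [lbp_histogram(np.full((16, 16), 90 + i)) for i in range(5)]
noisy = [lbp_histogram(rng.integers(0, 255, (16, 16))) for _ in range(5)]
clf = NearestCentroid().fit(flat + noisy, ["open_hand"] * 5 + ["fist"] * 5)
pred = clf.predict(lbp_histogram(rng.integers(0, 255, (16, 16))))  # → "fist"
```

A real system would stack such histograms over a temporal window and train an SVM on labelled gesture clips.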

  7. A biometric authentication model using hand gesture images.

    PubMed

    Fong, Simon; Zhuang, Yan; Fister, Iztok; Fister, Iztok, Jr.

    2013-10-30

    A novel hand biometric authentication method based on measurements of the user's stationary hand gestures from hand sign language is proposed. The hand gesture measurements can be acquired sequentially by a low-cost video camera, and another level of contextual information can be associated with these hand signs for use in biometric authentication. As an analogue, instead of typing the password 'iloveu' as text, which is relatively vulnerable over a communication network, a signer can encode a biometric password as a sequence of hand signs: 'i', 'l', 'o', 'v', 'e', and 'u'. Features, which are inherently fuzzy in nature, are then extracted from the hand gesture images and recognized by a classification model that determines whether the signer is who he claims to be by examining his hand shape and the postures used in making those signs. It is believed that everybody has slight but unique behavioral characteristics in sign language, as well as distinct hand shape compositions. Simple and efficient image processing algorithms are used for hand sign recognition, including intensity profiling, color histograms, and dimensionality analysis, coupled with several popular machine learning algorithms. Computer simulation investigating the efficacy of this novel biometric authentication model shows up to 93.75% recognition accuracy.
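The sign-sequence password idea lends itself to a compact sketch. The following is a hedged illustration, not the paper's method: enrolled per-sign feature templates stand in for the learned classifier, and `verify` accepts an attempt only when both the decoded sign sequence and the signer's hand-shape features match; the function name, the templates, and the 4-D feature vectors are all assumptions.

```python
import numpy as np

def verify(templates, password, observed_labels, observed_features, tol=1.0):
    """Accept only if (1) the decoded sign sequence equals the enrolled
    password and (2) each observed feature vector is close to the claimed
    user's enrolled template for that sign (their hand-shape 'style')."""
    if list(observed_labels) != list(password):
        return False
    for label, feat in zip(observed_labels, observed_features):
        if np.linalg.norm(feat - templates[label]) > tol:
            return False
    return True

password = list("iloveu")
# hypothetical enrolment: one 4-D hand-shape feature vector per sign
templates = {ch: np.full(4, float(ord(ch))) for ch in set(password)}

genuine = [templates[ch].copy() for ch in password]    # right signs, right style
impostor = [templates[ch] + 3.0 for ch in password]    # right signs, wrong hand shape
ok_genuine = verify(templates, password, password, genuine)          # True
ok_impostor = verify(templates, password, password, impostor)        # False
ok_wrong_seq = verify(templates, password, list("uevoli"), genuine)  # False
```

The two-factor flavor of the check (what sequence is signed, plus how the signer shapes each sign) mirrors the paper's claim that behavioral style itself carries biometric information.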

  8. Toward The Goal Of Video Deaf Communication Over Public Telephone Lines

    NASA Astrophysics Data System (ADS)

    Healy, Donald J.; Clements, Mark A.

    1986-11-01

    At least 500,000 profoundly deaf persons in the United States communicate primarily by American Sign Language (ASL), a language quite distinct from English and not well suited to writing. Currently, telephone communication for an ASL user is essentially limited to the use of a teletype machine, which requires both typing skills and proficiency in English. This paper reviews recent research relevant to the development of techniques that would allow manual communication across existing telephone channels using video imagery. Two possibilities for such manual communication are ASL and cued speech; the latter technique uses hand signals to aid lip reading. In either case, conventional television video transmission would require a bandwidth many times that available on a telephone channel. Achieving visual communication using sign language or cued speech at data rates below 10 kbps, low enough to be transmitted over a public telephone line, will require the development of new data-reduction algorithms. Avenues for future research toward this goal are presented.

  10. Sign Language Translator Application Using OpenCV

    NASA Astrophysics Data System (ADS)

    Triyono, L.; Pratisto, E. H.; Bawono, S. A. T.; Purnomo, F. A.; Yudhanto, Y.; Raharjo, B.

    2018-03-01

    This research focuses on the development of an Android-based sign language translator application using OpenCV; the application relies on color differences. The authors also use machine learning to predict the label. Results showed that the fingertip-coordinate search method can recognize hand gestures made with an open hand, while gestures made with a clenched hand are recognized using the Hu Moments value search method. The fingertip method is more resilient in gesture recognition, achieving a 95% success rate at distances of 35 cm and 55 cm, light intensities of approximately 90 lux and 100 lux, and a plain light-green background, compared with a 40% success rate for the Hu Moments method under the same parameters. Against outdoor backgrounds, however, the application still cannot be used, with only 6 successful recognitions and the rest failing.
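The fingertip-coordinate idea can be illustrated without OpenCV. This is a minimal sketch under two stated assumptions, not the application's actual code: a binary hand mask has already been obtained from color segmentation, and the hand is upright and open, so the topmost mask pixel approximates a fingertip.

```python
import numpy as np

def fingertip(mask):
    """Return (row, col) of the topmost 'hand' pixel in a binary mask.
    For an upright open hand this approximates a fingertip coordinate."""
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None          # no hand pixels segmented
    i = np.argmin(rows)      # smallest row index = highest point in the image
    return int(rows[i]), int(cols[i])

# toy 5x5 mask: a raised "finger" at column 2 above a palm blob
mask = np.array([[0, 0, 1, 0, 0],
                 [0, 0, 1, 0, 0],
                 [0, 1, 1, 1, 0],
                 [0, 1, 1, 1, 0],
                 [0, 1, 1, 1, 0]])
tip = fingertip(mask)  # → (0, 2)
```

In the application itself, OpenCV routines such as contour extraction and `cv2.HuMoments` would supply the mask and the clenched-hand shape descriptor.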

  11. Non parametric, self organizing, scalable modeling of spatiotemporal inputs: the sign language paradigm.

    PubMed

    Caridakis, G; Karpouzis, K; Drosopoulos, A; Kollias, S

    2012-12-01

    Modeling and recognizing spatiotemporal, as opposed to static, input is a challenging task since it incorporates input dynamics as part of the problem. The vast majority of existing methods tackle the problem as an extension of the static counterpart, using dynamics such as input derivatives at the feature level and adopting artificial intelligence and machine learning techniques originally designed for problems that do not specifically address the temporal aspect. The proposed approach deals with the temporal and spatial aspects of the spatiotemporal domain in a discriminative as well as coupled manner: Self-Organizing Maps (SOMs) model the spatial aspect of the problem, and Markov models capture its temporal counterpart. Incorporation of adjacency, both in training and classification, gives the overall architecture robustness and adaptability. The proposed scheme is validated both theoretically, through an error propagation study, and experimentally, on the recognition of individual signs performed by different native Greek Sign Language users. Results illustrate the architecture's superiority over Hidden Markov Model techniques and variations, both in terms of classification performance and computational cost. Copyright © 2012 Elsevier Ltd. All rights reserved.
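The SOM-plus-Markov split between space and time can be caricatured in a few lines. In this sketch (an assumption-laden stand-in, not the paper's architecture), a fixed codebook replaces the trained SOM, and a smoothed Markov transition matrix scores whether a new trajectory follows the learned sign's dynamics.

```python
import numpy as np

def quantize(points, codebook):
    """Map each 2-D trajectory point to its nearest codebook cell
    (a fixed-codebook stand-in for the SOM's spatial model)."""
    d = np.linalg.norm(points[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1)

def transition_matrix(symbols, k):
    """Row-normalised Markov transition counts with add-one smoothing
    (the temporal model over codebook cells)."""
    T = np.ones((k, k))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[a, b] += 1
    return T / T.sum(axis=1, keepdims=True)

def log_score(symbols, T):
    """Log-likelihood of a symbol sequence under the Markov model."""
    return sum(np.log(T[a, b]) for a, b in zip(symbols[:-1], symbols[1:]))

codebook = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])  # 4 spatial cells
# training trajectory for one sign: a repeated clockwise loop
train = np.array([[0, 0], [1, 0], [1, 1], [0, 1],
                  [0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
syms = quantize(train, codebook)
T = transition_matrix(syms, len(codebook))

loop = log_score(quantize(np.array([[0, 0], [1, 0], [1, 1]], float), codebook), T)
reverse = log_score(quantize(np.array([[1, 1], [1, 0], [0, 0]], float), codebook), T)
# the trained model prefers trajectories that follow the learned direction
```

A learned SOM would adapt the codebook to the data, and the paper's adjacency-aware training would additionally smooth transitions between neighbouring cells.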

  12. Labeled Graph Kernel for Behavior Analysis.

    PubMed

    Zhao, Ruiqi; Martinez, Aleix M

    2016-08-01

    Automatic behavior analysis from video is a major topic in many areas of research, including computer vision, multimedia, robotics, biology, cognitive science, social psychology, psychiatry, and linguistics. Two major problems are of interest when analyzing behavior. First, we wish to automatically categorize observed behaviors into a discrete set of classes (i.e., classification). For example, to determine word production from video sequences in sign language. Second, we wish to understand the relevance of each behavioral feature in achieving this classification (i.e., decoding). For instance, to know which behavior variables are used to discriminate between the words apple and onion in American Sign Language (ASL). The present paper proposes to model behavior using a labeled graph, where the nodes define behavioral features and the edges are labels specifying their order (e.g., before, overlaps, start). In this approach, classification reduces to a simple labeled graph matching. Unfortunately, the complexity of labeled graph matching grows exponentially with the number of categories we wish to represent. Here, we derive a graph kernel to quickly and accurately compute this graph similarity. This approach is very general and can be plugged into any kernel-based classifier. Specifically, we derive a Labeled Graph Support Vector Machine (LGSVM) and a Labeled Graph Logistic Regressor (LGLR) that can be readily employed to discriminate between many actions (e.g., sign language concepts). The derived approach can be readily used for decoding too, yielding invaluable information for the understanding of a problem (e.g., to know how to teach a sign language). The derived algorithms allow us to achieve higher accuracy results than those of state-of-the-art algorithms in a fraction of the time. We show experimental results on a variety of problems and datasets, including multimodal data.
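The labeled-graph-matching idea reduces, in its simplest form, to counting matching labeled substructures between two graphs. The sketch below is an illustration of that idea, not the paper's kernel: it compares two behavior graphs by counting shared (node label, edge label, node label) triples, and all graph contents are invented examples.

```python
from collections import Counter

def edge_label_kernel(g1, g2):
    """Count matching (node label, edge label, node label) triples between
    two labeled graphs.  Each graph is (node_labels dict, edge list of
    (u, v, edge_label)).  A crude instance of a labeled graph kernel:
    symmetric and positive, usable inside any kernel-based classifier."""
    def triples(g):
        labels, edges = g
        return Counter((labels[u], e, labels[v]) for u, v, e in edges)
    t1, t2 = triples(g1), triples(g2)
    return sum(c * t2[t] for t, c in t1.items())

# toy behavior graphs: nodes are behavioral features, edges are
# temporal relations ('before', 'overlaps')
g_apple = ({0: "hand_at_cheek", 1: "twist"}, [(0, 1, "before")])
g_apple2 = ({0: "hand_at_cheek", 1: "twist"}, [(0, 1, "before")])
g_onion = ({0: "hand_at_eye", 1: "twist"}, [(0, 1, "before")])

k_same = edge_label_kernel(g_apple, g_apple2)  # 1 shared triple
k_diff = edge_label_kernel(g_apple, g_onion)   # 0 shared triples
```

Plugging such a kernel into an SVM or logistic regressor, as the paper does with its (far richer) kernel, avoids ever performing explicit exponential-cost graph matching.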

  13. Monitoring Different Phonological Parameters of Sign Language Engages the Same Cortical Language Network but Distinctive Perceptual Ones.

    PubMed

    Cardin, Velia; Orfanidou, Eleni; Kästner, Lena; Rönnberg, Jerker; Woll, Bencie; Capek, Cheryl M; Rudner, Mary

    2016-01-01

    The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

  14. Writing Signed Languages: What For? What Form?

    PubMed

    Grushkin, Donald A

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed languages, and presents the potential advantages of writing signed languages. Following a history of the development of writing in spoken as well as signed language populations, the effects of orthographic types upon literacy and biliteracy are explored. Attempts at writing signed languages have followed two primary paths: "alphabetic" and "iconographic." It is argued that for greatest congruency and ease in developing biliteracy strategies in societies where an alphabetic script is used for the spoken language, signed language communities within these societies are best served by adoption of an alphabetic script for writing their signed language.

  15. Towards a Sign Language Synthesizer: a Bridge to Communication Gap of the Hearing/Speech Impaired Community

    NASA Astrophysics Data System (ADS)

    Maarif, H. A.; Akmeliawati, R.; Gunawan, T. S.; Shafie, A. A.

    2013-12-01

    A sign language synthesizer is a method for visualizing sign language movement from spoken language. Sign language (SL) is one of the means used by hearing/speech-impaired (HSI) people to communicate with hearing people, but unfortunately the number of people, including HSI people, who are familiar with sign language is very limited. This causes difficulties in communication between hearing people and HSI people. Sign language consists not only of hand movement but also of facial expression; the two elements complement each other, with hand movement conveying the meaning of each sign and facial expression conveying the emotion of the signer. Generally, a sign language synthesizer recognizes the spoken language using speech recognition, handles the grammatical process with a context-free grammar, and renders the result with a 3D synthesizer using a recorded avatar. This paper analyzes and compares the existing techniques for developing a sign language synthesizer, leading to the IIUM Sign Language Synthesizer.

  16. Exploring the Ancestral Roots of American Sign Language: Lexical Borrowing from Cistercian Sign Language and French Sign Language

    ERIC Educational Resources Information Center

    Cagle, Keith Martin

    2010-01-01

    American Sign Language (ASL) is the natural and preferred language of the Deaf community in both the United States and Canada. Woodward (1978) estimated that approximately 60% of the ASL lexicon is derived from early 19th century French Sign Language, which is known as "langue des signes francaise" (LSF). The lexicon of LSF and ASL may…

  17. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    PubMed

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  18. The road to language learning is iconic: evidence from British Sign Language.

    PubMed

    Thompson, Robin L; Vinson, David P; Woll, Bencie; Vigliocco, Gabriella

    2012-12-01

    An arbitrary link between linguistic form and meaning is generally considered a universal feature of language. However, iconic (i.e., nonarbitrary) mappings between properties of meaning and features of linguistic form are also widely present across languages, especially signed languages. Although recent research has shown a role for sign iconicity in language processing, research on the role of iconicity in sign-language development has been mixed. In this article, we present clear evidence that iconicity plays a role in sign-language acquisition for both the comprehension and production of signs. Signed languages were taken as a starting point because they tend to encode a higher degree of iconic form-meaning mappings in their lexicons than spoken languages do, but our findings are more broadly applicable: Specifically, we hypothesize that iconicity is fundamental to all languages (signed and spoken) and that it serves to bridge the gap between linguistic form and human experience.

  19. Signed Language Working Memory Capacity of Signed Language Interpreters and Deaf Signers

    ERIC Educational Resources Information Center

    Wang, Jihong; Napier, Jemina

    2013-01-01

    This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian sign language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an…

  20. Sign language comprehension: the case of Spanish sign language.

    PubMed

    Rodríguez Ortiz, I R

    2008-01-01

    This study aims to answer the question of how much of Spanish Sign Language interpreting deaf individuals really understand. The sample included 36 deaf people (deafness ranging from severe to profound, and varying in the age at which they learned sign language) and 36 hearing people with a good knowledge of sign language (most were interpreters). Sign language comprehension was assessed using passages at the secondary school level. After being exposed to the passages, participants had to recount what they had understood, answer a set of related questions, and offer a title for the passage. Sign language comprehension by deaf participants was quite acceptable, but not as good as that of hearing signers who, unlike the deaf participants, were not only late learners of sign language as a second language but had also learned it through formal training.

  1. What You Don't Know Can Hurt You: The Risk of Language Deprivation by Impairing Sign Language Development in Deaf Children.

    PubMed

    Hall, Wyatte C

    2017-05-01

    A long-standing belief is that sign language interferes with spoken language development in deaf children, despite a chronic lack of evidence supporting this belief. This deserves discussion, as poor life outcomes continue to be seen in the deaf population. This commentary synthesizes research outcomes with signing and non-signing children and highlights fully accessible language as a protective factor for healthy development. Brain changes associated with language deprivation may be misrepresented as sign language interfering with the spoken language outcomes of cochlear implants. This may lead professionals and organizations to advocate preventing sign language exposure before implantation and to spread misinformation. The existence of a time-sensitive language acquisition window means a strong possibility of permanent brain changes when spoken language is not fully accessible to the deaf child and sign language exposure is delayed, as is often standard practice. There is no empirical evidence for the harm of sign language exposure, but there is some evidence for its benefits, and there is growing evidence that lack of language access has negative implications, including cognitive delays, mental health difficulties, lower quality of life, higher trauma, and limited health literacy. Claims that cochlear implant- and spoken language-only approaches are more effective than sign language-inclusive approaches are not empirically supported. Cochlear implants are an unreliable standalone first-language intervention for deaf children. Priorities in deaf child development should focus on the healthy growth of all developmental domains through a fully accessible first-language foundation such as sign language, rather than on auditory deprivation and speech skills.

  2. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language.

    PubMed

    Williams, Joshua T; Newman, Sharlene D

    2017-02-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however, relatively few studies have generalized these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel language activation in M2L2 learners of sign language and to characterize the influence of spoken language and sign language neighborhood density on the activation of ASL signs. A priming paradigm was used in which the neighbors of the sign target were activated with a spoken English word, and the activation of targets in sparse and dense neighborhoods was compared. Neighborhood density effects in the auditory-primed lexical decision task were then compared to previous reports from deaf native signers who were processing only sign language. Results indicated reversed neighborhood density effects in M2L2 learners relative to deaf signers, with inhibitory effects of handshape density and facilitatory effects of location density. Additionally, the increased inhibition for signs in dense handshape neighborhoods was greater for high-proficiency L2 learners. These findings support recent models of the hearing bimodal bilingual lexicon, which posit lateral links between spoken language and sign language lexical representations.

  3. Language Policy and Planning: The Case of Italian Sign Language

    ERIC Educational Resources Information Center

    Geraci, Carlo

    2012-01-01

    Italian Sign Language (LIS) is the name of the language used by the Italian Deaf community. The acronym LIS derives from Lingua italiana dei segni ("Italian language of signs"), although nowadays Italians refer to LIS as Lingua dei segni italiana, reflecting the more appropriate phrasing "Italian sign language." Historically,…

  4. Signs of Change: Contemporary Attitudes to Australian Sign Language

    ERIC Educational Resources Information Center

    Slegers, Claudia

    2010-01-01

    This study explores contemporary attitudes to Australian Sign Language (Auslan). Since at least the 1960s, sign languages have been accepted by linguists as natural languages with all of the key ingredients common to spoken languages. However, these visual-spatial languages have historically been subject to ignorance and myth in Australia and…

  5. Early Sign Language Exposure and Cochlear Implantation Benefits.

    PubMed

    Geers, Ann E; Mitchell, Christine M; Warner-Czyz, Andrea; Wang, Nae-Yuh; Eisenberg, Laurie S

    2017-07-01

    Most children with hearing loss who receive cochlear implants (CI) learn spoken language, and parents must choose early on whether to use sign language to accompany speech at home. We address whether parents' use of sign language before and after CI positively influences auditory-only speech recognition, speech intelligibility, spoken language, and reading outcomes. Three groups of children with CIs from a nationwide database who differed in the duration of early sign language exposure provided in their homes were compared in their progress through elementary grades. The groups did not differ in demographic, auditory, or linguistic characteristics before implantation. Children without early sign language exposure achieved better speech recognition skills over the first 3 years postimplant and exhibited a statistically significant advantage in spoken language and reading near the end of elementary grades over children exposed to sign language. Over 70% of children without sign language exposure achieved age-appropriate spoken language compared with only 39% of those exposed for 3 or more years. Early speech perception predicted speech intelligibility in middle elementary grades. Children without sign language exposure produced speech that was more intelligible (mean = 70%) than those exposed to sign language (mean = 51%). This study provides the most compelling support yet available in CI literature for the benefits of spoken language input for promoting verbal development in children implanted by 3 years of age. Contrary to earlier published assertions, there was no advantage to parents' use of sign language either before or after CI. Copyright © 2017 by the American Academy of Pediatrics.

  7. Legal and Ethical Imperatives for Using Certified Sign Language Interpreters in Health Care Settings: How to "Do No Harm" When "It's (All) Greek" (Sign Language) to You.

    PubMed

    Nonaka, Angela M

    2016-09-01

    Communication obstacles in health care settings adversely impact patient-practitioner interactions by impeding service efficiency, reducing mutual trust and satisfaction, or even endangering health outcomes. When interlocutors are separated by language, interpreters are required. The efficacy of interpreting, however, is constrained not just by interpreters' competence but also by health care providers' facility working with interpreters. Deaf individuals whose preferred form of communication is a signed language often encounter communicative barriers in health care settings. In those environments, signing Deaf people are entitled to equal communicative access via sign language interpreting services according to the Americans with Disabilities Act and Executive Order 13166, the Limited English Proficiency Initiative. Yet, litigation in states across the United States suggests that individual and institutional providers remain uncertain about their legal obligations to provide equal communicative access. This article discusses the legal and ethical imperatives for using professionally certified (vs. ad hoc) sign language interpreters in health care settings. First outlining the legal terrain governing provision of sign language interpreting services, the article then describes different types of "sign language" (e.g., American Sign Language vs. manually coded English) and different forms of "sign language interpreting" (e.g., interpretation vs. transliteration vs. translation; simultaneous vs. consecutive interpreting; individual vs. team interpreting). This is followed by reviews of the formal credentialing process and of specialized forms of sign language interpreting, that is, certified deaf interpreting, trilingual interpreting, and court interpreting. After discussing practical steps for contracting professional sign language interpreters and addressing ethical issues of confidentiality, this article concludes by offering suggestions for working more effectively with Deaf clients via professional sign language interpreters.

  8. Sentence Repetition in Deaf Children with Specific Language Impairment in British Sign Language

    ERIC Educational Resources Information Center

    Marshall, Chloë; Mason, Kathryn; Rowley, Katherine; Herman, Rosalind; Atkinson, Joanna; Woll, Bencie; Morgan, Gary

    2015-01-01

    Children with specific language impairment (SLI) perform poorly on sentence repetition tasks across different spoken languages, but until now, this methodology has not been investigated in children who have SLI in a signed language. Users of a natural sign language encode different sentence meanings through their choice of signs and by altering…

  9. Tactile Signing with One-Handed Perception

    ERIC Educational Resources Information Center

    Mesch, Johanna

    2013-01-01

    Tactile signing among persons with deaf-blindness is not homogenous; rather, like other forms of language, it exhibits variation, especially in turn taking. Early analyses of tactile Swedish Sign Language, tactile Norwegian Sign Language, and tactile French Sign Language focused on tactile communication with four hands, in which partially blind or…

  10. Numeral Incorporation in Japanese Sign Language

    ERIC Educational Resources Information Center

    Ktejik, Mish

    2013-01-01

    This article explores the morphological process of numeral incorporation in Japanese Sign Language. Numeral incorporation is defined and the available research on numeral incorporation in signed language is discussed. The numeral signs in Japanese Sign Language are then introduced and followed by an explanation of the numeral morphemes which are…

  11. The Legal Recognition of Sign Languages

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2015-01-01

    This article provides an analytical overview of the different types of explicit legal recognition of sign languages. Five categories are distinguished: constitutional recognition, recognition by means of general language legislation, recognition by means of a sign language law or act, recognition by means of a sign language law or act including…

12. Development of Web-based Distributed Cooperative Development Environment of Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

A web-based distributed cooperative development environment for a sign-language animation system has been developed. It extends our previous animation system, which was constructed as a three-tiered system consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, have been added to the previous system. The system supports a humanoid-model avatar for interoperability and can use the stored sign-language animation data shared through the database. Evaluation showed that the web client's inverse kinematics function improves sign-language animation authoring.
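The three-tier layering described in this abstract (animation interface, data processing, animation database) can be illustrated with a minimal sketch. This is not the authors' implementation; all class and method names, and the toy "HELLO" animation data, are assumptions for illustration only.

```python
class AnimationDatabase:
    """Tier 3: stores shared sign-language animation data, keyed by gloss."""
    def __init__(self):
        self._store = {}

    def save(self, gloss, keyframes):
        self._store[gloss] = keyframes

    def load(self, gloss):
        return self._store.get(gloss, [])


class DataProcessingLayer:
    """Tier 2: resolves a sequence of glosses into animation frames."""
    def __init__(self, db):
        self.db = db

    def sequence_for(self, glosses):
        # Concatenate the stored keyframes for each gloss in order.
        return [frame for g in glosses for frame in self.db.load(g)]


class AnimationInterface:
    """Tier 1: the client that plays frames on the humanoid avatar."""
    def __init__(self, processing):
        self.processing = processing

    def play(self, glosses):
        return self.processing.sequence_for(glosses)


# Hypothetical usage: store one sign and play it back through the tiers.
db = AnimationDatabase()
db.save("HELLO", ["raise_hand", "wave"])
ui = AnimationInterface(DataProcessingLayer(db))
print(ui.play(["HELLO"]))  # prints ['raise_hand', 'wave']
```

The point of the layering is that the web client (tier 1) never touches the database directly; shared animation data can therefore be reused by any client that speaks to the processing layer.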

  13. On the Conventionalization of Mouth Actions in Australian Sign Language.

    PubMed

    Johnston, Trevor; van Roekel, Jane; Schembri, Adam

    2016-03-01

This study investigates the conventionalization of mouth actions in Australian Sign Language. Signed languages were once thought of as simply manual languages because the hands produce the signs which, individually and in groups, are the symbolic units most easily equated with the words, phrases and clauses of spoken languages. However, it has long been acknowledged that non-manual activity, such as movements of the body, head and face, plays a very important role. In this context, mouth actions that occur while communicating in signed languages have posed a number of questions for linguists: are the silent mouthings of spoken language words simply borrowings from the respective majority community spoken language(s)? Are those mouth actions that are not silent mouthings of spoken words conventionalized linguistic units proper to each signed language, culturally linked semi-conventional gestural units shared by signers with members of the majority speaking community, or even gestures and expressions common to all humans? We use a corpus-based approach to gather evidence of the extent of the use of mouth actions in naturalistic Australian Sign Language, making comparisons with other signed languages where data are available, and of the form/meaning pairings that these mouth actions instantiate.

  14. Comparing Action Gestures and Classifier Verbs of Motion: Evidence from Australian Sign Language, Taiwan Sign Language, and Nonsigners' Gestures without Speech

    ERIC Educational Resources Information Center

    Schembri, Adam; Jones, Caroline; Burnham, Denis

    2005-01-01

    Recent research into signed languages indicates that signs may share some properties with gesture, especially in the use of space in classifier constructions. A prediction of this proposal is that there will be similarities in the representation of motion events by sign-naive gesturers and by native signers of unrelated signed languages. This…

  15. Planning Sign Languages: Promoting Hearing Hegemony? Conceptualizing Sign Language Standardization

    ERIC Educational Resources Information Center

    Eichmann, Hanna

    2009-01-01

    In light of the absence of a codified standard variety in British Sign Language and German Sign Language ("Deutsche Gebardensprache") there have been repeated calls for the standardization of both languages primarily from outside the Deaf community. The paper is based on a recent grounded theory study which explored perspectives on sign…

  16. THE PARADOX OF SIGN LANGUAGE MORPHOLOGY

    PubMed Central

    Aronoff, Mark; Meir, Irit; Sandler, Wendy

    2011-01-01

Sign languages have two strikingly different kinds of morphological structure: sequential and simultaneous. The simultaneous morphology of two unrelated sign languages, American and Israeli Sign Language, is very similar and is largely inflectional, while what little sequential morphology we have found differs significantly and is derivational. We show that at least two pervasive types of inflectional morphology, verb agreement and classifier constructions, are iconically grounded in spatiotemporal cognition, while the sequential patterns can be traced to normal historical development. We attribute the paucity of sequential morphology in sign languages to their youth. This research both brings sign languages much closer to spoken languages in their morphological structure and shows how the medium of communication contributes to the structure of languages. PMID:22223926

  17. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013) pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. 
For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account. These results demonstrate that experience of sign language enhances the ability to imitate manual gestures once representations have been established, and suggest that the inherent motor patterns of lexical manual gestures are better suited for representation than those of non-signs. This set of findings prompts a developmental version of the ELU model, D-ELU.

  18. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013) pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. 
For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account. These results demonstrate that experience of sign language enhances the ability to imitate manual gestures once representations have been established, and suggest that the inherent motor patterns of lexical manual gestures are better suited for representation than those of non-signs. This set of findings prompts a developmental version of the ELU model, D-ELU. PMID:26909050

  19. Direction Asymmetries in Spoken and Signed Language Interpreting

    ERIC Educational Resources Information Center

    Nicodemus, Brenda; Emmorey, Karen

    2013-01-01

    Spoken language (unimodal) interpreters often prefer to interpret from their non-dominant language (L2) into their native language (L1). Anecdotally, signed language (bimodal) interpreters express the opposite bias, preferring to interpret from L1 (spoken language) into L2 (signed language). We conducted a large survey study ("N" =…

  20. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    ERIC Educational Resources Information Center

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further,…

  1. Grammar, Gesture, and Meaning in American Sign Language.

    ERIC Educational Resources Information Center

    Liddell, Scott K.

    In sign languages of the Deaf, now recognized as fully legitimate human languages, some signs can meaningfully point toward things or can be meaningfully placed in the space ahead of the signer. Such spatial uses of sign are an obligatory part of fluent grammatical signing. There is no parallel for this in vocally produced languages. This book…

2. 9. Detail, sign at east corner of Roundhouse Machine Shop ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

9. Detail, sign at east corner of Roundhouse Machine Shop Extension, Southern Pacific Railroad Carlin Shops, view to northwest (210mm lens). Sign reads, 'Open Valve To Supply Water To City.' The railroad could supply water to all of Carlin. - Southern Pacific Railroad, Carlin Shops, Roundhouse Machine Shop Extension, Foot of Sixth Street, Carlin, Elko County, NV

  3. A Stronger Reason for the Right to Sign Languages

    ERIC Educational Resources Information Center

    Trovato, Sara

    2013-01-01

    Is the right to sign language only the right to a minority language? Holding a capability (not a disability) approach, and building on the psycholinguistic literature on sign language acquisition, I make the point that this right is of a stronger nature, since only sign languages can guarantee that each deaf child will properly develop the…

  4. How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language

    PubMed Central

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Grabowski, Thomas J.

    2014-01-01

To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Directly contrasting speech and sign production revealed greater activation in bilateral parietal cortex for signing, while speaking resulted in greater activation in bilateral superior temporal cortex (STC) and right frontal cortex, likely reflecting auditory feedback control. Surprisingly, the language production contrast revealed a relative increase in activation in bilateral occipital cortex for speaking. We speculate that greater activation in visual cortex for speaking may actually reflect cortical attenuation when signing, which functions to distinguish self-produced from externally generated visual input. Directly contrasting speech and sign comprehension revealed greater activation in bilateral STC for speech and greater activation in bilateral occipital-temporal cortex for sign. Sign comprehension, like sign production, engaged bilateral parietal cortex to a greater extent than spoken language. We hypothesize that posterior parietal activation in part reflects processing related to spatial classifier constructions in ASL and that anterior parietal activation may reflect covert imitation that functions as a predictive model during sign comprehension. The conjunction analysis for comprehension revealed that both speech and sign bilaterally engaged the inferior frontal gyrus (with more extensive activation on the left) and the superior temporal sulcus, suggesting an invariant bilateral perisylvian language system.
We conclude that surface level differences between sign and spoken languages should not be dismissed and are critical for understanding the neurobiology of language. PMID:24904497

  5. Do warning signs on electronic gaming machines influence irrational cognitions?

    PubMed

    Monaghan, Sally; Blaszczynski, Alex; Nower, Lia

    2009-08-01

    Electronic gaming machines are popular among problem gamblers; in response, governments have introduced "responsible gaming" legislation incorporating the mandatory display of warning signs on or near electronic gaming machines. These signs are designed to correct irrational and erroneous beliefs through the provision of accurate information on probabilities of winning and the concept of randomness. There is minimal empirical data evaluating the effectiveness of such signs. In this study, 93 undergraduate students were randomly allocated to standard and informative messages displayed on an electronic gaming machine during play in a laboratory setting. Results revealed that a majority of participants incorrectly estimated gambling odds and reported irrational gambling-related cognitions prior to play. In addition, there were no significant between-group differences, and few participants recalled the content of messages or modified their gambling-related cognitions. Signs placed on electronic gaming machines may not modify irrational beliefs or alter gambling behaviour.

  6. Discourses of prejudice in the professions: the case of sign languages

    PubMed Central

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-01-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected. PMID:28280057

  7. Spatial and Facial Processing in the Signed Discourse of Two Groups of Deaf Signers with Clinical Language Impairment

    ERIC Educational Resources Information Center

    Penn, Claire; Commerford, Ann; Ogilvy, Dale

    2007-01-01

    The linguistic and cognitive profiles of five deaf adults with a sign language disorder were compared with those of matched deaf controls. The test involved a battery of sign language tests, a signed narrative discourse task and a neuropsychological test protocol administered in sign language. Spatial syntax and facial processing were examined in…

  8. One grammar or two? Sign Languages and the Nature of Human Language

    PubMed Central

    Lillo-Martin, Diane C; Gajewski, Jon

    2014-01-01

    Linguistic research has identified abstract properties that seem to be shared by all languages—such properties may be considered defining characteristics. In recent decades, the recognition that human language is found not only in the spoken modality but also in the form of sign languages has led to a reconsideration of some of these potential linguistic universals. In large part, the linguistic analysis of sign languages has led to the conclusion that universal characteristics of language can be stated at an abstract enough level to include languages in both spoken and signed modalities. For example, languages in both modalities display hierarchical structure at sub-lexical and phrasal level, and recursive rule application. However, this does not mean that modality-based differences between signed and spoken languages are trivial. In this article, we consider several candidate domains for modality effects, in light of the overarching question: are signed and spoken languages subject to the same abstract grammatical constraints, or is a substantially different conception of grammar needed for the sign language case? We look at differences between language types based on the use of space, iconicity, and the possibility for simultaneity in linguistic expression. The inclusion of sign languages does support some broadening of the conception of human language—in ways that are applicable for spoken languages as well. Still, the overall conclusion is that one grammar applies for human language, no matter the modality of expression. PMID:25013534

  9. Sign Language Planning: Pragmatism, Pessimism and Principles

    ERIC Educational Resources Information Center

    Turner, Graham H.

    2009-01-01

    This article introduces the present collection of sign language planning studies. Contextualising the analyses against the backdrop of core issues in the theory of language planning and the evolution of applied sign linguistics, it is argued that--while the sociolinguistic circumstances of signed languages worldwide can, in many respects, be…

  10. Problems for a Sign Language Planning Agency

    ERIC Educational Resources Information Center

    Covington, Virginia

    1977-01-01

    American Sign Language is chiefly untaught and nonstandardized. The Communicative Skills Program of the National Association of the Deaf aims to provide sign language classes for hearing personnel and to increase interpreting services. Programs, funding and aims of the Program are outlined. A government sign language planning agency is proposed.…

  11. Absorption of language concepts in the machine mind

    NASA Astrophysics Data System (ADS)

    Kollár, Ján

    2016-06-01

In our approach, the machine mind is an applicative dynamic system represented by its algorithmically evolvable internal language. In other words, the mind and the language of mind are synonyms. Building on Shaumyan's semiotic theory of languages, we present the representation of language concepts in the machine mind as a result of our experiment, showing the non-redundancy of the language of mind. To provide a useful restriction for further research, we also introduce the hypothesis of semantic saturation in Computer-Computer communication, which indicates that a set of machines is not self-evolvable. The goal of our research is to increase the abstraction of Human-Computer and Computer-Computer communication. If we want humans and machines to communicate as a parent does with a child, using different symbols and media, we must find a language of mind commonly usable by both machines and humans. In our opinion, there exists a kind of calm language of thinking, which we try to propose for machines in this paper. We separate the layers of the machine mind, present the structure of the evolved mind, and discuss selected properties. We concentrate on the representation of symbolized concepts in the mind, which are languages, not just grammars, since they have meaning.

  12. A Kinect based sign language recognition system using spatio-temporal features

    NASA Astrophysics Data System (ADS)

    Memiş, Abbas; Albayrak, Songül

    2013-12-01

This paper presents a sign language recognition system that uses spatio-temporal features on RGB video images and depth maps for dynamic gestures of Turkish Sign Language. The proposed system uses motion differences and an accumulation approach for temporal gesture analysis. The motion accumulation method, an effective method for temporal-domain analysis of gestures, produces an accumulated motion image by combining the differences of successive video frames. A 2D Discrete Cosine Transform (DCT) is then applied to the accumulated motion images, transforming the temporal-domain features into the spatial domain. These processes are performed separately on the RGB images and the depth maps. The DCT coefficients that represent sign gestures are picked up via zigzag scanning, and feature vectors are generated. To recognize sign gestures, a K-Nearest Neighbor classifier with Manhattan distance is applied. The performance of the proposed system is evaluated on a sign database containing 1002 isolated dynamic signs belonging to 111 words of Turkish Sign Language (TSL) in three different categories, and it achieves promising success rates.
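The pipeline this abstract describes, frame differencing and accumulation, 2D DCT, zigzag selection of low-frequency coefficients, and K-Nearest-Neighbor classification with Manhattan distance, can be sketched compactly. This is a toy illustration of the technique, not the authors' implementation: the function names, the 4x4 frame sequences, and the choice of 6 coefficients are all assumptions.

```python
import math

def accumulate_motion(frames):
    """Sum absolute differences of successive frames (accumulated motion image)."""
    h, w = len(frames[0]), len(frames[0][0])
    acc = [[0.0] * w for _ in range(h)]
    for prev, cur in zip(frames, frames[1:]):
        for i in range(h):
            for j in range(w):
                acc[i][j] += abs(cur[i][j] - prev[i][j])
    return acc

def dct2(block):
    """Naive (unscaled) 2D DCT-II; adequate for small accumulated-motion images."""
    n, m = len(block), len(block[0])
    return [[sum(block[i][j]
                 * math.cos(math.pi * (2 * i + 1) * u / (2 * n))
                 * math.cos(math.pi * (2 * j + 1) * v / (2 * m))
                 for i in range(n) for j in range(m))
             for v in range(m)] for u in range(n)]

def zigzag(mat, k):
    """First k coefficients in zigzag (low-frequency-first) order."""
    n, m = len(mat), len(mat[0])
    order = []
    for d in range(n + m - 1):
        diag = [(i, d - i) for i in range(max(0, d - m + 1), min(n, d + 1))]
        order.extend(reversed(diag) if d % 2 == 0 else diag)
    return [mat[i][j] for i, j in order[:k]]

def knn_manhattan(train, query, k=1):
    """k-NN with Manhattan (L1) distance; majority vote over the k nearest."""
    dists = sorted((sum(abs(a - b) for a, b in zip(feat, query)), label)
                   for feat, label in train)
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)

def features(frames, k=6):
    """Full pipeline: accumulate motion, DCT, keep k low-frequency coefficients."""
    return zigzag(dct2(accumulate_motion(frames)), k)

# Toy 4x4 "videos": sign A moves a bright pixel horizontally, sign B vertically.
def seq(points):
    frames = []
    for r, c in points:
        frame = [[0] * 4 for _ in range(4)]
        frame[r][c] = 255
        frames.append(frame)
    return frames

sign_a = seq([(1, 0), (1, 1), (1, 2), (1, 3)])  # horizontal motion
sign_b = seq([(0, 1), (1, 1), (2, 1), (3, 1)])  # vertical motion
train = [(features(sign_a), "A"), (features(sign_b), "B")]

query = seq([(2, 0), (2, 1), (2, 2), (2, 3)])   # another horizontal sign
print(knn_manhattan(train, features(query)))    # prints A
```

A real system would compute this over full-resolution RGB and depth frames (using a fast DCT) and concatenate the two feature vectors, but the structure of the pipeline is the same.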

  13. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who were deaf before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is the language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, recognizing and reproducing signs in these systems is, in general, an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system requires sufficiently explicit information, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation comes close to reality. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand the structure and grammar of sign languages.
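The abstract lists the information an animation-oriented transcription record must make explicit: movement speed, sign concatenation, hold-and-movement sequences, and facial expressions. A hedged sketch of such a record might look like the following; every field name and value here is an assumption for illustration, not the notation proposed in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One unit in the hold-and-movement sequence of a sign."""
    kind: str                 # "hold" or "movement"
    handshape: str            # e.g. a handshape label from some inventory
    location: str             # target location or direction of the segment
    speed: float = 1.0        # relative movement speed (1.0 = default tempo)

@dataclass
class SignTranscription:
    """Machine-readable record explicit enough to drive a 3D avatar."""
    gloss: str
    segments: list = field(default_factory=list)  # ordered hold/movement units
    facial_expression: str = "neutral"
    concatenate_with_next: bool = False  # blend smoothly into the next sign

# Hypothetical example record for a single sign.
sign = SignTranscription(
    gloss="THANK-YOU",
    segments=[
        Segment("hold", handshape="flat-B", location="chin"),
        Segment("movement", handshape="flat-B", location="forward", speed=0.8),
    ],
    facial_expression="smile",
)
```

The design point is that, unlike transcription systems written for human readers, every parameter an animation engine needs is stored explicitly rather than left to the reader's knowledge of the system.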

  14. Bimodal bilingualism as multisensory training?: Evidence for improved audiovisual speech perception after sign language exposure.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-15

The aim of the present study was to characterize the effects of learning a sign language on the processing of a spoken language. Specifically, audiovisual phoneme comprehension was assessed before and after 13 weeks of sign language exposure. L2 American Sign Language (ASL) learners performed this task in the fMRI scanner. Results indicated that the L2 ASL learners' behavioral classification of the speech sounds improved with time compared to hearing nonsigners. Results also indicated increased activation in the supramarginal gyrus (SMG) after sign language exposure, which suggests concomitantly increased phonological processing of speech. A multiple regression analysis indicated that learners' ratings of co-sign speech use and lipreading ability were correlated with SMG activation. This pattern of results indicates that the increased use of mouthing, and possibly lipreading, during sign language acquisition may concurrently improve audiovisual speech processing in budding hearing bimodal bilinguals. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions.

    PubMed

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers' comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media.

  16. Operationalization of Sign Language Phonological Similarity and Its Effects on Lexical Access

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Stone, Adam; Newman, Sharlene D.

    2017-01-01

    Cognitive mechanisms for sign language lexical access are fairly unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data in order to…

  17. Dictionaries of African Sign Languages: An Overview

    ERIC Educational Resources Information Center

    Schmaling, Constanze H.

    2012-01-01

    This article gives an overview of dictionaries of African sign languages that have been published to date most of which have not been widely distributed. After an introduction into the field of sign language lexicography and a discussion of some of the obstacles that authors of sign language dictionaries face in general, I will show problems…

  18. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    ERIC Educational Resources Information Center

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, nonarbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than…

  19. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately…

  20. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    PubMed

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.
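    The review above mentions Hidden Markov Modeling as one approach to sign recognition. As a minimal illustration of the idea, the sketch below scores a 1-D feature track (e.g., a hand-position trace) against per-sign Gaussian-emission HMMs using the forward algorithm and picks the best-fitting model; the two toy "sign" models, their parameters, and the feature values are all invented for illustration, not taken from any of the systems reviewed.

```python
import numpy as np

def hmm_log_likelihood(obs, start, trans, means, var):
    """Forward-algorithm likelihood of a 1-D observation sequence
    under a Gaussian-emission HMM; higher means a better fit."""
    def emit(x):
        # Gaussian emission density of observation x for each hidden state
        return np.exp(-0.5 * (x - means) ** 2 / var) / np.sqrt(2 * np.pi * var)
    alpha = start * emit(obs[0])           # forward probabilities at t = 0
    for x in obs[1:]:
        alpha = (alpha @ trans) * emit(x)  # propagate, then weight by emission
    return np.log(alpha.sum())

# Two hypothetical per-sign models sharing the same 2-state topology.
start = np.array([1.0, 0.0])
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
model_a = dict(means=np.array([0.0, 1.0]), var=0.2)  # toy sign "A"
model_b = dict(means=np.array([5.0, 6.0]), var=0.2)  # toy sign "B"

obs = np.array([0.1, 0.2, 0.9, 1.1])  # feature track resembling sign "A"
score_a = hmm_log_likelihood(obs, start, trans, **model_a)
score_b = hmm_log_likelihood(obs, start, trans, **model_b)
best = "A" if score_a > score_b else "B"
```

    In a real recognizer the observations would be multivariate features extracted from gloves or video, and the model parameters would be trained (e.g., by Baum-Welch) rather than hand-set.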

  1. The Road to Language Learning Is Not Entirely Iconic: Iconicity, Neighborhood Density, and Frequency Facilitate Acquisition of Sign Language.

    PubMed

    Caselli, Naomi K; Pyers, Jennie E

    2017-07-01

    Iconic mappings between words and their meanings are far more prevalent than once estimated and seem to support children's acquisition of new words, spoken or signed. We asked whether iconicity's prevalence in sign language overshadows two other factors known to support the acquisition of spoken vocabulary: neighborhood density (the number of lexical items phonologically similar to the target) and lexical frequency. Using mixed-effects logistic regressions, we reanalyzed 58 parental reports of native-signing deaf children's productive acquisition of 332 signs in American Sign Language (ASL; Anderson & Reilly, 2002) and found that iconicity, neighborhood density, and lexical frequency independently facilitated vocabulary acquisition. Despite differences in iconicity and phonological structure between signed and spoken language, signing children, like children learning a spoken language, track statistical information about lexical items and their phonological properties and leverage this information to expand their vocabulary.
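    The regression analysis described above can be sketched in miniature. The toy example below fits a plain logistic regression by gradient ascent to synthetic data in which iconicity, neighborhood density, and frequency each independently raise the probability that a sign is produced; the study itself used mixed-effects models with random effects for children and signs, which are omitted here, and all effect sizes are invented.

```python
import numpy as np

def fit_logistic(X, y, lr=1.0, steps=3000):
    """Logistic regression via batch gradient ascent on the log-likelihood.
    Returns [intercept, coefficients...]."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted P(produced)
        w += lr * Xb.T @ (y - p) / len(y)       # gradient of log-likelihood
    return w

rng = np.random.default_rng(0)
n = 2000
# Synthetic standardized predictors: iconicity, density, frequency.
X = rng.normal(size=(n, 3))
true_w = np.array([1.0, 0.8, 0.5])              # hypothetical effect sizes
p_true = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = rng.binomial(1, p_true).astype(float)       # 1 = child produces the sign

w = fit_logistic(X, y)
# Each fitted coefficient should come out positive, mirroring the
# independent facilitation the abstract reports.
```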

  2. Hand and mouth: Cortical correlates of lexical processing in British Sign Language and speechreading English

    PubMed Central

    Capek, Cheryl M.; Waters, Dafydd; Woll, Bencie; MacSweeney, Mairéad; Brammer, Michael J.; McGuire, Philip K.; David, Anthony S.; Campbell, Ruth

    2012-01-01

    Spoken languages use one set of articulators – the vocal tract, whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used fMRI to compare the processing of speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common perisylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed reflecting the language form. Speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the parieto-occipito-temporal junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different sorts of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, while signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are not only differentially sensitive to perception of the distinctive articulators for speech and for sign, but also show sensitivity to the different articulators within the (signed) language. PMID:18284353

  3. The Influence of Deaf People's Dual Category Status on Sign Language Planning: The British Sign Language (Scotland) Act (2015)

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    Through the British Sign Language (Scotland) Act, British Sign Language (BSL) was given legal status in Scotland. The main motives for the Act were a desire to put BSL on a similar footing with Gaelic and the fact that in Scotland, BSL signers are the only group whose first language is not English who must rely on disability discrimination…

  4. Visual cortex entrains to sign language.

    PubMed

    Brookshire, Geoffrey; Lu, Jenny; Nusbaum, Howard C; Goldin-Meadow, Susan; Casasanto, Daniel

    2017-06-13

    Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language <5 Hz, peaking at ~1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over the auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
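    The pipeline the abstract describes, reducing video to a visual-change time series and measuring spectral coherence between that series and EEG, can be sketched as follows. This is a simplified illustration on synthetic data: the frame-difference metric and the segment-averaged coherence estimator are generic stand-ins, not the authors' exact methods.

```python
import numpy as np

def visual_change(frames):
    """Mean absolute pixel change between consecutive frames: a crude
    visual analogue of the acoustic envelope."""
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

def coherence(x, y, fs, nperseg):
    """Magnitude-squared coherence from segment-averaged spectra."""
    nseg = len(x) // nperseg
    X = np.fft.rfft(x[: nseg * nperseg].reshape(nseg, nperseg), axis=1)
    Y = np.fft.rfft(y[: nseg * nperseg].reshape(nseg, nperseg), axis=1)
    Sxy = (X * np.conj(Y)).mean(axis=0)          # averaged cross-spectrum
    Sxx = (np.abs(X) ** 2).mean(axis=0)          # averaged auto-spectra
    Syy = (np.abs(Y) ** 2).mean(axis=0)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.abs(Sxy) ** 2 / (Sxx * Syy)

rng = np.random.default_rng(0)

# Tiny demo of the frame-difference metric on random 4x4 "video" frames.
frames = rng.random((10, 4, 4))
vc = visual_change(frames)                       # one value per frame pair

# Synthetic coherence example: a shared ~1 Hz rhythm drives both the
# "video" signal and the "EEG" channel, each with independent noise.
fs, seconds = 100, 60
t = np.arange(fs * seconds) / fs
rhythm = np.sin(2 * np.pi * 1.0 * t)
video_signal = rhythm + 0.5 * rng.normal(size=t.size)
eeg_channel = rhythm + 0.5 * rng.normal(size=t.size)

freqs, coh = coherence(video_signal, eeg_channel, fs, nperseg=200)
# With nperseg=200 at fs=100 Hz, bins are 0.5 Hz wide, so index 2 is 1 Hz:
# coherence should peak there and stay near chance elsewhere.
```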

  5. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development. PMID:28961874

  6. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-10-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development.

  7. ERP correlates of German Sign Language processing in deaf native signers.

    PubMed

    Hänel-Faulhaber, Barbara; Skotara, Nils; Kügow, Monique; Salden, Uta; Bottari, Davide; Röder, Brigitte

    2014-05-10

    The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language.

  8. ERP correlates of German Sign Language processing in deaf native signers

    PubMed Central

    2014-01-01

    Background The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Results Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. Conclusions ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language. PMID:24884527

  9. Type of Iconicity Matters in the Vocabulary Development of Signing Children

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Sümer, Beyza; Özyürek, Asli

    2017-01-01

    Recent research on signed as well as spoken language shows that the iconic features of the target language might play a role in language development. Here, we ask further whether different types of iconic depictions modulate children's preferences for certain types of sign-referent links during vocabulary development in sign language. Results from…

  10. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  11. Approaching Sign Language Test Construction: Adaptation of the German Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired…

  12. Similarities & Differences in Two Brazilian Sign Languages.

    ERIC Educational Resources Information Center

    Ferreira-Brito, Lucinda

    1984-01-01

    A comparison of sign language used by Urubu-Kaapor Indians in the Amazonian jungle (UKSL) and sign language used by deaf people in Sao Paulo (SPSL). In the former situation, deaf people are more integrated and accepted into their community than in Sao Paulo, because most hearing individuals are able and willing to use sign language to communicate with…

  13. Motives and Outcomes of New Zealand Sign Language Legislation: A Comparative Study between New Zealand and Finland

    ERIC Educational Resources Information Center

    Reffell, Hayley; McKee, Rachel Locker

    2009-01-01

    The medicalized interpretation of deafness has until recently seen the rights and protections of sign language users embedded in disability law. Yet the rights and protections crucial to sign language users centre predominantly on matters of language access, maintenance and identity. Legislators, motivated by pressure from sign language…

  14. Sociolinguistic Typology and Sign Languages.

    PubMed

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological 'complexity' and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological 'complexification'), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored.

  15. Eye Gaze in Creative Sign Language

    ERIC Educational Resources Information Center

    Kaneko, Michiko; Mesch, Johanna

    2013-01-01

    This article discusses the role of eye gaze in creative sign language. Because eye gaze conveys various types of linguistic and poetic information, it is an intrinsic part of sign language linguistics in general and of creative signing in particular. We discuss various functions of eye gaze in poetic signing and propose a classification of gaze…

  16. Sign Language with Babies: What Difference Does It Make?

    ERIC Educational Resources Information Center

    Barnes, Susan Kubic

    2010-01-01

    Teaching sign language--to deaf or other children with special needs or to hearing children with hard-of-hearing family members--is not new. Teaching sign language to typically developing children has become increasingly popular since the publication of "Baby Signs"[R] (Goodwyn & Acredolo, 1996), now in its third edition. Attention to signing with…

  17. Sign language aphasia due to left occipital lesion in a deaf signer.

    PubMed

    Saito, Kozue; Otsuki, Mika; Ueno, Satoshi

    2007-10-02

    Localization of sign language production and comprehension in deaf people has been described as similar to that of spoken language aphasia. However, sign language employs a visuospatial modality through visual information. We present the first report of a deaf signer who showed substantial sign language aphasia with severe impairment in word production due to a left occipital lesion. This case may indicate the possibility of other localizations of plasticity.

  18. The role of syllables in sign language production.

    PubMed

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Signed Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production.

  19. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing.

  20. Discourses of prejudice in the professions: the case of sign languages.

    PubMed

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-09-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected.

  1. Sensitivity to visual prosodic cues in signers and nonsigners.

    PubMed

    Brentari, Diane; González, Carolina; Seidl, Amanda; Wilbur, Ronnie

    2011-03-01

    Three studies are presented in this paper that address how nonsigners perceive the visual prosodic cues in a sign language. In Study 1, adult American nonsigners and users of American Sign Language (ASL) were compared on their sensitivity to the visual cues in ASL Intonational Phrases. In Study 2, hearing, nonsigning American infants were tested using the same stimuli used in Study 1 to see whether maturity, exposure to gesture, or exposure to sign language is necessary to demonstrate this type of sensitivity. Study 3 addresses nonsigners' and signers' strategies for segmenting Prosodic Words in a sign language. Adult participants from six language groups (3 spoken languages and 3 sign languages) were tested. The results of these three studies indicate that nonsigners have a high degree of sensitivity to sign language prosodic cues at the Intonational Phrase level and the Prosodic Word level; these are attributed to modality or 'channel' effects of the visual signal. There are also some differences between signers' and nonsigners' sensitivity; these differences are attributed to language experience or language-particular constraints. This work is useful in understanding the gestural competence of nonsigners and the ways in which this type of competence may contribute to the grammaticalization of these properties in a sign language.

  2. Event representations constrain the structure of language: Sign language as a window into universally accessible linguistic biases.

    PubMed

    Strickland, Brent; Geraci, Carlo; Chemla, Emmanuel; Schlenker, Philippe; Kelepir, Meltem; Pfau, Roland

    2015-05-12

    According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., "decide," "sell," "die") encode a logical endpoint, whereas atelic verbs (e.g., "think," "negotiate," "run") do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1-5, nonsigning English speakers accurately distinguished between telic (e.g., "decide") and atelic (e.g., "think") signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7-10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible "mapping biases" between telicity and visual form.

  3. Medical Signbank as a Model for Sign Language Planning? A Review of Community Engagement

    ERIC Educational Resources Information Center

    Napier, Jemina; Major, George; Ferrara, Lindsay; Johnston, Trevor

    2015-01-01

    This paper reviews a sign language planning project conducted in Australia with deaf Auslan users. The Medical Signbank project utilised a cooperative language planning process to engage with the Deaf community and sign language interpreters to develop an online interactive resource of health-related signs, in order to address a gap in the health…

  4. Is Teaching Sign Language in Early Childhood Classrooms Feasible for Busy Teachers and Beneficial for Children?

    ERIC Educational Resources Information Center

    Brereton, Amy Elizabeth

    2010-01-01

    Infants' hands are ready to construct words using sign language before their mouths are ready to speak. These research findings may explain the popularity of parents and caregivers teaching and using sign language with infants and toddlers, along with speech. The advantages of using sign language with young children go beyond the infant and…

  5. Use of Information and Communication Technologies in Sign Language Test Development: Results of an International Survey

    ERIC Educational Resources Information Center

    Haug, Tobias

    2015-01-01

    Sign language test development is a relatively new field within sign linguistics, motivated by the practical need for assessment instruments to evaluate language development in different groups of learners (L1, L2). Due to the lack of research on the structure and acquisition of many sign languages, developing an assessment instrument poses…

  6. Adaptation of a Vocabulary Test from British Sign Language to American Sign Language

    ERIC Educational Resources Information Center

    Mann, Wolfgang; Roy, Penny; Morgan, Gary

    2016-01-01

    This study describes the adaptation process of a vocabulary knowledge test for British Sign Language (BSL) into American Sign Language (ASL) and presents results from the first round of pilot testing with 20 deaf native ASL signers. The web-based test assesses the strength of deaf children's vocabulary knowledge by means of different mappings of…

  7. The Neural Correlates of Highly Iconic Structures and Topographic Discourse in French Sign Language as Observed in Six Hearing Native Signers

    ERIC Educational Resources Information Center

    Courtin, C.; Herve, P. -Y.; Petit, L.; Zago, L.; Vigneau, M.; Beaucousin, V.; Jobard, G.; Mazoyer, B.; Mellet, E.; Tzourio-Mazoyer, N.

    2010-01-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four-dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language specific manner via the use of signing-space and…

  8. Sign Language and Language Acquisition in Man and Ape. New Dimensions in Comparative Pedolinguistics.

    ERIC Educational Resources Information Center

    Peng, Fred C. C., Ed.

    A collection of research materials on sign language and primatology is presented here. The essays attempt to show that: sign language is a legitimate language that can be learned not only by humans but by nonhuman primates as well, and nonhuman primates have the capability to acquire a human language using a different mode. The following…

  9. From the World's Trouble Spots They Arrive in Our Classrooms: Working with Deaf Refugees and Immigrants

    ERIC Educational Resources Information Center

    Moers, Pamela Wright

    2017-01-01

    Pamela Wright Moers has worked with American Sign Language (ASL) and English language instruction for over 25 years, and both her work and her studies have focused on the various uses of language. Her research has been on language endangerment, diversity in sign language, third-world sign languages, and the phonological and semantic structures…

  10. How do typically developing deaf children and deaf children with autism spectrum disorder use the face when comprehending emotional facial expressions in British sign language?

    PubMed

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-10-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.

  11. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions

    PubMed Central

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers’ comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media. PMID:26010899

  12. Sign Language and Hearing Preschoolers.

    ERIC Educational Resources Information Center

    Reynolds, Kate E.

    1995-01-01

    Notes that sign language is the third most used second language in the United States and that early childhood is an ideal language-learning time. Describes the experiences of one preschool where American Sign Language has become an integral part of the curriculum. Includes guiding principles, classroom do's and don'ts, and a resource list of…

  13. Pinky Extension as a Phonestheme in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Healy, Christina

    2011-01-01

    Mongolian Sign Language (MSL) is a visual-gestural language that developed from multiple languages interacting as a result of both geographic proximity and political relations and of the natural development of a communication system by deaf community members. Similar to the phonological systems of other signed languages, MSL combines handshapes,…

  14. Gesture, sign, and language: The coming of age of sign language and gesture studies.

    PubMed

    Goldin-Meadow, Susan; Brentari, Diane

    2017-01-01

    How does sign language compare with gesture, on the one hand, and spoken language on the other? Sign was once viewed as nothing more than a system of pictorial gestures without linguistic structure. More recently, researchers have argued that sign is no different from spoken language, with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the past 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We conclude that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because at present it is difficult to tell where sign stops and gesture begins, we suggest that sign should not be compared with speech alone but should be compared with speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that distinguishing between sign (or speech) and gesture is essential to predict certain types of learning and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture.

  15. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.

  16. Exploring the Further Integration of Machine Translation in English-Chinese Cross Language Information Access

    ERIC Educational Resources Information Center

    Wu, Dan; He, Daqing

    2012-01-01

    Purpose: This paper seeks to examine the further integration of machine translation technologies with cross language information access in providing web users the capabilities of accessing information beyond language barriers. Machine translation and cross language information access are related technologies, and yet they have their own unique…

  17. Sociolinguistic Typology and Sign Languages

    PubMed Central

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological ‘complexity’ and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological ‘complexification’), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored. PMID:29515506

  18. On the System of Person-Denoting Signs in Estonian Sign Language: Estonian Name Signs

    ERIC Educational Resources Information Center

    Paales, Liina

    2010-01-01

This article discusses Estonian personal name signs. According to the study, there are four personal name sign categories in Estonian Sign Language: (1) arbitrary name signs; (2) descriptive name signs; (3) initialized-descriptive name signs; (4) loan/borrowed name signs. Descriptive and borrowed personal name signs are the most widely represented among…

  19. Approaching sign language test construction: adaptation of the German sign language receptive skills test.

    PubMed

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired in preschool- and school-aged children (4-8 years old) is urgently needed. Using the British Sign Language Receptive Skills Test, that has been standardized and has sound psychometric properties, as a template for adaptation thus provides a starting point for tests of a sign language that is less documented, such as DGS. This article makes a novel contribution to the field by examining linguistic, cultural, and methodological issues in the process of adapting a test from the source language to the target language. The adapted DGS test has sound psychometric properties and provides the basis for revision prior to standardization. © The Author 2011. Published by Oxford University Press. All rights reserved.

  20. Semantic Fluency in Deaf Children Who Use Spoken and Signed Language in Comparison with Hearing Peers

    ERIC Educational Resources Information Center

    Marshall, C. R.; Jones, A.; Fastelli, A.; Atkinson, J.; Botting, N.; Morgan, G.

    2018-01-01

    Background: Deafness has an adverse impact on children's ability to acquire spoken languages. Signed languages offer a more accessible input for deaf children, but because the vast majority are born to hearing parents who do not sign, their early exposure to sign language is limited. Deaf children as a whole are therefore at high risk of language…

  1. Methodological and Theoretical Issues in the Adaptation of Sign Language Tests: An Example from the Adaptation of a Test to German Sign Language

    ERIC Educational Resources Information Center

    Haug, Tobias

    2012-01-01

    Despite the current need for reliable and valid test instruments in different countries in order to monitor the sign language acquisition of deaf children, very few tests are commercially available that offer strong evidence for their psychometric properties. This mirrors the current state of affairs for many sign languages, where very little…

  2. Examination of Sign Language Education According to the Opinions of Members from a Basic Sign Language Certification Program

    ERIC Educational Resources Information Center

    Akmese, Pelin Pistav

    2016-01-01

    Being hearing impaired limits one's ability to communicate in that it affects all areas of development, particularly speech. One of the methods the hearing impaired use to communicate is sign language. This study, a descriptive study, intends to examine the opinions of individuals who had enrolled in a sign language certification program by using…

  3. Who Is Qualified to Teach American Sign Language?

    ERIC Educational Resources Information Center

    Kanda, Jan; Fleischer, Larry

    1988-01-01

    Teachers of American Sign Language (ASL) can no longer qualify just by being able to sign well or by being deaf. ASL teachers must respect the language and its history, feel comfortable interacting with the deaf community, have completed formal study of language and pedagogy, be familiar with second-language teaching, and engage in personal and…

  4. Simultaneous Communication Supports Learning in Noise by Cochlear Implant Users

    PubMed Central

    Blom, Helen C.; Marschark, Marc; Machmer, Elizabeth

    2017-01-01

    Objectives This study sought to evaluate the potential of using spoken language and signing together (simultaneous communication, SimCom, sign-supported speech) as a means of improving speech recognition, comprehension, and learning by cochlear implant users in noisy contexts. Methods Forty-eight college students who were active cochlear implant users watched videos of three short presentations, the text versions of which were standardized at the 8th-grade reading level. One passage was presented in spoken language only, one was presented in spoken language with multi-talker babble background noise, and one was presented via simultaneous communication with the same background noise. Following each passage, participants responded to 10 (standardized) open-ended questions designed to assess comprehension. Indicators of participants’ spoken language and sign language skills were obtained via self-reports and objective assessments. Results When spoken materials were accompanied by signs, scores were significantly higher than when materials were spoken in noise without signs. Participants’ receptive spoken language skills significantly predicted scores in all three conditions; neither their receptive sign skills nor age of implantation predicted performance. Discussion Students who are cochlear implant users typically rely solely on spoken language in the classroom. The present results, however, suggest that there are potential benefits of simultaneous communication for such learners in noisy settings. For those cochlear implant users who know sign language, the redundancy of speech and signs potentially can offset the reduced fidelity of spoken language in noise. Conclusion Accompanying spoken language with signs can benefit learners who are cochlear implant users in noisy situations such as classroom settings. Factors associated with such benefits, such as receptive skills in signed and spoken modalities, classroom acoustics, and material difficulty need to be empirically examined. PMID:28010675

  5. Simultaneous communication supports learning in noise by cochlear implant users.

    PubMed

    Blom, Helen; Marschark, Marc; Machmer, Elizabeth

    2017-01-01

    This study sought to evaluate the potential of using spoken language and signing together (simultaneous communication, SimCom, sign-supported speech) as a means of improving speech recognition, comprehension, and learning by cochlear implant (CI) users in noisy contexts. Forty-eight college students who were active CI users watched videos of three short presentations, the text versions of which were standardized at the 8th-grade reading level. One passage was presented in spoken language only, one was presented in spoken language with multi-talker babble background noise, and one was presented via simultaneous communication with the same background noise. Following each passage, participants responded to 10 (standardized) open-ended questions designed to assess comprehension. Indicators of participants' spoken language and sign language skills were obtained via self-reports and objective assessments. When spoken materials were accompanied by signs, scores were significantly higher than when materials were spoken in noise without signs. Participants' receptive spoken language skills significantly predicted scores in all three conditions; neither their receptive sign skills nor age of implantation predicted performance. Students who are CI users typically rely solely on spoken language in the classroom. The present results, however, suggest that there are potential benefits of simultaneous communication for such learners in noisy settings. For those CI users who know sign language, the redundancy of speech and signs potentially can offset the reduced fidelity of spoken language in noise. Accompanying spoken language with signs can benefit learners who are CI users in noisy situations such as classroom settings. Factors associated with such benefits, such as receptive skills in signed and spoken modalities, classroom acoustics, and material difficulty need to be empirically examined.

  6. Constraints on Negative Prefixation in Polish Sign Language.

    PubMed

    Tomaszewski, Piotr

    2015-01-01

    The aim of this article is to describe a negative prefix, NEG-, in Polish Sign Language (PJM) which appears to be indigenous to the language. This is of interest given the relative rarity of prefixes in sign languages. Prefixed PJM signs were analyzed on the basis of both a corpus of texts signed by 15 deaf PJM users who are either native or near-native signers, and material including a specified range of prefixed signs as demonstrated by native signers in dictionary form (i.e. signs produced in isolation, not as part of phrases or sentences). In order to define the morphological rules behind prefixation on both the phonological and morphological levels, native PJM users were consulted for their expertise. The research results can enrich models for describing processes of grammaticalization in the context of the visual-gestural modality that forms the basis for sign language structure.

  7. Sociolinguistic Variation and Change in British Sign Language Number Signs: Evidence of Leveling?

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Fenlon, Jordan; Rentelis, Ramas

    2015-01-01

    This article presents findings from the first major study to investigate lexical variation and change in British Sign Language (BSL) number signs. As part of the BSL Corpus Project, number sign variants were elicited from 249 deaf signers from eight sites throughout the UK. Age, school location, and language background were found to be significant…

  8. The role of syllables in sign language production

    PubMed Central

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Sign Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production. PMID:25431562

  9. Gesture, sign and language: The coming of age of sign language and gesture studies

    PubMed Central

    Goldin-Meadow, Susan; Brentari, Diane

    2016-01-01

    How does sign language compare to gesture, on the one hand, and to spoken language on the other? At one time, sign was viewed as nothing more than a system of pictorial gestures with no linguistic structure. More recently, researchers have argued that sign is no different from spoken language with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the last 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We come to the conclusion that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because, at the moment, it is difficult to tell where sign stops and where gesture begins, we suggest that sign should not be compared to speech alone, but should be compared to speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that making a distinction between sign (or speech) and gesture is essential to predict certain types of learning, and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture. PMID:26434499

  10. The Phonetics of Head and Body Movement in the Realization of American Sign Language Signs.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2016-01-01

    Because the primary articulators for sign languages are the hands, sign phonology and phonetics have focused mainly on them and treated other articulators as passive targets. However, there is abundant research on the role of nonmanual articulators in sign language grammar and prosody. The current study examines how hand and head/body movements are coordinated to realize phonetic targets. Kinematic data were collected from 5 deaf American Sign Language (ASL) signers to allow the analysis of movements of the hands, head and body during signing. In particular, we examine how the chin, forehead and torso move during the production of ASL signs at those three phonological locations. Our findings suggest that for signs with a lexical movement toward the head, the forehead and chin move to facilitate convergence with the hand. By comparison, the torso does not move to facilitate convergence with the hand for signs located at the torso. These results imply that the nonmanual articulators serve a phonetic as well as a grammatical or prosodic role in sign languages. Future models of sign phonetics and phonology should take into consideration the movements of the nonmanual articulators in the realization of signs. © 2016 S. Karger AG, Basel.

  11. The Use of Sign Language Pronouns by Native-Signing Children with Autism.

    PubMed

    Shield, Aaron; Meier, Richard P; Tager-Flusberg, Helen

    2015-07-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are indexical points to self and other. Despite this transparency, we find evidence from an elicitation task and parental report that signing children with ASD avoid sign pronouns in favor of names. An analysis of spontaneous usage showed that all children demonstrated the ability to point, but only children with better-developed sign language produced pronouns. Differences in language abilities and self-representation may explain these phenomena in sign and speech.

  12. A human mirror neuron system for language: Perspectives from signed languages of the deaf.

    PubMed

    Knapp, Heather Patterson; Corina, David P

    2010-01-01

    Language is proposed to have developed atop the human analog of the macaque mirror neuron system for action perception and production [Arbib M.A. 2005. From monkey-like action recognition to human language: An evolutionary framework for neurolinguistics (with commentaries and author's response). Behavioral and Brain Sciences, 28, 105-167; Arbib M.A. (2008). From grasp to language: Embodied concepts and the challenge of abstraction. Journal de Physiologie Paris 102, 4-20]. Signed languages of the deaf are fully-expressive, natural human languages that are perceived visually and produced manually. We suggest that if a unitary mirror neuron system mediates the observation and production of both language and non-linguistic action, three predictions can be made: (1) damage to the human mirror neuron system should non-selectively disrupt both sign language and non-linguistic action processing; (2) within the domain of sign language, a given mirror neuron locus should mediate both perception and production; and (3) the action-based tuning curves of individual mirror neurons should support the highly circumscribed set of motions that form the "vocabulary of action" for signed languages. In this review we evaluate data from the sign language and mirror neuron literatures and find that these predictions are only partially upheld. 2009 Elsevier Inc. All rights reserved.

  13. Where "Sign Language Studies" Has Led Us in Forty Years: Opening High School and University Education for Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation

    ERIC Educational Resources Information Center

    Woodward, James; Hoa, Nguyen Thi

    2012-01-01

    This paper discusses how the Nippon Foundation-funded project "Opening University Education to Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation," also known as the Dong Nai Deaf Education Project, has been implemented through sign language studies from 2000 through 2012. This project has provided deaf…

  14. Standardization of Sign Languages

    ERIC Educational Resources Information Center

    Adam, Robert

    2015-01-01

    Over the years attempts have been made to standardize sign languages. This form of language planning has been tackled by a variety of agents, most notably teachers of Deaf students, social workers, government agencies, and occasionally groups of Deaf people themselves. Their efforts have most often involved the development of sign language books…

  15. Validity of the American Sign Language Discrimination Test

    ERIC Educational Resources Information Center

    Bochner, Joseph H.; Samar, Vincent J.; Hauser, Peter C.; Garrison, Wayne M.; Searls, J. Matt; Sanders, Cynthia A.

    2016-01-01

    American Sign Language (ASL) is one of the most commonly taught languages in North America. Yet, few assessment instruments for ASL proficiency have been developed, none of which have adequately demonstrated validity. We propose that the American Sign Language Discrimination Test (ASL-DT), a recently developed measure of learners' ability to…

  16. Writing Signed Languages: What for? What Form?

    ERIC Educational Resources Information Center

    Grushkin, Donald A.

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed…

  17. Audience Effects in American Sign Language Interpretation

    ERIC Educational Resources Information Center

    Weisenberg, Julia

    2009-01-01

    There is a system of English mouthing during interpretation that appears to be the result of language contact between spoken language and signed language. English mouthing is a voiceless visual representation of words on a signer's lips produced concurrently with manual signs. It is a type of borrowing prevalent among English-dominant…

  18. A Field Guide for Sign Language Research.

    ERIC Educational Resources Information Center

    Stokoe, William; Kuschel, Rolf

    Field researchers of sign language are the target of this methodological guide. The prospective researcher is briefed on the rationale of sign language study as language study and as distinct from the study of kinesics. Subjects covered include problems of translating, use of interpreters, and ethics. Instruments for obtaining social and language…

  19. Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture.

    PubMed

    Newman, Aaron J; Supalla, Ted; Fernandez, Nina; Newport, Elissa L; Bavelier, Daphne

    2015-09-15

    Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system-gesture-further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages-supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network-demonstrating an influence of experience on the perception of nonlinguistic stimuli.

  20. Legal Pathways to the Recognition of Sign Languages: A Comparison of the Catalan and Spanish Sign Language Acts

    ERIC Educational Resources Information Center

    Quer, Josep

    2012-01-01

    Despite being minority languages like many others, sign languages have traditionally remained absent from the agendas of policy makers and language planning and policies. In the past two decades, though, this situation has started to change at different paces and to different degrees in several countries. In this article, the author describes the…

  1. Numeral-Incorporating Roots in Numeral Systems: A Comparative Analysis of Two Sign Languages

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Massone, Maria Ignacia; Fernandez-Viader, Maria del Pilar; Makotrinsky, Alejandro; Pulgarin, Francisca

    2010-01-01

    Numeral-incorporating roots in the numeral systems of Argentine Sign Language (LSA) and Catalan Sign Language (LSC), as well as the main features of the number systems of both languages, are described and compared. Informants discussed the use of numerals and roots in both languages (in most cases in natural contexts). Ten informants took part in…

  2. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2017-01-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however, there have been relatively few studies that generalize these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel…

  3. Lexical access in sign language: a computational model.

    PubMed

    Caselli, Naomi K; Cohen-Goldberg, Ariel M

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.
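
The spreading activation account described above can be sketched as a toy interactive-activation network. This is an illustrative sketch only; the node names, weights, and update rule are assumptions, not the architecture of Caselli and Cohen-Goldberg (2014) or Chen and Mirman (2012). Sub-lexical units (a handshape and a location) excite the lexical signs that contain them, lexical signs inhibit one another, and the sign matching more of the input features wins the competition:

```python
class Network:
    """A minimal spreading-activation network with lateral inhibition."""

    def __init__(self):
        self.act = {}    # node name -> current activation
        self.edges = {}  # node name -> list of (neighbor, weight)

    def add_node(self, name):
        self.act.setdefault(name, 0.0)
        self.edges.setdefault(name, [])

    def connect(self, a, b, w):
        # Symmetric connection; a negative weight models lateral inhibition.
        self.edges[a].append((b, w))
        self.edges[b].append((a, w))

    def step(self, decay=0.2):
        # Synchronous update: each node decays and sums weighted input
        # from its neighbors (only positive activation propagates).
        new = {}
        for node, a in self.act.items():
            net = sum(w * max(self.act[nb], 0.0) for nb, w in self.edges[node])
            new[node] = (1 - decay) * a + net
        self.act = new


net = Network()
for n in ["handshape:B", "location:chin", "sign:MOTHER", "sign:FATHER"]:
    net.add_node(n)

# Sub-lexical units excite the lexical signs that contain them
# (sign names and features here are hypothetical examples).
net.connect("handshape:B", "sign:MOTHER", 0.5)
net.connect("location:chin", "sign:MOTHER", 0.5)
net.connect("handshape:B", "sign:FATHER", 0.5)  # neighbor sharing handshape only
# Lexical signs compete via mutual inhibition.
net.connect("sign:MOTHER", "sign:FATHER", -0.3)

# Present an input that matches MOTHER on both features, FATHER on one.
net.act["handshape:B"] = 1.0
net.act["location:chin"] = 1.0
for _ in range(5):
    net.step()

# The sign matching both features out-activates its handshape neighbor.
assert net.act["sign:MOTHER"] > net.act["sign:FATHER"] > 0.0
```

Elaborations of the kind the paper describes would enter here as timing (e.g. activating the location unit before the handshape unit) or as weight differences reflecting sub-lexical unit frequency.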

  4. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed

    Hofer, Theresia

    2017-05-24

    This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork in 2007 and 2014 with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model for assessing signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between "severely" and "definitely" endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed, and certain additions, not considered within the existing assessment model, are suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use "pure Tibetan language".

  5. American Sign Language Syntax and Analogical Reasoning Skills Are Influenced by Early Acquisition and Age of Entry to Signing Schools for the Deaf

    PubMed Central

    Henner, Jon; Caldwell-Harris, Catherine L.; Novogrodsky, Rama; Hoffmeister, Robert

    2016-01-01

    Failing to acquire language in early childhood because of language deprivation is a rare and exceptional event, except in one population. Deaf children who grow up without access to indirect language through listening, speech-reading, or sign language experience language deprivation. Studies of Deaf adults have revealed that late acquisition of sign language is associated with lasting deficits. However, much remains unknown about language deprivation in Deaf children, allowing myths and misunderstandings regarding sign language to flourish. To fill this gap, we examined signing ability in a large naturalistic sample of Deaf children attending schools for the Deaf where American Sign Language (ASL) is used by peers and teachers. Ability in ASL was measured using a syntactic judgment test and language-based analogical reasoning test, which are two sub-tests of the ASL Assessment Inventory. The influence of two age-related variables were examined: whether or not ASL was acquired from birth in the home from one or more Deaf parents, and the age of entry to the school for the Deaf. Note that for non-native signers, this latter variable is often the age of first systematic exposure to ASL. Both of these types of age-dependent language experiences influenced subsequent signing ability. Scores on the two tasks declined with increasing age of school entry. The influence of age of starting school was not linear. Test scores were generally lower for Deaf children who entered the school of assessment after the age of 12. The positive influence of signing from birth was found for students at all ages tested (7;6–18;5 years old) and for children of all age-of-entry groupings. Our results reflect a continuum of outcomes which show that experience with language is a continuous variable that is sensitive to maturational age. PMID:28082932

  7. With or without Semantic Mediation: Retrieval of Lexical Representations in Sign Production

    ERIC Educational Resources Information Center

    Navarrete, Eduardo; Caccaro, Arianna; Pavani, Francesco; Mahon, Bradford Z.; Peressotti, Francesca

    2015-01-01

    How are lexical representations retrieved during sign production? Similar to spoken languages, lexical representation in sign language must be accessed through semantics when naming pictures. However, it remains an open issue whether lexical representations in sign language can be accessed via routes that bypass semantics when retrieval is…

  8. The Mechanics of Fingerspelling: Analyzing Ethiopian Sign Language

    ERIC Educational Resources Information Center

    Duarte, Kyle

    2010-01-01

    Ethiopian Sign Language utilizes a fingerspelling system that represents Amharic orthography. Just as each character of the Amharic abugida encodes a consonant-vowel sound pair, each sign in the Ethiopian Sign Language fingerspelling system uses handshape to encode a base consonant, as well as a combination of timing, placement, and orientation to…

  9. Kinship in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Geer, Leah

    2011-01-01

    Information and research on Mongolian Sign Language is scant. To date, only one dictionary is available in the United States (Badnaa and Boll 1995), and even that dictionary presents only a subset of the signs employed in Mongolia. The present study describes the kinship system used in Mongolian Sign Language (MSL) based on data elicited from…

  10. Memory for Nonsemantic Attributes of American Sign Language Signs and English Words

    ERIC Educational Resources Information Center

    Siple, Patricia

    1977-01-01

    Two recognition memory experiments were used to study the retention of language and modality of input. A bilingual list of American Sign Language signs and English words was presented to two deaf groups and two hearing groups, one of each instructed to remember the mode of input. Findings are analyzed. (CHK)

  11. Identifying Overlapping Language Communities: The Case of Chiriquí and Panamanian Signed Languages

    ERIC Educational Resources Information Center

    Parks, Elizabeth S.

    2016-01-01

    In this paper, I use a holographic metaphor to explain the identification of overlapping sign language communities in Panama. By visualizing Panama's complex signing communities as emitting community "hotspots" through social drama on multiple stages, I employ ethnographic methods to explore overlapping contours of Panama's sign language…

  12. Awareness of Deaf Sign Language and Gang Signs.

    ERIC Educational Resources Information Center

    Smith, Cynthia; Morgan, Robert L.

    There have been increasing incidents of innocent people who use American Sign Language (ASL) or another form of sign language being victimized by gang violence due to misinterpretation of ASL hand formations. ASL is familiar to learners with a variety of disabilities, particularly those in the deaf community. The problem is that gang members have…

  13. Signed language working memory capacity of signed language interpreters and deaf signers.

    PubMed

    Wang, Jihong; Napier, Jemina

    2013-04-01

    This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian sign language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an Auslan working memory (WM) span task. The results revealed that the hearing signers (i.e., the professional interpreters) significantly outperformed the deaf signers on the Auslan WM span task. However, the results showed no significant differences between the native signers and the nonnative signers in their Auslan working memory capacity. Furthermore, there was no significant interaction between hearing status and age of signed language acquisition. Additionally, the study found no significant differences between the deaf native signers (adults) and the deaf nonnative signers (adults) in their Auslan working memory capacity. The findings are discussed in relation to the participants' memory strategies and their early language experience. The findings present challenges for WM theories.

  14. Mapping language to the world: the role of iconicity in the sign language input.

    PubMed

    Perniss, Pamela; Lu, Jenny C; Morgan, Gary; Vigliocco, Gabriella

    2018-03-01

    Most research on the mechanisms underlying referential mapping has assumed that learning occurs in ostensive contexts, where label and referent co-occur, and that form and meaning are linked by arbitrary convention alone. In the present study, we focus on iconicity in language, that is, resemblance relationships between form and meaning, and on non-ostensive contexts, where label and referent do not co-occur. We approach the question of language learning from the perspective of the language input. Specifically, we look at child-directed language (CDL) in British Sign Language (BSL), a language rich in iconicity due to the affordances of the visual modality. We ask whether child-directed signing exploits iconicity in the language by highlighting the similarity mapping between form and referent. We find that CDL modifications occur more often with iconic signs than with non-iconic signs. Crucially, for iconic signs, modifications are more frequent in non-ostensive contexts than in ostensive contexts. Furthermore, we find that pointing dominates in ostensive contexts, and suggest that caregivers adjust the semiotic resources recruited in CDL to context. These findings offer first evidence for a role of iconicity in the language input and suggest that iconicity may be involved in referential mapping and language learning, particularly in non-ostensive contexts. © 2017 John Wiley & Sons Ltd.

  15. The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning.

    PubMed

    Almeida, Diogo; Poeppel, David; Corina, David

    The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data demonstrate that the perceptual tuning that underlies the discrimination of language and non-language information is not limited to spoken languages but extends to languages expressed in the visual modality.

  16. Comic Books: A Learning Tool for Meaningful Acquisition of Written Sign Language

    ERIC Educational Resources Information Center

    Guimarães, Cayley; Oliveira Machado, Milton César; Fernandes, Sueli F.

    2018-01-01

    Deaf people use Sign Language (SL) for intellectual development, communications and other human activities that are mediated by language--such as the expression of complex and abstract thoughts and feelings; and for literature, culture and knowledge. The Brazilian Sign Language (Libras) is a complete linguistic system of visual-spatial manner,…

  17. Standardizing Chinese Sign Language for Use in Post-Secondary Education

    ERIC Educational Resources Information Center

    Lin, Christina Mien-Chun; Gerner de Garcia, Barbara; Chen-Pichler, Deborah

    2009-01-01

    There are over 100 languages in China, including Chinese Sign Language. Given the large population and geographical dispersion of the country's deaf community, sign variation is to be expected. Language barriers due to lexical variation may exist for deaf college students in China, who often live outside their home regions. In presenting an…

  18. Equity in Education: Signed Language and the Courts

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2009-01-01

    This article examines several legal cases in Canada, the USA, and Australia involving signed language in education for Deaf students. In all three contexts, signed language rights for Deaf students have been viewed from within a disability legislation framework that either does not extend to recognizing language rights in education or that…

  19. Language Facility and Theory of Mind Development in Deaf Children.

    ERIC Educational Resources Information Center

    Jackson, A. Lyn

    2001-01-01

    Deaf children with signing parents, nonnative signing deaf children, children from a hearing impaired unit, oral deaf children, and hearing controls were tested on theory of Mind (ToM) tasks and a British sign language receptive language test. Language ability correlated positively and significantly with ToM ability. Age underpinned the…

  20. Arabic Sign Language: A Perspective

    ERIC Educational Resources Information Center

    Abdel-Fattah, M. A.

    2005-01-01

    Sign language in the Arab World has been recently recognized and documented. Many efforts have been made to establish the sign language used in individual countries, including Jordan, Egypt, Libya, and the Gulf States, by trying to standardize the language and spread it among members of the Deaf community and those concerned. Such efforts produced…

  1. Regional Sign Language Varieties in Contact: Investigating Patterns of Accommodation

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Evans, Bronwen G.; Cormier, Kearsy

    2016-01-01

    Short-term linguistic accommodation has been observed in a number of spoken language studies. The first of its kind in sign language research, this study aims to investigate the effects of regional varieties in contact and lexical accommodation in British Sign Language (BSL). Twenty-five participants were recruited from Belfast, Glasgow,…

  2. The link between form and meaning in American Sign Language: lexical processing effects.

    PubMed

    Thompson, Robin L; Vinson, David P; Vigliocco, Gabriella

    2009-03-01

    Signed languages exploit iconicity (the transparent relationship between meaning and form) to a greater extent than spoken languages, where it is largely limited to onomatopoeia. In a picture-sign matching experiment measuring reaction times, the authors examined the potential advantage of iconicity both for 1st- and 2nd-language learners of American Sign Language (ASL). The results show that native ASL signers are faster to respond when a specific property iconically represented in a sign is made salient in the corresponding picture, thus providing evidence that a closer mapping between meaning and form can aid in lexical retrieval. While late 2nd-language learners appear to use iconicity as an aid to learning sign (R. Campbell, P. Martin, & T. White, 1992), they did not show the same facilitation effect as native ASL signers, suggesting that the task tapped into more automatic language processes. Overall, the findings suggest that completely arbitrary mappings between meaning and form may not be more advantageous in language and that, rather, arbitrariness may simply be an accident of modality. (c) 2009 APA, all rights reserved

  3. The sign language skills classroom observation: a process for describing sign language proficiency in classroom settings.

    PubMed

    Reeves, J B; Newell, W; Holcomb, B R; Stinson, M

    2000-10-01

    In collaboration with teachers and students at the National Technical Institute for the Deaf (NTID), the Sign Language Skills Classroom Observation (SLSCO) was designed to provide feedback to teachers on their sign language communication skills in the classroom. In the present article, the impetus and rationale for development of the SLSCO is discussed. Previous studies related to classroom signing and observation methodology are reviewed. The procedure for developing the SLSCO is then described. This procedure included (a) interviews with faculty and students at NTID, (b) identification of linguistic features of sign language important for conveying content to deaf students, (c) development of forms for recording observations of classroom signing, (d) analysis of use of the forms, (e) development of a protocol for conducting the SLSCO, and (f) piloting of the SLSCO in classrooms. The results of use of the SLSCO with NTID faculty during a trial year are summarized.

  4. A Kinect-Based Sign Language Hand Gesture Recognition System for Hearing- and Speech-Impaired: A Pilot Study of Pakistani Sign Language.

    PubMed

    Halim, Zahid; Abbas, Ghulam

    2015-01-01

    Sign language provides hearing- and speech-impaired individuals with an interface to communicate with other members of society. Unfortunately, sign language is not understood by most people. For this reason, a gadget based on image processing and pattern recognition can provide a vital aid for detecting and translating sign language into a vocal language. This work presents a system for detecting and understanding sign language gestures with a custom-built software tool and then translating the gestures into a vocal language. To recognize a particular gesture, the system employs a Dynamic Time Warping (DTW) algorithm, and an off-the-shelf software tool is employed for vocal language generation. A Microsoft® Kinect is the primary tool used to capture the video stream of a user. The proposed method successfully detects gestures stored in the dictionary with an accuracy of 91%, and the system allows users to define and add custom gestures. Based on an experiment in which 10 individuals with impairments used the system to communicate with 5 people with no disability, 87% agreed that the system was useful.
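
DTW itself is a standard dynamic-programming algorithm. The following minimal sketch is illustrative only (the paper's actual Kinect skeleton features and distance function are not reproduced here); it compares two 1-D gesture trajectories and tolerates differences in signing speed:

```python
def dtw_distance(a, b):
    """Classic DTW over two sequences of scalar samples.

    Fills a cost matrix D where D[i][j] is the cheapest alignment of
    a[:i] with b[:j]; each cell extends the best of the three
    predecessor alignments (match, insertion, deletion).
    """
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # skip a sample of a
                                 D[i][j - 1],      # skip a sample of b
                                 D[i - 1][j - 1])  # align the two samples
    return D[n][m]


# A gesture template matches a slower, time-stretched performance of the
# same trajectory better than a different trajectory of similar length.
template = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 1, 2, 2, 3, 3, 2, 2, 1, 1, 0, 0]
other = [3, 2, 1, 0, 1, 2, 3]
assert dtw_distance(template, stretched) < dtw_distance(template, other)
```

Recognition against a gesture dictionary then amounts to computing `dtw_distance` between an incoming trajectory and each stored template and picking the template with the smallest distance.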

  5. Brain correlates of constituent structure in sign language comprehension.

    PubMed

    Moreno, Antonio; Limousin, Fanny; Dehaene, Stanislas; Pallier, Christophe

    2018-02-15

    During sentence processing, areas of the left superior temporal sulcus, inferior frontal gyrus and left basal ganglia exhibit a systematic increase in brain activity as a function of constituent size, suggesting their involvement in the computation of syntactic and semantic structures. Here, we asked whether these areas play a universal role in language and therefore contribute to the processing of non-spoken sign language. Congenitally deaf adults who acquired French sign language as a first language and written French as a second language were scanned while watching sequences of signs in which the size of syntactic constituents was manipulated. An effect of constituent size was found in the basal ganglia, including the head of the caudate and the putamen. A smaller effect was also detected in temporal and frontal regions previously shown to be sensitive to constituent size in written language in hearing French subjects (Pallier et al., 2011). When the deaf participants read sentences versus word lists, the same network of language areas was observed. While reading and sign language processing yielded identical effects of linguistic structure in the basal ganglia, the effect of structure was stronger in all cortical language areas for written language relative to sign language. Furthermore, cortical activity was partially modulated by age of acquisition and reading proficiency. Our results stress the important role of the basal ganglia, within the language network, in the representation of the constituent structure of language, regardless of the input modality. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Words in the bilingual brain: an fNIRS brain imaging investigation of lexical processing in sign-speech bimodal bilinguals

    PubMed Central

    Kovelman, Ioulia; Shalinsky, Mark H.; Berens, Melody S.; Petitto, Laura-Ann

    2014-01-01

    Early bilingual exposure, especially exposure to two languages in different modalities such as speech and sign, can profoundly affect an individual's language, culture, and cognition. Here we explore the hypothesis that bimodal dual language exposure can also affect the brain's organization for language. These changes occur across brain regions universally important for language and parietal regions especially critical for sign language (Newman et al., 2002). We investigated three groups of participants (N = 29) that completed a word repetition task in American Sign Language (ASL) during fNIRS brain imaging. Those groups were (1) hearing ASL-English bimodal bilinguals (n = 5), (2) deaf ASL signers (n = 7), and (3) English monolinguals naïve to sign language (n = 17). The key finding of the present study is that bimodal bilinguals showed reduced activation in left parietal regions relative to deaf ASL signers when asked to use only ASL. In contrast, this group of bimodal signers showed greater activation in left temporo-parietal regions relative to English monolinguals when asked to switch between their two languages (Kovelman et al., 2009). Converging evidence now suggests that bimodal bilingual experience changes the brain bases of language, including the left temporo-parietal regions known to be critical for sign language processing (Emmorey et al., 2007). The results provide insight into the resilience and constraints of neural plasticity for language and bilingualism. PMID:25191247

  7. Atypical Speech and Language Development: A Consensus Study on Clinical Signs in the Netherlands

    ERIC Educational Resources Information Center

    Visser-Bochane, Margot I.; Gerrits, Ellen; van der Schans, Cees P.; Reijneveld, Sijmen A.; Luinge, Margreet R.

    2017-01-01

    Background: Atypical speech and language development is one of the most common developmental difficulties in young children. However, which clinical signs characterize atypical speech-language development at what age is not clear. Aim: To achieve a national and valid consensus on clinical signs and red flags (i.e. most urgent clinical signs) for…

  8. Generation of Signs within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.; Figueroa, Daileen M.

    2017-01-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks,…

  9. Why Doesn't Everyone Here Speak Sign Language? Questions of Language Policy, Ideology and Economics

    ERIC Educational Resources Information Center

    Rayman, Jennifer

    2009-01-01

    This paper is a thought experiment exploring the possibility of establishing universal bilingualism in Sign Languages. Focusing in the first part on historical examples of inclusive signing societies such as Martha's Vineyard, the author suggests that it is not possible to create such naturally occurring practices of Sign Bilingualism in societies…

  10. New Perspectives on the History of American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily; Delaporte, Yves

    2011-01-01

    Examinations of the etymology of American Sign Language have typically involved superficial analyses of signs as they exist over a short period of time. While it is widely known that ASL is related to French Sign Language, there has yet to be a comprehensive study of this historic relationship between their lexicons. This article presents…

  11. Signs as Pictures and Signs as Words: Effect of Language Knowledge on Memory for New Vocabulary.

    ERIC Educational Resources Information Center

    Siple, Patricia; And Others

    1982-01-01

    The role of sensory attributes in a vocabulary learning task was investigated for a non-oral language using deaf and hearing individuals, more or less skilled in the use of sign language. Skilled signers encoded invented signs in terms of linguistic structure rather than as visual-pictorial events. (Author/RD)

  12. The impact of input quality on early sign development in native and non-native language learners.

    PubMed

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-05-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the impact of quality of input on early sign acquisition. The current study explores the outcomes of differential input in two groups of children aged two to five years: deaf children of hearing parents (DCHP) and deaf children of deaf parents (DCDP). Analysis of child sign language revealed DCDP had a more developed vocabulary and more phonological handshape types compared with DCHP. In naturalistic conversations deaf parents used more sign tokens and more phonological types than hearing parents. Results are discussed in terms of the effects of early input on subsequent language abilities.

  13. American Sign Language

    MedlinePlus

    ... Langue des Signes Française). Today's ASL includes some elements of LSF plus the original local sign languages, which over the years ... evolves. It can also be used to model the essential elements and organization of natural language. Another NIDCD-funded research team is ...

  14. Lexical prediction via forward models: N400 evidence from German Sign Language.

    PubMed

    Hosemann, Jana; Herrmann, Annika; Steinbach, Markus; Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-09-01

    Models of language processing in the human brain often emphasize the prediction of upcoming input-for example in order to explain the rapidity of language understanding. However, the precise mechanisms of prediction are still poorly understood. Forward models, which draw upon the language production system to set up expectations during comprehension, provide a promising approach in this regard. Here, we present an event-related potential (ERP) study on German Sign Language (DGS) which tested the hypotheses of a forward model perspective on prediction. Sign languages involve relatively long transition phases between one sign and the next, which should be anticipated as part of a forward model-based prediction even though they are semantically empty. Native speakers of DGS watched videos of naturally signed DGS sentences which either ended with an expected or a (semantically) unexpected sign. Unexpected signs engendered a biphasic N400-late positivity pattern. Crucially, N400 onset preceded critical sign onset and was thus clearly elicited by properties of the transition phase. The comprehension system thereby clearly anticipated modality-specific information about the realization of the predicted semantic item. These results provide strong converging support for the application of forward models in language comprehension. © 2013 Elsevier Ltd. All rights reserved.

  15. Can Experience with Co-Speech Gesture Influence the Prosody of a Sign Language? Sign Language Prosodic Cues in Bimodal Bilinguals

    ERIC Educational Resources Information Center

    Brentari, Diane; Nadolske, Marie A.; Wolford, George

    2012-01-01

    In this paper the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience…

  16. Linguistic Policies, Linguistic Planning, and Brazilian Sign Language in Brazil

    ERIC Educational Resources Information Center

    de Quadros, Ronice Muller

    2012-01-01

    This article explains the consolidation of Brazilian Sign Language in Brazil through a linguistic plan that arose from the Brazilian Sign Language Federal Law 10.436 of April 2002 and the subsequent Federal Decree 5695 of December 2005. Two concrete facts that emerged from this existing language plan are discussed: the implementation of bilingual…

  17. Pointing and Reference in Sign Language and Spoken Language: Anchoring vs. Identifying

    ERIC Educational Resources Information Center

    Barberà, Gemma; Zwets, Martine

    2013-01-01

    In both signed and spoken languages, pointing serves to direct an addressee's attention to a particular entity. This entity may be either present or absent in the physical context of the conversation. In this article we focus on pointing directed to nonspeaker/nonaddressee referents in Sign Language of the Netherlands (Nederlandse Gebarentaal,…

  18. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language

    PubMed Central

    Ferjan Ramirez, Naja; Leonard, Matthew K.; Davenport, Tristan S.; Torres, Christina; Halgren, Eric; Mayberry, Rachel I.

    2016-01-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772–2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. PMID:25410427

  19. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed Central

    Hofer, Theresia

    2017-01-01

    This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork, in 2007 and 2014, with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model to assess signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between “severely” and “definitely” endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed and certain additions, not considered within the existing assessment model, suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use “pure Tibetan language”. PMID:29033477

  20. Indonesian Sign Language Number Recognition using SIFT Algorithm

    NASA Astrophysics Data System (ADS)

    Mahfudi, Isa; Sarosa, Moechammad; Andrie Asmara, Rosa; Azrino Gustalika, M.

    2018-04-01

    Indonesian Sign Language (ISL) is generally used by deaf individuals as their primary means of communication and consists of two types of action: signs and finger spelling. However, not all people understand sign language, which makes it difficult for deaf people to communicate with hearing people and contributes to their isolation from social life. A solution is needed that can help them interact with hearing people. Much research offers a variety of image-processing methods for solving the problem of sign language recognition. The SIFT (Scale-Invariant Feature Transform) algorithm is one method that can be used to identify an object, and it is claimed to be very resistant to scaling, rotation, illumination and noise. Using the SIFT algorithm for Indonesian Sign Language number recognition yields a recognition rate of 82% on a dataset of 100 sample images, consisting of 50 samples for training and 50 samples for testing. Changing the threshold value affects the recognition result; the best threshold value is 0.45, with a recognition rate of 94%.
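
    The matching step that SIFT-based recognizers of this kind rely on can be sketched as a nearest-neighbour ratio test over descriptor vectors. The following is an illustrative reimplementation, not the authors' code: the toy 4-D arrays stand in for real 128-D SIFT descriptors, and the 0.45 ratio mirrors the best threshold reported above.

```python
import numpy as np

def ratio_test_matches(desc_a, desc_b, ratio=0.45):
    """Match descriptors from image A to image B with Lowe's ratio test.

    For each descriptor in A, find its two nearest neighbours in B and
    accept the match only if the closest distance is below `ratio` times
    the second-closest, rejecting ambiguous correspondences.
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches

# Toy 4-D "descriptors": row 0 of A clearly matches row 0 of B,
# while row 1 of A has two near-identical candidates in B and
# should be rejected by the ratio test.
A = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.5, 0.5, 0.0, 0.0]])
B = np.array([[1.0, 0.05, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.55, 0.45, 0.0, 0.0],
              [0.45, 0.55, 0.0, 0.0]])
print(ratio_test_matches(A, B))  # → [(0, 0)]
```

    Lowering the ratio threshold rejects ambiguous matches more aggressively, which is consistent with the abstract's observation that the threshold value affects the recognition rate.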

  1. Lexical access in sign language: a computational model

    PubMed Central

    Caselli, Naomi K.; Cohen-Goldberg, Ariel M.

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition. PMID:24860539
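
    The neighbourhood density effect described above can be illustrated with a minimal interactive-activation sketch (an assumption-laden toy, not Chen and Mirman's actual model): a target unit and its neighbours each receive bottom-up input, and all units inhibit one another laterally, so a denser neighbourhood slows recognition of the target.

```python
import numpy as np

def settle_time(n_neighbors, overlap=0.5, inhibition=0.1,
                rate=0.1, threshold=0.9, max_steps=500):
    """Steps until the target unit crosses threshold in a toy
    interactive-activation network. act[0] is the target, which
    receives full bottom-up input; each neighbour receives the
    weaker `overlap` input; every unit laterally inhibits the
    others in proportion to its own activation."""
    act = np.zeros(n_neighbors + 1)
    inp = np.full(n_neighbors + 1, overlap)
    inp[0] = 1.0
    for step in range(1, max_steps + 1):
        net = inp - inhibition * (act.sum() - act)  # lateral inhibition
        act = np.clip(act + rate * net * (1.0 - act), 0.0, 1.0)
        if act[0] >= threshold:
            return step
    return max_steps

# A denser neighbourhood means more competitors and more inhibition,
# so the target settles more slowly (an inhibitory density effect).
print(settle_time(2), settle_time(10))
```

    With other parameter settings (for example, feedback from neighbours to shared sub-lexical units such as handshape or location) the same architecture can yield facilitation instead of inhibition, which is the kind of unification of opposite density effects the paper pursues.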

  2. Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network.

    PubMed

    Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi

    2017-01-01

    Sign language is an essential medium for everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in those people should heavily rely on the verbal short-term memory (STM) via sign language. Most previous studies compared neural activations during signed language processing in deaf signers and those during spoken language processing in hearing speakers. For sign language users, it thus remains unclear how visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study investigated neural activation while bilinguals of spoken and signed language were engaged in a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence presented either as written letters or as fingerspelling (4-7 syllables in length) and then held the syllable sequence for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information regardless of the type of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus and inferior parietal lobule, for both letter and fingerspelling conditions. Interestingly, while most participants reported that they relied on phonological memory during maintenance, direct comparisons between letters and fingers revealed strikingly different patterns of neural activation during the same period. Namely, the effortful maintenance of fingerspelling inputs relative to letter inputs activated the left superior parietal lobule and dorsal premotor area, i.e., brain regions known to play a role in visuomotor analysis of hand/arm movements. These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the classical left-hemisphere language network.

  3. Independent transmission of sign language interpreter in DVB: assessment of image compression

    NASA Astrophysics Data System (ADS)

    Zatloukal, Petr; Bernas, Martin; Dvořák, Lukáš

    2015-02-01

    Sign language on television provides information to deaf viewers that they cannot get from the audio content. If the sign language interpreter is transmitted over an independent data stream, the aim is to ensure sufficient intelligibility and subjective image quality of the interpreter at minimum bit rate. The work deals with ROI-based video compression of a Czech Sign Language interpreter, implemented in the x264 open source library. The results of this approach are verified in subjective tests with deaf viewers, which examine the intelligibility of sign language expressions containing minimal pairs at different levels of compression and various resolutions of the image with the interpreter, and evaluate the subjective quality of the final image for a good viewing experience.
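
    The ROI idea can be sketched outside the codec: spend more bits (a finer quantisation step) on the region containing the interpreter and fewer on the background. This toy uniform quantiser is only illustrative; the paper's actual method operates on transform coefficients inside x264.

```python
import numpy as np

def quantize_frame(frame, roi, q_roi=4, q_bg=32):
    """Uniformly quantise a frame with a finer step inside the ROI.
    `roi` is (y0, y1, x0, x1), bounding the region with the signer."""
    y0, y1, x0, x1 = roi
    q = np.full(frame.shape, float(q_bg))
    q[y0:y1, x0:x1] = q_roi      # finer step -> more bits, less error
    return np.round(frame / q) * q

frame = np.arange(64, dtype=float).reshape(8, 8)
out = quantize_frame(frame, roi=(0, 4, 0, 4))

# Mean reconstruction error is smaller inside the ROI than outside.
err_roi = np.abs(out[:4, :4] - frame[:4, :4]).mean()
err_bg = np.abs(out[4:, 4:] - frame[4:, 4:]).mean()
print(err_roi, err_bg)
```

    The design question the subjective tests address is where to set the two quality levels so that the interpreter stays intelligible while the background absorbs most of the bit-rate savings.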

  4. Facilitating Exposure to Sign Languages of the World: The Case for Mobile Assisted Language Learning

    ERIC Educational Resources Information Center

    Parton, Becky Sue

    2014-01-01

    Foreign sign language instruction is an important, but overlooked area of study. Thus the purpose of this paper was two-fold. First, the researcher sought to determine the level of knowledge and interest in foreign sign language among Deaf teenagers along with their learning preferences. Results from a survey indicated that over a third of the…

  5. The Multimedia Dictionary of American Sign Language: Learning Lessons About Language, Technology, and Business.

    ERIC Educational Resources Information Center

    Wilcox, Sherman

    2003-01-01

    Reports on the Multimedia Dictionary of American Sign Language, which was conceived in the late 1980s as a melding of the pioneering work in American Sign Language lexicography that had been carried out decades earlier and the newly emerging computer technologies that were integrating use of graphical user-interface designs, rapidly…

  6. HAPPEN CAN'T HEAR: An Analysis of Code-Blends in Hearing, Native Signers of American Sign Language

    ERIC Educational Resources Information Center

    Bishop, Michele

    2011-01-01

    Hearing native signers often learn sign language as their first language and acquire features that are characteristic of sign languages but are not present in equivalent ways in English (e.g., grammatical facial expressions and the structured use of space for setting up tokens and surrogates). Previous research has indicated that bimodal…

  7. Deficits in Narrative Abilities in Child British Sign Language Users with Specific Language Impairment

    ERIC Educational Resources Information Center

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal…

  8. Meeting the Needs of Signers in the Field of Speech and Language Pathology: Some Considerations for Action

    ERIC Educational Resources Information Center

    Cripps, Jody H.; Cooper, Sheryl B.; Supalla, Samuel J.; Evitts, Paul M.

    2016-01-01

    Deaf individuals who use American Sign Language (ASL) are rarely the focus of professionals in speech-language pathology. Although society is widely thought of in terms of those who speak, this norm is not all-inclusive. Many signing individuals exhibit disorders in signed language and need treatment much like their speaking peers. Although there…

  9. Introduction: Sign Language, Sustainable Development, and Equal Opportunities

    ERIC Educational Resources Information Center

    De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Introduction: Sign Language, Sustainable Development, and Equal Opportunities" (De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck & P. V. Paul (Eds.) 2016). The idea of exploring various…

  10. Sign Language Echolalia in Deaf Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shield, Aaron; Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose: We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method: Seventeen…

  11. Deficits in narrative abilities in child British Sign Language users with specific language impairment.

    PubMed

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal intelligence. Children were asked to generate a narrative based on events in a language-free video. Narratives were analysed for global structure, information content and local level grammatical devices, especially verb morphology. The language-impaired group produced shorter, less structured and grammatically simpler narratives than controls, with verb morphology particularly impaired. Despite major differences in how sign and spoken languages are articulated, narrative is shown to be a reliable marker of language impairment across the modality boundaries. © 2014 Royal College of Speech and Language Therapists.

  12. Sign Lowering and Phonetic Reduction in American Sign Language.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2010-04-01

    This study examines sign lowering as a form of phonetic reduction in American Sign Language. Phonetic reduction occurs in the course of normal language production, when instead of producing a carefully articulated form of a word, the language user produces a less clearly articulated form. When signs are produced in context by native signers, they often differ from the citation forms of signs. In some cases, phonetic reduction is manifested as a sign being produced at a lower location than in the citation form. Sign lowering has been documented previously, but this is the first study to examine it in phonetic detail. The data presented here are tokens of the sign WONDER, as produced by six native signers, in two phonetic contexts and at three signing rates, which were captured by optoelectronic motion capture. The results indicate that sign lowering occurred for all signers, according to the factors we manipulated. Sign production was affected by several phonetic factors that also influence speech production, namely, production rate, phonetic context, and position within an utterance. In addition, we have discovered interesting variations in sign production, which could underlie distinctions in signing style, analogous to accent or voice quality in speech.

  13. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    PubMed Central

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, non-arbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than non-iconic signs (controlling for strength of iconicity, semantic relatedness, familiarity, and imageability). Twenty deaf signers made lexical decisions to the second item of a prime-target pair. Iconic target signs were preceded by prime signs that were a) iconic and semantically related, b) non-iconic and semantically related, or c) semantically unrelated. In addition, a set of non-iconic target signs was preceded by semantically unrelated primes. Significant facilitation was observed for target signs when preceded by semantically related primes. However, iconicity did not increase the priming effect (e.g., the target sign PIANO was primed equally by the iconic sign GUITAR and the non-iconic sign MUSIC). In addition, iconic signs were not recognized faster or more accurately than non-iconic signs. These results confirm the existence of semantic priming for sign language and suggest that iconicity does not play a robust role in on-line lexical processing. PMID:20919784

  14. How Do Typically Developing Deaf Children and Deaf Children with Autism Spectrum Disorder Use the Face When Comprehending Emotional Facial Expressions in British Sign Language?

    ERIC Educational Resources Information Center

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-01-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their…

  15. Signs of Resistance: Peer Learning of Sign Languages within "Oral" Schools for the Deaf

    ERIC Educational Resources Information Center

    Anglin-Jaffe, Hannah

    2013-01-01

    This article explores the role of the Deaf child as peer educator. In schools where sign languages were banned, Deaf children became the educators of their Deaf peers in a number of contexts worldwide. This paper analyses how this peer education of sign language worked in context by drawing on two examples from boarding schools for the deaf in…

  16. How Grammar Can Cope with Limited Short-Term Memory: Simultaneity and Seriality in Sign Languages

    ERIC Educational Resources Information Center

    Geraci, Carlo; Gozzi, Marta; Papagno, Costanza; Cecchetto, Carlo

    2008-01-01

    It is known that in American Sign Language (ASL) span is shorter than in English, but this discrepancy has never been systematically investigated using other pairs of signed and spoken languages. This finding is at odds with results showing that short-term memory (STM) for signs has an internal organization similar to STM for words. Moreover, some…

  17. Wavelets for sign language translation

    NASA Astrophysics Data System (ADS)

    Wilson, Beth J.; Anspach, Gretel

    1993-10-01

    Wavelet techniques are applied to help extract the relevant parameters of sign language from video images of a person communicating in American Sign Language or Signed English. The compression and edge detection features of two-dimensional wavelet analysis are exploited to enhance the algorithms under development to classify the hand motion, hand location with respect to the body, and handshape. These three parameters have different processing requirements and complexity issues. The results are described for applying various quadrature mirror filter designs to a filterbank implementation of the desired wavelet transform. The overall project is to develop a system that will translate sign language to English to facilitate communication between deaf and hearing people.
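
    A one-level 2-D filterbank of the kind described above can be sketched with the Haar pair, the simplest quadrature-mirror filters; the authors evaluate other QMF designs, so this is an illustrative stand-in. The detail subbands act as edge detectors, which is the property exploited for locating hand contours.

```python
import numpy as np

def haar_2d(img):
    """One level of a 2-D Haar wavelet decomposition.
    Returns the approximation (LL) and the horizontal, vertical and
    diagonal detail subbands. Note lo + hi and lo - hi recover the
    even and odd columns exactly (the quadrature-mirror property)."""
    lo = (img[:, ::2] + img[:, 1::2]) / 2  # column-pair averages
    hi = (img[:, ::2] - img[:, 1::2]) / 2  # column-pair differences
    ll = (lo[::2] + lo[1::2]) / 2
    lh = (lo[::2] - lo[1::2]) / 2
    hl = (hi[::2] + hi[1::2]) / 2
    hh = (hi[::2] - hi[1::2]) / 2
    return ll, lh, hl, hh

# A synthetic bright blob on a dark background, placed so its edges
# fall inside pixel pairs and show up in the detail subbands.
img = np.zeros((8, 8))
img[3:6, 3:6] = 1.0
ll, lh, hl, hh = haar_2d(img)

detail = np.abs(lh) + np.abs(hl) + np.abs(hh)
print(detail.max() > 0)  # → True: edges produce nonzero detail coefficients
```

    The LL subband gives the compressed low-resolution view used for coarse hand location, while the LH/HL/HH subbands respond only at intensity discontinuities, supporting edge-based handshape classification.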

  18. An fMRI Study of Perception and Action in Deaf Signers

    PubMed Central

    Okada, Kayoko; Rogalsky, Corianne; O’Grady, Lucinda; Hanaumi, Leila; Bellugi, Ursula; Corina, David; Hickok, Gregory

    2016-01-01

    Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca’s area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects as well as observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution although with overlap in Broca’s area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action related signs differentially involved the motor system compared to object related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca’s area during ASL observation is not causally related to sign language understanding. PMID:26796716

  19. An fMRI study of perception and action in deaf signers.

    PubMed

    Okada, Kayoko; Rogalsky, Corianne; O'Grady, Lucinda; Hanaumi, Leila; Bellugi, Ursula; Corina, David; Hickok, Gregory

    2016-02-01

    Since the discovery of mirror neurons, there has been a great deal of interest in understanding the relationship between perception and action, and the role of the human mirror system in language comprehension and production. Two questions have dominated research. One concerns the role of Broca's area in speech perception. The other concerns the role of the motor system more broadly in understanding action-related language. The current study investigates both of these questions in a way that bridges research on language with research on manual actions. We studied the neural basis of observing and executing American Sign Language (ASL) object and action signs. In an fMRI experiment, deaf signers produced signs depicting actions and objects as well as observed/comprehended signs of actions and objects. Different patterns of activation were found for observation and execution although with overlap in Broca's area, providing prima facie support for the claim that the motor system participates in language perception. In contrast, we found no evidence that action related signs differentially involved the motor system compared to object related signs. These findings are discussed in the context of lesion studies of sign language execution and observation. In this broader context, we conclude that the activation in Broca's area during ASL observation is not causally related to sign language understanding. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Signed language and human action processing: evidence for functional constraints on the human mirror-neuron system.

    PubMed

    Corina, David P; Knapp, Heather Patterson

    2008-12-01

    In the quest to further understand the neural underpinning of human communication, researchers have turned to studies of naturally occurring signed languages used in Deaf communities. The comparison of the commonalities and differences between spoken and signed languages provides an opportunity to determine core neural systems responsible for linguistic communication independent of the modality in which a language is expressed. The present article examines such studies, and in addition asks what we can learn about human languages by contrasting formal visual-gestural linguistic systems (signed languages) with more general human action perception. To understand visual language perception, it is important to distinguish the demands of general human motion processing from the highly task-dependent demands associated with extracting linguistic meaning from arbitrary, conventionalized gestures. This endeavor is particularly important because theorists have suggested close homologies between perception and production of actions and functions of human language and social communication. We review recent behavioral, functional imaging, and neuropsychological studies that explore dissociations between the processing of human actions and signed languages. These data suggest incomplete overlap between the mirror-neuron systems proposed to mediate human action and language.

  1. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language.

    PubMed

    Ferjan Ramirez, Naja; Leonard, Matthew K; Davenport, Tristan S; Torres, Christina; Halgren, Eric; Mayberry, Rachel I

    2016-03-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772-2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  2. Teachers' perceptions of promoting sign language phonological awareness in an ASL/English bilingual program.

    PubMed

    Crume, Peter K

    2013-10-01

    The National Reading Panel emphasizes that spoken language phonological awareness (PA) developed at home and school can lead to improvements in reading performance in young children. However, research indicates that many deaf children are good readers even though they have limited spoken language PA. Is it possible that some deaf students benefit from teachers who promote sign language PA instead? The purpose of this qualitative study is to examine teachers' beliefs and instructional practices related to sign language PA. A thematic analysis is conducted on 10 participant interviews at an ASL/English bilingual school for the deaf to understand their views and instructional practices. The findings reveal that the participants had strong beliefs in developing students' structural knowledge of signs and used a variety of instructional strategies to build students' knowledge of sign structures in order to promote their language and literacy skills.

  3. Language choice in bimodal bilingual development.

    PubMed

    Lillo-Martin, Diane; de Quadros, Ronice M; Chen Pichler, Deborah; Fieldsteel, Zoe

    2014-01-01

    Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending-expressions in both speech and sign simultaneously-an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language.

  4. Language choice in bimodal bilingual development

    PubMed Central

    Lillo-Martin, Diane; de Quadros, Ronice M.; Chen Pichler, Deborah; Fieldsteel, Zoe

    2014-01-01

    Bilingual children develop sensitivity to the language used by their interlocutors at an early age, reflected in differential use of each language by the child depending on their interlocutor. Factors such as discourse context and relative language dominance in the community may mediate the degree of language differentiation in preschool age children. Bimodal bilingual children, acquiring both a sign language and a spoken language, have an even more complex situation. Their Deaf parents vary considerably in access to the spoken language. Furthermore, in addition to code-mixing and code-switching, they use code-blending—expressions in both speech and sign simultaneously—an option uniquely available to bimodal bilinguals. Code-blending is analogous to code-switching sociolinguistically, but is also a way to communicate without suppressing one language. For adult bimodal bilinguals, complete suppression of the non-selected language is cognitively demanding. We expect that bimodal bilingual children also find suppression difficult, and use blending rather than suppression in some contexts. We also expect relative community language dominance to be a factor in children's language choices. This study analyzes longitudinal spontaneous production data from four bimodal bilingual children and their Deaf and hearing interlocutors. Even at the earliest observations, the children produced more signed utterances with Deaf interlocutors and more speech with hearing interlocutors. However, while three of the four children produced >75% speech alone in speech target sessions, they produced <25% sign alone in sign target sessions. All four produced bimodal utterances in both, but more frequently in the sign sessions, potentially because they find suppression of the dominant language more difficult. Our results indicate that these children are sensitive to the language used by their interlocutors, while showing considerable influence from the dominant community language. 
PMID:25368591

  5. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents

    PubMed Central

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf (n = 25) and hearing (n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. 
Both native and non-native signers looked mainly at the face when perceiving sign language, although non-native signers looked significantly more at the body than native signers. This distribution of gaze fixations suggested that deaf individuals – particularly native signers – mainly perceived signs through peripheral vision. PMID:28680416

  6. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents.

    PubMed

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf (n = 25) and hearing (n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted.
Both native and non-native signers looked mainly at the face when perceiving sign language, although non-native signers looked significantly more at the body than native signers. This distribution of gaze fixations suggested that deaf individuals, particularly native signers, mainly perceived signs through peripheral vision.

  7. The effect of sign language structure on complex word reading in Chinese deaf adolescents.

    PubMed

    Lu, Aitao; Yu, Yanping; Niu, Jiaxin; Zhang, John X

    2015-01-01

    The present study was carried out to investigate whether sign language structure plays a role in the processing of complex words (i.e., derivational and compound words), in particular, the delay of complex word reading in deaf adolescents. Chinese deaf adolescents were found to respond faster to derivational words than to compound words for one-sign-structure words, but showed comparable performance for two-sign-structure words. For both derivational and compound words, response latencies to one-sign-structure words were shorter than to two-sign-structure words. These results provide strong evidence that the structure of sign language affects written word processing in Chinese. Additionally, differences between derivational and compound words in the one-sign-structure condition indicate that Chinese deaf adolescents acquire print morphological awareness. The results also showed that delayed word reading was found in derivational words with two signs (DW-2), compound words with one sign (CW-1), and compound words with two signs (CW-2), but not in derivational words with one sign (DW-1), with the delay being maximum in DW-2, medium in CW-2, and minimum in CW-1, suggesting that the structure of sign language has an impact on the delayed processing of Chinese written words in deaf adolescents. These results provide insight into the mechanisms by which sign language structure affects written word processing and why it is delayed relative to that of hearing peers of the same age.

  8. Research Ethics in Sign Language Communities

    ERIC Educational Resources Information Center

    Harris, Raychelle; Holmes, Heidi M.; Mertens, Donna M.

    2009-01-01

    Codes of ethics exist for most professional associations whose members do research on, for, or with sign language communities. However, these ethical codes are silent regarding the need to frame research ethics from a cultural standpoint, an issue of particular salience for sign language communities. Scholars who write from the perspective of…

  9. Comprehending Sentences with the Body: Action Compatibility in British Sign Language?

    ERIC Educational Resources Information Center

    Vinson, David; Perniss, Pamela; Fox, Neil; Vigliocco, Gabriella

    2017-01-01

    Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language (BSL) sentences, some of which explicitly encode direction of motion, versus written English, where motion…

  10. Sign Language and the Brain: A Review

    ERIC Educational Resources Information Center

    Campbell, Ruth; MacSweeney, Mairead; Waters, Dafydd

    2008-01-01

    How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing…

  11. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon

    PubMed Central

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2014-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals is highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf individuals who were late-learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sub-lexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  12. Variation in handshape and orientation in British Sign Language: The case of the ‘1’ hand configuration

    PubMed Central

    Fenlon, Jordan; Schembri, Adam; Rentelis, Ramas; Cormier, Kearsy

    2013-01-01

    This paper investigates phonological variation in British Sign Language (BSL) signs produced with a ‘1’ hand configuration in citation form. Multivariate analyses of 2084 tokens reveal that handshape variation in these signs is constrained by linguistic factors (e.g., the preceding and following phonological environment, grammatical category, indexicality, lexical frequency). The only significant social factor was region. For the subset of signs where orientation was also investigated, only grammatical function was important (the surrounding phonological environment and social factors were not significant). The implications for an understanding of pointing signs in signed languages are discussed. PMID:23805018

  13. The effects of sign language on spoken language acquisition in children with hearing loss: a systematic review protocol.

    PubMed

    Fitzpatrick, Elizabeth M; Stevens, Adrienne; Garritty, Chantelle; Moher, David

    2013-12-06

    Permanent childhood hearing loss affects 1 to 3 per 1000 children and frequently disrupts typical spoken language acquisition. Early identification of hearing loss through universal newborn hearing screening and the use of new hearing technologies including cochlear implants make spoken language an option for most children. However, there is no consensus on what constitutes optimal interventions for children when spoken language is the desired outcome. Intervention and educational approaches ranging from oral language only to oral language combined with various forms of sign language have evolved. Parents are therefore faced with important decisions in the first months of their child's life. This article presents the protocol for a systematic review of the effects of using sign language in combination with oral language intervention on spoken language acquisition. Studies addressing early intervention will be selected in which therapy involving oral language intervention and any form of sign language or sign support is used. Comparison groups will include children in early oral language intervention programs without sign support. The primary outcomes of interest to be examined include all measures of auditory, vocabulary, language, speech production, and speech intelligibility skills. We will include randomized controlled trials, controlled clinical trials, and other quasi-experimental designs that include comparator groups as well as prospective and retrospective cohort studies. Case-control, cross-sectional, case series, and case studies will be excluded. Several electronic databases will be searched (for example, MEDLINE, EMBASE, CINAHL, PsycINFO) as well as grey literature and key websites. We anticipate that a narrative synthesis of the evidence will be required. We will carry out meta-analysis for outcomes if clinical similarity, quantity and quality permit quantitative pooling of data. 
We will conduct subgroup analyses if possible according to severity/type of hearing disorder, age of identification, and type of hearing technology. This review will provide evidence on the effectiveness of using sign language in combination with oral language therapies for developing spoken language in children with hearing loss who are identified at a young age. The information from this review can provide guidance to parents and intervention specialists, inform policy decisions and provide directions for future research. CRD42013005426.

  14. Cross-Modal Recruitment of Auditory and Orofacial Areas During Sign Language in a Deaf Subject.

    PubMed

    Martino, Juan; Velasquez, Carlos; Vázquez-Bourgon, Javier; de Lucas, Enrique Marco; Gomez, Elsa

    2017-09-01

    Modern sign languages used by deaf people are fully expressive, natural human languages that are perceived visually and produced manually. The literature contains little data concerning human brain organization in conditions of deficient sensory information such as deafness. A deaf-mute patient underwent surgery for a left temporoinsular low-grade glioma. The patient underwent awake surgery with intraoperative electrical stimulation mapping, allowing direct study of the cortical and subcortical organization of sign language. We found a similar distribution of language sites to what has been reported in mapping studies of patients with oral language, including 1) speech perception areas inducing anomias and alexias close to the auditory cortex (at the posterior portion of the superior temporal gyrus and supramarginal gyrus); 2) speech production areas inducing speech arrest (anarthria) at the ventral premotor cortex, close to the lip motor area and away from the hand motor area; and 3) subcortical stimulation-induced semantic paraphasias at the inferior fronto-occipital fasciculus at the temporal isthmus. The intraoperative setup for sign language mapping with intraoperative electrical stimulation in deaf-mute patients is similar to the setup described in patients with oral language. To elucidate the type of language errors, a sign language interpreter in close interaction with the neuropsychologist is necessary. Sign language is perceived visually and produced manually; however, this case revealed a cross-modal recruitment of auditory and orofacial motor areas. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, within a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed, which provides an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.
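    Editor's note: the threshold-model idea in this record can be sketched in a few lines. Score a candidate segment against every in-vocabulary sign and against an adaptive non-sign threshold derived from the competing scores; spot the sign only if its best score clears the threshold. Everything below (function name, labels, scores, the margin parameter) is a hypothetical stand-in for the paper's CRF machinery, not a reimplementation of it.

```python
def spot_sign(segment_scores, margin=0.1):
    """segment_scores: dict mapping sign label -> model score for one segment.
    Returns the best label if it clears the adaptive threshold, else None
    (i.e., the segment is treated as a nonsign pattern)."""
    best_label = max(segment_scores, key=segment_scores.get)
    best = segment_scores[best_label]
    # Adaptive threshold: average score of the competing models plus a margin,
    # standing in for the shared-state threshold model in the actual CRF.
    others = [s for label, s in segment_scores.items() if label != best_label]
    threshold = sum(others) / len(others) + margin
    return best_label if best > threshold else None

# A clearly matched in-vocabulary sign beats the adaptive threshold...
print(spot_sign({"HELLO": 0.9, "THANKS": 0.2, "PLEASE": 0.1}))   # HELLO
# ...while an ambiguous, transition-like segment is rejected as a nonsign.
print(spot_sign({"HELLO": 0.35, "THANKS": 0.33, "PLEASE": 0.32}))  # None
```

    The point of the adaptive threshold is that no single fixed cutoff works when sign scores vary with motion and appearance; the cutoff moves with the competing scores instead.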

  16. Operationalization of Sign Language Phonological Similarity and its Effects on Lexical Access.

    PubMed

    Williams, Joshua T; Stone, Adam; Newman, Sharlene D

    2017-07-01

    Cognitive mechanisms for sign language lexical access are largely unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data in order to inform theories of how phonological similarity is constructed within the lexicon and to aid in the operationalization of phonological similarity in sign language. Sign repetition latencies and accuracy were obtained when native signers were asked to reproduce a sign displayed on a computer screen. Results indicated that, as predicted, phonological similarity facilitated repetition latencies and accuracy as long as there were no strict constraints on the type of sublexical features that overlapped. The data converged to suggest that one similarity measure, MaxD, defined as the overlap of any 4 sublexical features, likely best represents mechanisms of phonological similarity in the mental lexicon. Together, these data suggest that lexical access in sign language is facilitated by phonologically similar lexical representations in memory and the optimal operationalization is defined as liberal constraints on overlap of 4 out of 5 sublexical features, similar to the majority of extant definitions in the literature. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
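    Editor's note: the MaxD-style criterion described above (any 4-of-5 sublexical features overlapping) is easy to make concrete. The feature names below are illustrative placeholders, not the database's actual coding scheme.

```python
# Hypothetical set of five sublexical features per sign.
FEATURES = ("handshape", "location", "movement", "orientation", "selected_fingers")

def overlap(sign_a, sign_b):
    """Count of sublexical features shared by two signs (dicts of feature values)."""
    return sum(sign_a[f] == sign_b[f] for f in FEATURES)

def are_neighbours(sign_a, sign_b, min_overlap=4):
    """MaxD-style neighbourhood test: any 4-of-5 overlap qualifies,
    with no constraint on WHICH features overlap."""
    return overlap(sign_a, sign_b) >= min_overlap

a = {"handshape": "B", "location": "chin", "movement": "arc",
     "orientation": "palm-in", "selected_fingers": "all"}
b = dict(a, movement="straight")                     # differs in movement only: 4/5
c = dict(a, movement="straight", location="chest")   # differs in two features: 3/5

print(are_neighbours(a, b))  # True
print(are_neighbours(a, c))  # False
```

    The liberal constraint matters: a metric requiring specific features (say, handshape and location) to match would classify fewer sign pairs as neighbours than this any-4-of-5 rule.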

  17. Modelling of internal architecture of kinesin nanomotor as a machine language.

    PubMed

    Khataee, H R; Ibrahim, M Y

    2012-09-01

    Kinesin is a protein-based natural nanomotor that transports molecular cargoes within cells by walking along microtubules. Kinesin nanomotor is considered as a bio-nanoagent which is able to sense the cell through its sensors (i.e. its heads and tail), make the decision internally and perform actions on the cell through its actuator (i.e. its motor domain). The study maps the agent-based architectural model of the internal decision-making process of kinesin nanomotor to a machine language using an automata algorithm. The applied automata algorithm receives the internal agent-based architectural model of kinesin nanomotor as a deterministic finite automaton (DFA) model and generates a regular machine language. The generated regular machine language was accepted by the architectural DFA model of the nanomotor and also in good agreement with its natural behaviour. The internal agent-based architectural model of kinesin nanomotor indicates the degree of autonomy and intelligence of the nanomotor interactions with its cell. Thus, our developed regular machine language can model the degree of autonomy and intelligence of kinesin nanomotor interactions with its cell as a language. Modelling of internal architectures of autonomous and intelligent bio-nanosystems as machine languages can lay the foundation towards the concept of bio-nanoswarms and next phases of the bio-nanorobotic systems development.
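    Editor's note: the DFA-to-regular-language framing in this record can be illustrated with a toy automaton. The states and input symbols below are hypothetical placeholders loosely inspired by kinesin's stepping cycle, not the paper's actual model; the set of input sequences the DFA accepts is, by definition, a regular language.

```python
# Transition table: (state, input) -> next state. Undefined pairs reject.
DFA = {
    ("bound",  "atp"):    "docked",    # ATP binding docks the neck linker
    ("docked", "step"):   "bound",     # a forward step returns to the bound state
    ("bound",  "detach"): "released",  # tail-mediated release from the microtubule
}
START, ACCEPT = "bound", {"bound", "released"}

def accepts(inputs):
    """Run the DFA over an input sequence; reject on any undefined transition."""
    state = START
    for symbol in inputs:
        if (state, symbol) not in DFA:
            return False
        state = DFA[(state, symbol)]
    return state in ACCEPT

print(accepts(["atp", "step", "atp", "step"]))  # True: repeated walking cycles
print(accepts(["step"]))                        # False: cannot step before docking
```

    In regular-expression terms this toy automaton accepts `(atp step)* detach?`, which is the sense in which an agent architecture expressed as a DFA induces a "machine language".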

  18. From gesture to sign language: conventionalization of classifier constructions by adult hearing learners of British Sign Language.

    PubMed

    Marshall, Chloë R; Morgan, Gary

    2015-01-01

    There has long been interest in why languages are shaped the way they are, and in the relationship between sign language and gesture. In sign languages, entity classifiers are handshapes that encode how objects move, how they are located relative to one another, and how multiple objects of the same type are distributed in space. Previous studies have shown that hearing adults who are asked to use only manual gestures to describe how objects move in space will use gestures that bear some similarities to classifiers. We investigated how accurately hearing adults, who had been learning British Sign Language (BSL) for 1-3 years, produce and comprehend classifiers in (static) locative and distributive constructions. In a production task, learners of BSL knew that they could use their hands to represent objects, but they had difficulty choosing the same, conventionalized, handshapes as native signers. They were, however, highly accurate at encoding location and orientation information. Learners therefore show the same pattern found in sign-naïve gesturers. In contrast, handshape, orientation, and location were comprehended with equal (high) accuracy, and testing a group of sign-naïve adults showed that they too were able to understand classifiers with higher than chance accuracy. We conclude that adult learners of BSL bring their visuo-spatial knowledge and gestural abilities to the tasks of understanding and producing constructions that contain entity classifiers. We speculate that investigating the time course of adult sign language acquisition might shed light on how gesture became (and, indeed, becomes) conventionalized during the genesis of sign languages. Copyright © 2014 Cognitive Science Society, Inc.

  19. Building Languages

    MedlinePlus

    Topics listed under "Building Language" include American Sign Language (ASL), Conceptually Accurate Signed English (CASE), Cued Speech, Finger Spelling, and Listening/Auditory Training.

  20. What sign language creation teaches us about language.

    PubMed

    Brentari, Diane; Coppola, Marie

    2013-03-01

    How do languages emerge? What are the necessary ingredients and circumstances that permit new languages to form? Various researchers within the disciplines of primatology, anthropology, psychology, and linguistics have offered different answers to this question depending on their perspective. Language acquisition, language evolution, primate communication, and the study of spoken varieties of pidgin and creoles address these issues, but in this article we describe a relatively new and important area that contributes to our understanding of language creation and emergence. Three types of communication systems that use the hands and body to communicate will be the focus of this article: gesture, homesign systems, and sign languages. The focus of this article is to explain why mapping the path from gesture to homesign to sign language has become an important research topic for understanding language emergence, not only for the field of sign languages, but also for language in general. WIREs Cogn Sci 2013, 4:201-211. doi: 10.1002/wcs.1212 For further resources related to this article, please visit the WIREs website. Copyright © 2012 John Wiley & Sons, Ltd.

  1. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    ERIC Educational Resources Information Center

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  2. Assessing language skills in adult key word signers with intellectual disabilities: Insights from sign linguistics.

    PubMed

    Grove, Nicola; Woll, Bencie

    2017-03-01

    Manual signing is one of the most widely used approaches to support the communication and language skills of children and adults who have intellectual or developmental disabilities, and problems with communication in spoken language. A recent series of papers reporting findings from this population raises critical issues for professionals in the assessment of multimodal language skills of key word signers. Approaches to assessment will differ depending on whether key word signing (KWS) is viewed as discrete from, or related to, natural sign languages. Two available assessments from these different perspectives are compared. Procedures appropriate to the assessment of sign language production are recommended as a valuable addition to the clinician's toolkit. Sign and speech need to be viewed as multimodal, complementary communicative endeavours, rather than as polarities. Whilst narrative has been shown to be a fruitful context for eliciting language samples, assessments for adult users should be designed to suit the strengths, needs and values of adult signers with intellectual disabilities, using materials that are compatible with their life course stage rather than those designed for young children. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. The influence of the visual modality on language structure and conventionalization: insights from sign language and gesture.

    PubMed

    Perniss, Pamela; Özyürek, Asli; Morgan, Gary

    2015-01-01

    For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and (co-speech) gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles in this issue explore and document how gestures and sign languages are similar or different and how communicative expression in the visual modality can change from being gestural to grammatical in nature through processes of conventionalization. As such, this issue contributes to our understanding of how the visual modality shapes language and the emergence of linguistic structure in newly developing systems. Studying the relationship between signs and gestures provides a new window onto the human ability to recruit multiple levels of representation (e.g., categorical, gradient, iconic, abstract) in the service of using or creating conventionalized communicative systems. Copyright © 2015 Cognitive Science Society, Inc.

  4. Phonological Development in Hearing Learners of a Sign Language: The Influence of Phonological Parameters, Sign Complexity, and Iconicity

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    The present study implemented a sign-repetition task at two points in time to hearing adult learners of British Sign Language and explored how each phonological parameter, sign complexity, and iconicity affected sign production over an 11-week (22-hour) instructional period. The results show that training improves articulation accuracy and that…

  5. Processing of Formational, Semantic, and Iconic Information in American Sign Language.

    ERIC Educational Resources Information Center

    Poizner, Howard; And Others

    1981-01-01

    Three experiments examined short-term encoding processes of deaf signers for different aspects of signs from American Sign Language. Results indicated that deaf signers code signs at one level in terms of linguistically significant formational parameters. The semantic and iconic information of signs, however, has little effect on short-term…

  6. Phonological Similarity in American Sign Language.

    ERIC Educational Resources Information Center

    Hildebrandt, Ursula; Corina, David

    2002-01-01

    Investigates deaf and hearing subjects' ratings of American Sign Language (ASL) signs to assess whether linguistic experience shapes judgments of sign similarity. Findings are consistent with linguistic theories that posit movement and location as core structural elements of syllable structure in ASL. (Author/VWL)

  7. Location, Location, Location

    ERIC Educational Resources Information Center

    Cates, Deborah; Gutiérrez, Eva; Hafer, Sarah; Barrett, Ryan; Corina, David

    2013-01-01

    This article presents an analysis of the relationship between sign structure and iconicity in American Sign Language. Historically, linguists have been pressured to downplay the role of form-meaning relationships (iconicity) in signed languages. However, recent inquiries into the role of traditional phonological parameters of signs (handshape,…

  8. Discriminative exemplar coding for sign language recognition with Kinect.

    PubMed

    Sun, Chao; Zhang, Tianzhu; Bao, Bing-Kun; Xu, Changsheng; Mei, Tao

    2013-10-01

    Sign language recognition is a growing research area in the field of computer vision. A challenge within it is to model various signs, which vary in time resolution, visual manual appearance, and so on. In this paper, we propose a discriminative exemplar coding (DEC) approach, together with the Kinect sensor, to model various signs. The proposed DEC method can be summarized as three steps. First, a quantity of class-specific candidate exemplars are learned from sign language videos in each sign category by considering their discrimination. Then, every video of all signs is described as a set of similarities between frames within it and the candidate exemplars. Instead of simply using a heuristic distance measure, the similarities are decided by a set of exemplar-based classifiers through multiple instance learning, in which a positive (or negative) video is treated as a positive (or negative) bag and those frames similar to the given exemplar in Euclidean space as instances. Finally, we formulate the selection of the most discriminative exemplars into a framework and simultaneously produce a sign video classifier to recognize signs. To evaluate our method, we collect an American sign language dataset, which includes approximately 2000 phrases, in which each phrase is captured by the Kinect sensor with color, depth, and skeleton information. Experimental results on our dataset demonstrate the feasibility and effectiveness of the proposed approach for sign language recognition.
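    Editor's note: the exemplar-coding step in this record (re-describing a video as similarities between its frames and learned exemplars) can be sketched as follows. Here a frame is just a feature vector and similarity is negated Euclidean distance; the actual method replaces this raw distance with exemplar-based classifiers trained via multiple instance learning. All names and data are hypothetical.

```python
import math

def similarity(frame, exemplar):
    """Higher is more similar: negated Euclidean distance between feature vectors."""
    return 0.0 - math.dist(frame, exemplar)

def encode_video(frames, exemplars):
    """Describe a video by one number per exemplar: the best (max) similarity
    between that exemplar and any frame in the video."""
    return [max(similarity(f, e) for f in frames) for e in exemplars]

# Toy 2-D frame features for one video, and two candidate exemplars.
frames = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
exemplars = [(1.0, 1.0), (5.0, 5.0)]
code = encode_video(frames, exemplars)
print(code)  # first entry is 0.0: the first exemplar matches a frame exactly
```

    The resulting fixed-length code, one similarity per exemplar regardless of video length, is what makes variable-duration sign videos comparable by an ordinary classifier.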

  9. Promotion in Times of Endangerment: The Sign Language Act in Finland

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    The development of sign language recognition legislation is a relatively recent phenomenon in the field of language policy. So far only few authors have documented signing communities' aspirations for recognition legislation, how they work with their governments to achieve legislation which most reflects these goals, and whether and why outcomes…

  10. Italian Sign Language (LIS) Poetry: Iconic Properties and Structural Regularities.

    ERIC Educational Resources Information Center

    Russo, Tommaso; Giuranna, Rosaria; Pizzuto, Elena

    2001-01-01

Explores and describes, from a crosslinguistic perspective, some of the major structural regularities that characterize poetry in Italian Sign Language and distinguish poetic from nonpoetic texts. Reviews findings of previous studies of signed language poetry and points out issues that need to be clarified to provide a more accurate description…

  11. Using Sign Language in Your Classroom.

    ERIC Educational Resources Information Center

    Lawrence, Constance D.

    This paper reviews the research on use of American Sign Language in elementary classes that do not include children with hearing impairment and also reports on the use of the manual sign language alphabet in a primary class learning the phonetic sounds of the alphabet. The research reported is overwhelmingly positive in support of using sign…

  12. Reading and American Sign Language: Strategies for Translation.

    ERIC Educational Resources Information Center

    Burkholder, Kim

    1999-01-01

    A hearing teacher for whom American Sign Language is a second language identifies nine strategies developed for reading and telling stories to deaf children. These include: ask obvious questions related to the story, portray written dialog as conversation, emphasize points by saying the same thing with different signs, and adapt the story to…

  13. Phonological Awareness for American Sign Language

    ERIC Educational Resources Information Center

    Corina, David P.; Hafer, Sarah; Welch, Kearnan

    2014-01-01

    This paper examines the concept of phonological awareness (PA) as it relates to the processing of American Sign Language (ASL). We present data from a recently developed test of PA for ASL and examine whether sign language experience impacts the use of metalinguistic routines necessary for completion of our task. Our data show that deaf signers…

  14. Technology to Support Sign Language for Students with Disabilities

    ERIC Educational Resources Information Center

    Donne, Vicki

    2013-01-01

    This systematic review of the literature provides a synthesis of research on the use of technology to support sign language. Background research on the use of sign language with students who are deaf/hard of hearing and students with low incidence disabilities, such as autism, intellectual disability, or communication disorders is provided. The…

  15. Lexical Properties of Slovene Sign Language: A Corpus-Based Study

    ERIC Educational Resources Information Center

    Vintar, Špela

    2015-01-01

    Slovene Sign Language (SZJ) has as yet received little attention from linguists. This article presents some basic facts about SZJ, its history, current status, and a description of the Slovene Sign Language Corpus and Pilot Grammar (SIGNOR) project, which compiled and annotated a representative corpus of SZJ. Finally, selected quantitative data…

  16. Proactive Interference & Language Change in Hearing Adult Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Kreske, Catherine M.

    1995-01-01

    Describes a study that found, contrary to previous reports, that a strong, symmetrical release from proactive interference (PI) is the normal outcome for switches between American Sign Language (ASL) signs and English words and with switches between Manual and English alphabet characters. Subjects were college students enrolled in their first ASL…

  17. The Bimodal Bilingual Brain: Effects of Sign Language Experience

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen

    2009-01-01

    Bimodal bilinguals are hearing individuals who know both a signed and a spoken language. Effects of bimodal bilingualism on behavior and brain organization are reviewed, and an fMRI investigation of the recognition of facial expressions by ASL-English bilinguals is reported. The fMRI results reveal separate effects of sign language and spoken…

  18. Using the Linguistic Landscape to Bridge Languages

    ERIC Educational Resources Information Center

    Mari, Vanessa

    2018-01-01

    In this article Vanessa Mari describes how she uses the linguistic landscape to bridge two or more languages with students learning English. The linguistic landscape is defined by Landry and Bourhis (1997, 25) as "the language of public road signs, advertising billboards, street names, place names, commercial shop signs, and public signs on…

  19. Understanding Communication among Deaf Students Who Sign and Speak: A Trivial Pursuit?

    ERIC Educational Resources Information Center

    Marschark, Marc; Convertino, Carol M.; Macias, Gayle; Monikowski, Christine M.; Sapere, Patricia; Seewagen, Rosemarie

    2007-01-01

    Classroom communication between deaf students was modeled using a question-and-answer game. Participants consisted of student pairs that relied on spoken language, pairs that relied on American Sign Language (ASL), and mixed pairs in which one student used spoken language and one signed. Although the task encouraged students to request…

  20. Neural systems underlying lexical retrieval for sign language.

    PubMed

    Emmorey, Karen; Grabowski, Thomas; McCullough, Stephen; Damasio, Hanna; Ponto, Laura L B; Hichwa, Richard D; Bellugi, Ursula

    2003-01-01

Positron emission tomography was used to investigate whether signed languages exhibit the same neural organization for lexical retrieval within classical and non-classical language areas as has been described for spoken English. Ten deaf native American Sign Language (ASL) signers were shown pictures of unique entities (famous persons) and non-unique entities (animals) and were asked to name each stimulus with an overt signed response. Proper name signed responses to famous people were fingerspelled, and common noun responses to animals were both fingerspelled and signed with native ASL signs. In general, retrieving ASL signs activated neural sites similar to those activated by hearing subjects retrieving English words. Naming famous persons activated the left temporal pole (TP), whereas naming animals (whether fingerspelled or signed) activated left inferotemporal (IT) cortex. The retrieval of fingerspelled and native signs generally engaged the same cortical regions, but fingerspelled signs in addition activated a premotor region, perhaps due to the increased motor planning and sequencing demanded by fingerspelling. Native signs activated portions of the left supramarginal gyrus (SMG), an area previously implicated in the retrieval of phonological features of ASL signs. Overall, the findings indicate that similar neuroanatomical areas are involved in lexical retrieval for both signs and words. Copyright 2003 Elsevier Science Ltd.

  1. American Sign Language Curricula: A Review

    ERIC Educational Resources Information Center

    Rosen, Russell S.

    2010-01-01

There has been exponential growth both in the number of schools that offer American Sign Language (ASL) for foreign language credit and in the number of ASL curricula published. This study analyzes different curricula in terms of their assumptions regarding language, learning, and the teaching of second languages. It is found that curricula vary in their…

  2. Classification of visual signs in abdominal CT image figures in biomedical literature

    NASA Astrophysics Data System (ADS)

    Xue, Zhiyun; You, Daekeun; Antani, Sameer; Long, L. Rodney; Demner-Fushman, Dina; Thoma, George R.

    2014-03-01

    "Imaging signs" are a critical part of radiology's language. They not only are important for conveying diagnosis, but may also aid in indexing radiology literature and retrieving relevant cases and images. Here we report our work towards representing and categorizing imaging signs of abdominal abnormalities in figures in the radiology literature. Given a region-of-interest (ROI) from a figure, our goal was to assign a correct imaging sign label to that ROI from the following seven: accordion, comb, ring, sandwich, small bowel feces, target, or whirl. As training and test data, we created our own "gold standard" dataset of regions containing imaging signs. We computed 2997 feature attributes to represent imaging sign characteristics for each ROI in training and test sets. Following feature selection they were reduced to 70 attributes and were input to a Support Vector Machine classifier. We applied image-enhancement methods to compensate for variable quality of the images in radiology articles. In particular we developed a method for automatic detection and removal of pointers/markers (arrows, arrowheads, and asterisk symbols) on the images. These pointers/markers are valuable for approximately locating ROIs; however, they degrade the classification because they are often (partially) included in the training ROIs. On a test set of 283 ROIs, our method achieved an overall accuracy of 70% in labeling the seven signs, which we believe is a promising result for using imaging signs to search/retrieve radiology literature. This work is also potentially valuable for the creation of a visual ontology of biomedical imaging entities.
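The feature-selection step described in this abstract (2997 attributes reduced to 70 before a Support Vector Machine) can be illustrated with a minimal, hypothetical sketch. The abstract does not specify the selection criterion, so the ranking below (separation of per-class feature means, shown for two classes for brevity rather than the paper's seven signs) is purely illustrative.

```python
def select_top_features(X, y, k):
    """Rank features by the absolute difference of per-class means
    (an illustrative stand-in for the paper's unspecified feature
    selection) and return the indices of the k best features."""
    n_feat = len(X[0])
    scores = []
    for j in range(n_feat):
        pos = [x[j] for x, lab in zip(X, y) if lab == 1]
        neg = [x[j] for x, lab in zip(X, y) if lab == 0]
        scores.append(abs(sum(pos) / len(pos) - sum(neg) / len(neg)))
    ranked = sorted(range(n_feat), key=lambda j: scores[j], reverse=True)
    return sorted(ranked[:k])

# toy ROI attributes: feature 1 separates the classes, 0 and 2 are noise
X = [[0.5, 0.0, 0.4], [0.4, 0.1, 0.6], [0.6, 1.0, 0.5], [0.5, 0.9, 0.5]]
y = [0, 0, 1, 1]
kept = select_top_features(X, y, k=1)
```

The reduced attribute vectors would then be fed to a classifier such as an SVM; discarding uninformative attributes both speeds up training and reduces overfitting when, as here, the attribute count far exceeds the number of labeled ROIs.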

  3. The gradual emergence of phonological form in a new language

    PubMed Central

    Aronoff, Mark; Meir, Irit; Padden, Carol

    2011-01-01

    The division of linguistic structure into a meaningless (phonological) level and a meaningful level of morphemes and words is considered a basic design feature of human language. Although established sign languages, like spoken languages, have been shown to be characterized by this bifurcation, no information has been available about the way in which such structure arises. We report here on a newly emerging sign language, Al-Sayyid Bedouin Sign Language, which functions as a full language but in which a phonological level of structure has not yet emerged. Early indications of formal regularities provide clues to the way in which phonological structure may develop over time. PMID:22223927

  4. The Signs B [Image Omitted] and B-Bent [Image Omitted] in Israeli Sign Language According to the Theory of Phonology as Human Behavior

    ERIC Educational Resources Information Center

    Fuks, Orit; Tobin, Yishai

    2008-01-01

    The purpose of the present research is to examine which of the two factors: (1) the iconic-semiotic factor; or (2) the human-phonetic factor is more relevant in explaining the appearance and distribution of the hand shape B-bent in Israeli Sign Language (ISL). The B-bent shape has been the subject of much attention in sign language research…

  5. [Anesthesia simulators and training devices].

    PubMed

    Hartmannsgruber, M; Good, M; Carovano, R; Lampotang, S; Gravenstein, J S

    1993-07-01

    Simulators and training devices are used extensively by educators in 'high-tech' occupations, especially those requiring an understanding of complex systems and co-ordinated psychomotor skills. Because of advances in computer technology, anaesthetised patients can now be realistically simulated. This paper describes several training devices and a simulator currently being employed in the training of anaesthesia personnel at the University of Florida. This Gainesville Anesthesia Simulator (GAS) comprises a patient mannequin, anaesthesia gas machine, and a full set of normally operating monitoring instruments. The patient can spontaneously breathe, has audible heart and breath sounds, and palpable pulses. The mannequin contains a sophisticated lung model that consumes and eliminates gas according to physiological principles. Interconnected computers controlling the physical signs of the mannequin enable the presentation of a multitude of clinical signs. In addition, the anaesthesia machine, which is functionally intact, has hidden fault activators to challenge the user to correct equipment malfunctions. Concealed sensors monitor the users' actions and responses. A robust data acquisition and control system and a user-friendly scripting language for programming simulation scenarios are key features of GAS and make this system applicable for the training of both the beginning resident and the experienced practitioner. GAS enhances clinical education in anaesthesia by providing a non-threatening environment that fosters learning by doing. Exercises with the simulator are supported by sessions on a number of training devices. These present theoretical and practical interactive courses on the anaesthesia machine and on monitors. An extensive system, for example, introduces the student to the physics and clinical application of transoesophageal echocardiography.(ABSTRACT TRUNCATED AT 250 WORDS)

  6. Visual sign phonology: insights into human reading and language from a natural soundless phonology.

    PubMed

    Petitto, L A; Langdon, C; Stone, A; Andriola, D; Kartheiser, G; Cochran, C

    2016-11-01

    Among the most prevailing assumptions in science and society about the human reading process is that sound and sound-based phonology are critical to young readers. The child's sound-to-letter decoding is viewed as universal and vital to deriving meaning from print. We offer a different view. The crucial link for early reading success is not between segmental sounds and print. Instead the human brain's capacity to segment, categorize, and discern linguistic patterning makes possible the capacity to segment all languages. This biological process includes the segmentation of languages on the hands in signed languages. Exposure to natural sign language in early life equally affords the child's discovery of silent segmental units in visual sign phonology (VSP) that can also facilitate segmental decoding of print. We consider powerful biological evidence about the brain, how it builds sound and sign phonology, and why sound and sign phonology are equally important in language learning and reading. We offer a testable theoretical account, reading model, and predictions about how VSP can facilitate segmentation and mapping between print and meaning. We explain how VSP can be a powerful facilitator of all children's reading success (deaf and hearing)-an account with profound transformative impact on learning to read in deaf children with different language backgrounds. The existence of VSP has important implications for understanding core properties of all human language and reading, challenges assumptions about language and reading as being tied to sound, and provides novel insight into a remarkable biological equivalence in signed and spoken languages. WIREs Cogn Sci 2016, 7:366-381. doi: 10.1002/wcs.1404 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  7. Bean Soup Translation: Flexible, Linguistically-Motivated Syntax for Machine Translation

    ERIC Educational Resources Information Center

    Mehay, Dennis Nolan

    2012-01-01

    Machine translation (MT) systems attempt to translate texts from one language into another by translating words from a "source language" and rearranging them into fluent utterances in a "target language." When the two languages organize concepts in very different ways, knowledge of their general sentence structure, or…

  8. Choosing Accommodations: Signed Language Interpreting and the Absence of Choice.

    PubMed

    Burke, Teresa Blankmeyer

    This paper carves out a topic space for discussion about the ethical question of whether input from signing Deaf consumers of interpreting services ought to be included in the provision of signed language interpreter accommodations. The first section provides background about disability accommodations and practices, including how signed language interpreting accommodations are similar and dissimilar to other kinds of disability accommodations. In the second section, I offer a personal narrative of my experience as a Deaf academic who has been excluded from the interpreter selection process, highlighting some of the harmful consequences of such exclusion. In the subsequent two sections, I describe and analyze the process of choosing interpreter accommodations, starting with the process of requesting signed language interpreters and the institutionalization of this process, followed by a brief overview of privacy and autonomy concerns from the standpoint of the signing Deaf consumer. The penultimate section considers some objections to the proposal of involving more consumer choice in signed language accommodations. I conclude the paper with some concrete suggestions for a more Deaf-centered, inclusive process for choosing interpreter accommodations.

  9. Learning an Embodied Visual Language: Four Imitation Strategies Available to Sign Learners

    PubMed Central

    Shield, Aaron; Meier, Richard P.

    2018-01-01

    The parts of the body that are used to produce and perceive signed languages (the hands, face, and visual system) differ from those used to produce and perceive spoken languages (the vocal tract and auditory system). In this paper we address two factors that have important consequences for sign language acquisition. First, there are three types of lexical signs: one-handed, two-handed symmetrical, and two-handed asymmetrical. Natural variation in hand dominance in the population leads to varied input to children learning sign. Children must learn that signs are not specified for the right or left hand but for dominant and non-dominant. Second, we posit that children have at least four imitation strategies available for imitating signs: anatomical (Activate the same muscles as the sign model), which could lead learners to inappropriately use their non-dominant hand; mirroring (Produce a mirror image of the modeled sign), which could lead learners to produce lateral movement reversal errors or to use the non-dominant hand; visual matching (Reproduce what you see from your perspective), which could lead learners to produce inward–outward movement and palm orientation reversals; and reversing (Reproduce what the sign model would see from his/her perspective). This last strategy is the only one that always yields correct phonological forms in signed languages. To test our hypotheses, we turn to evidence from typical and atypical hearing and deaf children as well as from typical adults; the data come from studies of both sign acquisition and gesture imitation. Specifically, we posit that all children initially use a visual matching strategy but typical children switch to a mirroring strategy sometime in the second year of life; typical adults tend to use a mirroring strategy in learning signs and imitating gestures. By contrast, children and adults with autism spectrum disorder (ASD) appear to use the visual matching strategy well into childhood or even adulthood. 
Finally, we present evidence that sign language exposure changes how adults imitate gestures, switching from a mirroring strategy to the correct reversal strategy. These four strategies for imitation do not exist in speech and as such constitute a unique problem for research in language acquisition. PMID:29899716

  10. Three-dimensional grammar in the brain: Dissociating the neural correlates of natural sign language and manually coded spoken language.

    PubMed

    Jednoróg, Katarzyna; Bola, Łukasz; Mostowski, Piotr; Szwed, Marcin; Boguszewski, Paweł M; Marchewka, Artur; Rutkowski, Paweł

    2015-05-01

In several countries natural sign languages were considered inadequate for education. Instead, new sign-supported systems were created, based on the belief that spoken/written language is grammatically superior. One such system, called SJM (system językowo-migowy), preserves the grammatical and lexical structure of spoken Polish and has been extensively employed in schools and on TV since the 1960s. Nevertheless, the Deaf community avoids using SJM for everyday communication, its preferred language being PJM (polski język migowy), a natural sign language that is structurally and grammatically independent of spoken Polish and features classifier constructions (CCs). Here, for the first time, we use fMRI to compare the neural bases of natural vs. devised communication systems. Deaf signers were presented with three types of signed sentences (SJM and PJM with/without CCs). Consistent with previous findings, PJM with CCs, compared to either SJM or PJM without CCs, recruited the parietal lobes. The reverse comparison revealed activation in the anterior temporal lobes, suggesting increased semantic combinatory processes in lexical sign comprehension. Finally, PJM compared with SJM engaged the left posterior superior temporal gyrus and anterior temporal lobe, areas crucial for sentence-level speech comprehension. We suggest that activity in these two areas reflects greater processing efficiency for naturally evolved sign language. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Workplace Concepts in Sign and Text. A Computerized Sign Language Dictionary.

    ERIC Educational Resources Information Center

    Western Pennsylvania School for the Deaf, Pittsburgh.

    This document is a dictionary of essential vocabulary, signs, and illustrations of workplace activities to be used to train deaf or hearing-impaired adults. It contains more than 500 entries with workplace-relevant vocabulary, each including an illustration of the signed word or phrase in American Sign Language, a description of how to make the…

  12. The Verbal System of Catalan Sign Language (LSC)

    ERIC Educational Resources Information Center

    Morales-Lopez, Esperanza; Boldu-Menasanch, Rosa Maria; Alonso-Rodriguez, Jesus Amador; Gras-Ferrer, Victoria; Rodriguez-Gonzalez, Maria Angeles

    2005-01-01

    This article describes the predicative verbal system of Catalan Sign Language (LSC) as it is used by Deaf people in the province of Barcelona. We also present a historical perspective of the research on this topic, which provides insight into the changes that have taken place over the last few decades in sign language linguistics. The principal…

  13. Language and Literacy Acquisition through Parental Mediation in American Sign Language

    ERIC Educational Resources Information Center

    Bailes, Cynthia Neese; Erting, Lynne C.; Thumann-Prezioso, Carlene; Erting, Carol J.

    2009-01-01

    This longitudinal case study examined the language and literacy acquisition of a Deaf child as mediated by her signing Deaf parents during her first three years of life. Results indicate that the parents' interactions with their child were guided by linguistic and cultural knowledge that produced an intuitive use of child-directed signing (CDSi)…

  14. Identifying Movement Patterns and Severity of Associated Pain in Sign Language Interpreters

    ERIC Educational Resources Information Center

    Freeman, Julie K.; Rogers, Janet L.

    2010-01-01

    Our research sought to identify the most common movement patterns and postures performed by sign language interpreters and the frequency and severity of any pain that may be associated with the movements. A survey was developed and mailed to registered sign language interpreters throughout the state of Illinois. For each specific upper extremity…

  15. Historical Development of Hong Kong Sign Language

    ERIC Educational Resources Information Center

    Sze, Felix; Lo, Connie; Lo, Lisa; Chu, Kenny

    2013-01-01

    This article traces the origins of Hong Kong Sign Language (hereafter HKSL) and its subsequent development in relation to the establishment of Deaf education in Hong Kong after World War II. We begin with a detailed description of the history of Deaf education with a particular focus on the role of sign language in such development. We then…

  16. The Effect of New Technologies on Sign Language Research

    ERIC Educational Resources Information Center

    Lucas, Ceil; Mirus, Gene; Palmer, Jeffrey Levi; Roessler, Nicholas James; Frost, Adam

    2013-01-01

    This paper first reviews the fairly established ways of collecting sign language data. It then discusses the new technologies available and their impact on sign language research, both in terms of how data is collected and what new kinds of data are emerging as a result of technology. New data collection methods and new kinds of data are…

  17. Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability

    ERIC Educational Resources Information Center

    Lutalo-Kiingi, Sam; De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability" (Lutalo-Kiingi and De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck and P. V. Paul (Eds.) 2016). In this…

  18. Meemul Tziij: An Indigenous Sign Language Complex of Mesoamerica

    ERIC Educational Resources Information Center

    Tree, Erich Fox

    2009-01-01

    This article examines sign languages that belong to a complex of indigenous sign languages in Mesoamerica that K'iche'an Maya people of Guatemala refer to collectively as Meemul Tziij. It explains the relationship between the Meemul Tziij variety of the Yukatek Maya village of Chican (state of Yucatan, Mexico) and the hitherto undescribed Meemul…

  19. The Birth and Rebirth of "Sign Language Studies"

    ERIC Educational Resources Information Center

    Armstrong, David F.

    2012-01-01

    As most readers of this journal are aware, "Sign Language Studies" ("SLS") served for many years as effectively the only serious scholarly outlet for work in the nascent field of sign language linguistics. Now reaching its 40th anniversary, the journal was founded by William C. Stokoe and then edited by him for the first quarter century of its…

  20. Sign Language Use and the Appreciation of Diversity in Hearing Classrooms

    ERIC Educational Resources Information Center

    Brereton, Amy

    2008-01-01

    This article is the result of a year-long study into the effects of sign language use on participation in one mainstream preschool setting. Observations and interviews were the primary data-collection tools used during this investigation. This article focuses on how the use of sign language in the classroom affected the learning community's…

  1. American Sign Language Comprehension Test: A Tool for Sign Language Researchers

    ERIC Educational Resources Information Center

    Hauser, Peter C.; Paludneviciene, Raylene; Riddle, Wanda; Kurz, Kim B.; Emmorey, Karen; Contreras, Jessica

    2016-01-01

    The American Sign Language Comprehension Test (ASL-CT) is a 30-item multiple-choice test that measures ASL receptive skills and is administered through a website. This article describes the development and psychometric properties of the test based on a sample of 80 college students including deaf native signers, hearing native signers, deaf…

  2. Languages Are More than Words: Spanish and American Sign Language in Early Childhood Settings

    ERIC Educational Resources Information Center

    Sherman, Judy; Torres-Crespo, Marisel N.

    2015-01-01

    Capitalizing on preschoolers' inherent enthusiasm and capacity for learning, the authors developed and implemented a dual-language program to enable young children to experience diversity and multiculturalism by learning two new languages: Spanish and American Sign Language. Details of the curriculum, findings, and strategies are shared.

  3. Cross-Linguistic Differences in the Neural Representation of Human Language: Evidence from Users of Signed Languages

    PubMed Central

    Corina, David P.; Lawyer, Laurel A.; Cates, Deborah

    2013-01-01

    Studies of deaf individuals who are users of signed languages have provided profound insight into the neural representation of human language. Case studies of deaf signers who have incurred left- and right-hemisphere damage have shown that left-hemisphere resources are a necessary component of sign language processing. These data suggest that, despite frank differences in the input and output modality of language, core left perisylvian regions universally serve linguistic function. Neuroimaging studies of deaf signers have generally provided support for this claim. However, more fine-tuned studies of linguistic processing in deaf signers are beginning to show evidence of important differences in the representation of signed and spoken languages. In this paper, we provide a critical review of this literature and present compelling evidence for language-specific cortical representations in deaf signers. These data lend support to the claim that the neural representation of language may show substantive cross-linguistic differences. We discuss the theoretical implications of these findings with respect to an emerging understanding of the neurobiology of language. PMID:23293624

  4. Ada (Trade Name) Compiler Validation Summary Report: International Business Machines Corporation. IBM Development System for the Ada Language System, Version 1.1.0, IBM 4381 under MVS.

    DTIC Science & Technology

    1988-05-22

Ada Compiler Validation Summary Report: International Business Machines Corporation, IBM Development System for the Ada Language System, Version 1.1.0, IBM 4381 under MVS; report period 22 May 1987 to 22 May 1988; Wright-Patterson AFB.

  5. Ada Compiler Validation Summary Report: Certificate Number: 890420W1. 10066 International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, IBM RT PC 6150-125

    DTIC Science & Technology

    1989-04-20

Ada Compiler Validation Summary Report, Certificate Number 890420W1.10066: International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, IBM RT PC 6150-125; Wright-Patterson AFB. The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation…

  6. Evaluating Effects of Language Recognition on Language Rights and the Vitality of New Zealand Sign Language

    ERIC Educational Resources Information Center

    McKee, Rachel Locker; Manning, Victoria

    2015-01-01

    Status planning through legislation made New Zealand Sign Language (NZSL) an official language in 2006. But this strong symbolic action did not create resources or mechanisms to further the aims of the act. In this article we discuss the extent to which legal recognition and ensuing language-planning activities by state and community have affected…

  7. Learning To See: American Sign Language as a Second Language. Language in Education: Theory and Practice 76.

    ERIC Educational Resources Information Center

    Wilcox, Sherman; Wilcox, Phyllis

    During the last decade, the study of American Sign Language (ASL) as a second language has become enormously popular. More and more schools and universities recognize the important role that ASL can play in foreign language education. This monograph provides a comprehensive introduction to the history and structure of ASL, to the Deaf community…

  8. Deaf Education Policy as Language Policy: A Comparative Analysis of Sweden and the United States

    ERIC Educational Resources Information Center

    Hult, Francis M.; Compton, Sarah E.

    2012-01-01

    The role of languages is a central issue in deaf education. The function of sign languages in education and deaf students' opportunities to develop linguistic abilities in both sign languages and the dominant language(s) of a society are key considerations (Hogan-Brun 2009; Reagan 2010, 53; Swanwick 2010a). Accordingly, what Kaplan and Baldauf…

  9. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    PubMed

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  10. "Thinking-for-Writing": A Prolegomenon on Writing Signed Languages.

    PubMed

    Rosen, Russell S; Hartman, Maria C; Wang, Ye

    2017-01-01

    In his article in this American Annals of the Deaf special issue that also includes the present article, Grushkin argues that the writing difficulties of many deaf and hard of hearing children result primarily from the orthographic nature of the writing system; he proposes a new system based on features found in signed languages. In response, the present authors review the literature on D/HH children's writing difficulties, outline the main precepts of and assumptions about writing signed languages, discuss "thinking-for-writing" as a process in developing writing skills, offer research designs to test the effectiveness of writing signed language systems, and provide strategies for adopting "thinking-for-writing" in education. They conclude that until empirical studies show that writing signed languages effectively reflects writers' "thinking-for-writing," the alphabetic orthographic system of English should still be used, and ways should be found to teach D/HH children to use English writing to express their thoughts.

  11. Students who are deaf and hard of hearing and use sign language: considerations and strategies for developing spoken language and literacy skills.

    PubMed

    Nussbaum, Debra; Waddy-Smith, Bettie; Doyle, Jane

    2012-11-01

    There is a core body of knowledge, experience, and skills integral to facilitating auditory, speech, and spoken language development when working with the general population of students who are deaf and hard of hearing. There are additional issues, strategies, and challenges inherent in speech habilitation/rehabilitation practices essential to the population of deaf and hard of hearing students who also use sign language. This article will highlight philosophical and practical considerations related to practices used to facilitate spoken language development and associated literacy skills for children and adolescents who sign. It will discuss considerations for planning and implementing practices that acknowledge and utilize a student's abilities in sign language, and address how to link these skills to developing and using spoken language. Included will be considerations for children from early childhood through high school with a broad range of auditory access, language, and communication characteristics.

  12. The psychotherapist and the sign language interpreter.

    PubMed

    de Bruin, Ed; Brugmans, Petra

    2006-01-01

    Specialized psychotherapy for deaf people in the Dutch and Western European mental health systems is still a rather young specialism. A key policy principle in Dutch mental health care for the deaf is that they should receive treatment in the language most accessible to them, which is usually Dutch Sign Language (Nederlandse Gebarentaal or NGT). Although psychotherapists for the deaf are trained to use sign language, situations will always arise in which a sign language interpreter is needed. Most psychotherapists hold the opinion that working with a sign language interpreter in therapy sessions can be a valuable alternative, but also see it as a second-best solution because of its impact on the therapeutic process. This paper describes our years of collaboration as a therapist and a sign language interpreter. When this collaboration is optimal, it can generate a certain "therapeutic power" in the therapy sessions. Achieving this depends largely on the interplay between the therapist and the interpreter, which in our case is the result of literature research and our experiences during the last 17 years. We analyze this special collaborative relationship, which has several dimensions and recurrent themes, such as the interpreter's conception of his or her role, situational interpreting, organizing the interpretation setting, and managing therapeutic phenomena during therapy sessions.

  13. Sign Vocabulary in Deaf Toddlers Exposed to Sign Language since Birth

    ERIC Educational Resources Information Center

    Rinaldi, Pasquale; Caselli, Maria Cristina; Di Renzo, Alessio; Gulli, Tiziana; Volterra, Virginia

    2014-01-01

    Lexical comprehension and production are directly evaluated for the first time in deaf signing children below the age of 3 years. A Picture Naming Task was administered to 8 deaf signing toddlers (aged 2-3 years) who were exposed to sign language from birth. Results were compared with data of hearing speaking controls. In both deaf and hearing…

  14. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices

    PubMed Central

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2015-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions. PMID:25713541

  15. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices.

    PubMed

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2014-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions.

  16. A dictionary of Astronomy for the French Sign Language (LSF)

    NASA Astrophysics Data System (ADS)

    Proust, Dominique; Abbou, Daniel; Chab, Nasro

    2011-06-01

    For some years now, the French deaf community has had access to astronomy at the Paris-Meudon observatory through specific teaching adapted to French Sign Language (Langue des Signes Française, LSF), including direct observations with the observatory's telescopes. From this experience, an encyclopedic dictionary of astronomy, The Hands in the Stars, is now available, containing more than 200 astronomical concepts. Many of them did not previously exist in sign language and can now be fully expressed and explained.

  17. Articulatory Suppression Effects on Short-Term Memory of Signed Digits and Lexical Items in Hearing Bimodal-Bilingual Adults

    ERIC Educational Resources Information Center

    Liu, Hsiu Tan; Squires, Bonita; Liu, Chun Jung

    2016-01-01

    We can gain a better understanding of short-term memory processes by studying different language codes and modalities. Three experiments were conducted to investigate: (a) Taiwanese Sign Language (TSL) digit spans in Chinese/TSL hearing bilinguals (n = 32); (b) American Sign Language (ASL) digit spans in English/ASL hearing bilinguals (n = 15);…

  18. The Link between Form and Meaning in British Sign Language: Effects of Iconicity for Phonological Decisions

    ERIC Educational Resources Information Center

    Thompson, Robin L.; Vinson, David P.; Vigliocco, Gabriella

    2010-01-01

    Signed languages exploit the visual/gestural modality to create iconic expression across a wide range of basic conceptual structures in which the phonetic resources of the language are built up into an analogue of a mental image (Taub, 2001). Previously, we demonstrated a processing advantage when iconic properties of signs were made salient in a…

  19. Evidence for Website Claims about the Benefits of Teaching Sign Language to Infants and Toddlers with Normal Hearing

    ERIC Educational Resources Information Center

    Nelson, Lauri H.; White, Karl R.; Grewe, Jennifer

    2012-01-01

    The development of proficient communication skills in infants and toddlers is an important component to child development. A popular trend gaining national media attention is teaching sign language to babies with normal hearing whose parents also have normal hearing. Thirty-three websites were identified that advocate sign language for hearing…

  20. The Effect of Sign Language Rehearsal on Deaf Subjects' Immediate and Delayed Recall of English Word Lists.

    ERIC Educational Resources Information Center

    Bonvillian, John D.; And Others

    1987-01-01

    The relationship between sign language rehearsal and written free recall was examined by having deaf college students rehearse the sign language equivalents of printed English words. Studies of both immediate and delayed memory suggested that word recall increased as a function of total rehearsal frequency and frequency of appearance in rehearsal…

  1. Constructing an Online Test Framework, Using the Example of a Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias; Herman, Rosalind; Woll, Bencie

    2015-01-01

    This paper presents the features of an online test framework for a receptive skills test that has been adapted, based on a British template, into different sign languages. The online test includes features that meet the needs of the different sign language versions. Features such as usability of the test, automatic saving of scores, and score…

  2. Gesture in Multiparty Interaction: A Study of Embodied Discourse in Spoken English and American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily P.

    2013-01-01

    This dissertation is an examination of gesture in two game nights: one in spoken English between four hearing friends and another in American Sign Language between four Deaf friends. Analyses of gesture have shown there exists a complex integration of manual gestures with speech. Analyses of sign language have implicated the body as a medium…

  3. The British Sign Language Variant of Stokoe Notation: Report on a Type-Design Project.

    ERIC Educational Resources Information Center

    Thoutenhoofd, Ernst

    2003-01-01

    Explores the outcome of a publicly-funded research project titled "Redesign of the British Sign Language (BSL) Notation System with a New Font for Use in ICT." The aim of the project was to redesign the British Sign Language variant of Stokoe notation for practical use in information technology systems and software, such as lexical…

  4. Deaf Students' Receptive and Expressive American Sign Language Skills: Comparisons and Relations

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.

    2014-01-01

    This article presents receptive and expressive American Sign Language skills of 85 students, 6 through 22 years of age at a residential school for the deaf using the American Sign Language Receptive Skills Test and the Ozcaliskan Motion Stimuli. Results are presented by ages and indicate that students' receptive skills increased with age and…

  5. The physiognomic unity of sign, word, and gesture.

    PubMed

    Cornejo, Carlos; Musa, Roberto

    2017-01-01

    Goldin-Meadow & Brentari (G-M&B) are implicitly going against the dominant paradigm in language research, namely, the "speech as written language" metaphor that portrays vocal sounds and bodily signs as means of delivering stable word meanings. We argue that Heinz Werner's classical research on the physiognomic properties of language supports and complements their view of sign and gesture as a unified system.

  6. Bilingual Word Recognition in Deaf and Hearing Signers: Effects of Proficiency and Language Dominance on Cross-Language Activation

    ERIC Educational Resources Information Center

    Morford, Jill P.; Kroll, Judith F.; Piñar, Pilar; Wilkinson, Erin

    2014-01-01

    Recent evidence demonstrates that American Sign Language (ASL) signs are active during print word recognition in deaf bilinguals who are highly proficient in both ASL and English. In the present study, we investigate whether signs are active during print word recognition in two groups of unbalanced bilinguals: deaf ASL-dominant and hearing…

  7. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation

    PubMed Central

    Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients. PMID:28947881

  8. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation.

    PubMed

    Rajendra Santosh, Arvind Babu; Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients.

  9. On the linguistic status of ‘agreement’ in sign languages

    PubMed Central

    LILLO-MARTIN, DIANE; MEIER, RICHARD P.

    2013-01-01

    In signed languages, the arguments of verbs can be marked by a system of verbal modification that has been termed “agreement” (more neutrally, “directionality”). Fundamental issues regarding directionality remain unresolved and the phenomenon has characteristics that call into question its analysis as agreement. We conclude that directionality marks person in American Sign Language, and the ways person marking interacts with syntactic phenomena are largely analogous to morpho-syntactic properties of familiar agreement systems. Overall, signed languages provide a crucial test for how gestural and linguistic mechanisms can jointly contribute to the satisfaction of fundamental aspects of linguistic structure. PMID:23495262

  10. Order of the major constituents in sign languages: implications for all language

    PubMed Central

    Napoli, Donna Jo; Sutton-Spence, Rachel

    2014-01-01

    A survey of reports of sign order from 42 sign languages leads to a handful of generalizations. Two accounts emerge, one amodal and the other modal. We argue that universal pressures are at work with respect to some generalizations, but that pressure from the visual modality is at work with respect to others. Together, these pressures conspire to make all sign languages order their major constituents SOV or SVO. This study leads us to the conclusion that the order of S with regard to verb phrase (VP) may be driven by sensorimotor system concerns that feed universal grammar. PMID:24860523

  11. Iconicity as a General Property of Language: Evidence from Spoken and Signed Languages

    PubMed Central

    Perniss, Pamela; Thompson, Robin L.; Vigliocco, Gabriella

    2010-01-01

    Current views about language are dominated by the idea of arbitrary connections between linguistic form and meaning. However, if we look beyond the more familiar Indo-European languages and also include both spoken and signed language modalities, we find that motivated, iconic form-meaning mappings are, in fact, pervasive in language. In this paper, we review the different types of iconic mappings that characterize languages in both modalities, including the predominantly visually iconic mappings found in signed languages. Having shown that iconic mappings are present across languages, we then proceed to review evidence showing that language users (signers and speakers) exploit iconicity in language processing and language acquisition. While not discounting the presence and importance of arbitrariness in language, we put forward the idea that iconicity needs also to be recognized as a general property of language, which may serve the function of reducing the gap between linguistic form and conceptual representation to allow the language system to “hook up” to motor, perceptual, and affective experience. PMID:21833282

  12. Input Processing at First Exposure to a Sign Language

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    There is growing interest in learners' cognitive capacities to process a second language (L2) at first exposure to the target language. Evidence suggests that L2 learners are capable of processing novel words by exploiting phonological information from their first language (L1). Hearing adult learners of a sign language, however, cannot fall back…

  13. We Need to Communicate! Helping Hearing Parents of Deaf Children Learn American Sign Language

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2011-01-01

    Language immersion from birth is crucial to a child's language development. However, language immersion can be particularly challenging for hearing parents of deaf children to provide as they may have to overcome many difficulties while learning American Sign Language (ASL). We are in the process of creating a mobile application to help hearing…

  14. A Case of Specific Language Impairment in a Deaf Signer of American Sign Language

    ERIC Educational Resources Information Center

    Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C.

    2017-01-01

    This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…

  15. Bilingual Education for Deaf Children in Sweden

    ERIC Educational Resources Information Center

    Svartholm, Kristina

    2010-01-01

    In 1981, Swedish Sign Language gained recognition by the Swedish Parliament as the language of deaf people, a decision that made Sweden the first country in the world to give a sign language the status of a language. Swedish was designated as a second language for deaf people, and the need for bilingualism among them was officially asserted. This…

  16. English to Sanskrit Machine Translation Using Transfer Based approach

    NASA Astrophysics Data System (ADS)

    Pathak, Ganesh R.; Godse, Sachin P.

    2010-11-01

    Translation is one of the needs of a global society for communicating the thoughts and ideas of one country to another. Translation is the process of interpreting the meaning of a text and subsequently producing an equivalent text that communicates the same message in another language. In this paper we give detailed information on how to convert source-language text into target-language text using the transfer-based approach to machine translation. We implemented an English-to-Sanskrit machine translator using this approach. English is a global language used for business and communication, but a large part of the population of India does not use or understand English. Sanskrit is an ancient language of India, and most of the languages of India are derived from it. Sanskrit can therefore act as an intermediate language for multilingual translation.
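
    The transfer-based pipeline the abstract describes has three stages: analyze the source sentence, transfer its structure and words into the target language, and generate the output. The toy lexicon, transliterated Sanskrit forms, and SVO-to-SOV reordering rule below are illustrative assumptions for a minimal sketch, not the authors' actual implementation.

```python
# Minimal sketch of a transfer-based MT pipeline:
# analysis -> transfer -> generation.

# Hypothetical English -> Sanskrit lexicon (transliterated forms).
LEXICON = {"rama": "ramah", "reads": "pathati", "book": "pustakam"}

def analyze(sentence):
    """Analysis: tokenize a simple three-word SVO clause."""
    subj, verb, obj = sentence.lower().split()
    return {"S": subj, "V": verb, "O": obj}

def transfer(parse):
    """Transfer: map English SVO structure to Sanskrit SOV order
    and replace each word via the bilingual lexicon."""
    return [LEXICON[parse["S"]], LEXICON[parse["O"]], LEXICON[parse["V"]]]

def generate(words):
    """Generation: linearize the target-language words."""
    return " ".join(words)

print(generate(transfer(analyze("Rama reads book"))))
# -> ramah pustakam pathati
```

    A real transfer system would replace the word-lookup stage with morphological analysis and the reordering rule with structural transfer over full parse trees; the staged design stays the same.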

  17. Functional changes in people with different hearing status and experiences of using Chinese sign language: an fMRI study.

    PubMed

    Li, Qiang; Xia, Shuang; Zhao, Fei; Qi, Ji

    2014-01-01

    The purpose of this study was to assess functional changes in the cerebral cortex in people with different sign language experience and hearing status whilst observing and imitating Chinese Sign Language (CSL) using functional magnetic resonance imaging (fMRI). 50 participants took part in the study, and were divided into four groups according to their hearing status and experience of using sign language: prelingual deafness signer group (PDS), normal hearing non-signer group (HnS), native signer group with normal hearing (HNS), and acquired signer group with normal hearing (HLS). fMRI images were scanned from all subjects when they performed block-designed tasks that involved observing and imitating sign language stimuli. Nine activation areas were found in response to undertaking either observation or imitation CSL tasks and three activated areas were found only when undertaking the imitation task. Of those, the PDS group had significantly greater activation areas in terms of the cluster size of the activated voxels in the bilateral superior parietal lobule, cuneate lobe and lingual gyrus in response to undertaking either the observation or the imitation CSL task than the HnS, HNS and HLS groups. The PDS group also showed significantly greater activation in the bilateral inferior frontal gyrus which was also found in the HNS or the HLS groups but not in the HnS group. This indicates that deaf signers have better sign language proficiency, because they engage more actively with the phonetic and semantic elements. In addition, the activations of the bilateral superior temporal gyrus and inferior parietal lobule were only found in the PDS group and HNS group, and not in the other two groups, which indicates that the area for sign language processing appears to be sensitive to the age of language acquisition. After reading this article, readers will be able to: discuss the relationship between sign language and its neural mechanisms. 

  18. 28 CFR 55.19 - Written materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) IMPLEMENTATION OF THE PROVISIONS OF THE VOTING RIGHTS ACT REGARDING LANGUAGE MINORITY GROUPS Minority Language Materials and Assistance § 55.19 Written... will be lost if a separate minority language ballot or voting machine is used. (d) Voting machines...

  19. 28 CFR 55.19 - Written materials.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE (CONTINUED) IMPLEMENTATION OF THE PROVISIONS OF THE VOTING RIGHTS ACT REGARDING LANGUAGE MINORITY GROUPS Minority Language Materials and Assistance § 55.19 Written... will be lost if a separate minority language ballot or voting machine is used. (d) Voting machines...

  20. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    PubMed Central

    Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately matched, especially when the sign contained a marked handshape. In Experiment 2, learners produced these familiar signs in addition to novel signs, which differed based on sonority and markedness. Results from a key-release reaction time reproduction task showed that learners tended to produce high sonority signs much more quickly than low sonority signs, especially when the sign contained an unmarked handshape. This effect was only present in familiar signs. Sign production accuracy rates revealed that high sonority signs were more accurate than low sonority signs. Similarly, signs with unmarked handshapes were produced more accurately than those with marked handshapes. Together, results from Experiments 1 and 2 suggested that signs that contain high sonority movements are more easily processed, both perceptually and productively, and handshape markedness plays a differential role in perception and production. PMID:26644551

  1. Static sign language recognition using 1D descriptors and neural networks

    NASA Astrophysics Data System (ADS)

    Solís, José F.; Toxqui, Carina; Padilla, Alfonso; Santiago, César

    2012-10-01

    A framework for static sign language recognition using descriptors that represent 2D images as 1D data, together with artificial neural networks, is presented in this work. The 1D descriptors were computed by two methods: the first consists of a rotational correlation operator, and the second is based on contour analysis of the hand shape. One of the main problems in sign language recognition is segmentation; most papers report a special color for gloves or the background for hand-shape analysis. In order to avoid the use of gloves or special clothing, a thermal imaging camera was used to capture images. Static signs for the digits 1 to 9 of American Sign Language were used; a multilayer perceptron reached 100% recognition with cross-validation.
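
    The core idea, reducing a segmented 2D hand image to a 1D descriptor before classification, can be sketched as follows. The radial-signature descriptor and the nearest-template classifier below are illustrative assumptions: the paper's actual descriptors (rotational correlation and contour analysis) and its multilayer-perceptron classifier are not reproduced here.

```python
import numpy as np

def radial_descriptor(mask, n_bins=36):
    """Reduce a 2D binary hand mask to a 1D descriptor: the mean distance
    of foreground pixels from the centroid within n_bins angular sectors,
    normalized by the maximum for scale invariance. (A simple stand-in
    for the paper's contour-based 1D descriptors.)"""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    ang = np.arctan2(ys - cy, xs - cx)       # angle of each pixel
    dist = np.hypot(ys - cy, xs - cx)        # distance from centroid
    bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    desc = np.zeros(n_bins)
    for b in range(n_bins):
        if np.any(bins == b):
            desc[b] = dist[bins == b].mean()
    return desc / (desc.max() + 1e-9)

def classify(desc, templates):
    """Nearest-template classifier over 1D descriptors
    (a stand-in for the multilayer perceptron in the paper)."""
    return min(templates, key=lambda k: np.linalg.norm(desc - templates[k]))

# Demo on a synthetic 32x32 hand mask.
mask = np.zeros((32, 32))
mask[8:24, 8:24] = 1
print(radial_descriptor(mask).shape)  # (36,)
```

    Because the descriptor is computed relative to the shape's centroid and normalized by its largest radius, it is invariant to translation and scale, which is what makes a simple 1D signature a workable input for a small classifier.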

  2. Cultural transmission through infant signs: Objects and actions in U.S. and Taiwan.

    PubMed

    Wang, Wen; Vallotton, Claire

    2016-08-01

    Infant signs are intentionally taught/learned symbolic gestures which can be used to represent objects, actions, requests, and mental states. Through infant signs, parents and infants begin to communicate specific concepts earlier than children's first spoken language. This study examines whether cultural differences in language are reflected in children's and parents' use of infant signs. Parents speaking East Asian languages with their children utilize verbs more often than do English-speaking mothers; and compared to their English-learning peers, Chinese children are more likely to learn verbs as they first acquire spoken words. By comparing parents' and infants' use of infant signs in the U.S. and Taiwan, we investigate cultural differences in noun/object versus verb/action bias before children's first language. Parents reported their own and their children's use of first infant signs retrospectively. Results show that cultural differences in parents' and children's infant sign use were consistent with research on early words, reflecting cultural differences in communication functions (referential versus regulatory) and child-rearing goals (independent versus interdependent). The current study provides evidence that intergenerational transmission of culture through symbols begins prior to oral language.

  3. Benefits of augmentative signs in word learning: Evidence from children who are deaf/hard of hearing and children with specific language impairment.

    PubMed

    van Berkel-van Hoof, Lian; Hermans, Daan; Knoors, Harry; Verhoeven, Ludo

    2016-12-01

    Augmentative signs may facilitate word learning in children with vocabulary difficulties, for example, children who are Deaf/Hard of Hearing (DHH) and children with Specific Language Impairment (SLI). Although augmentative signs are also thought to aid second language learning in populations with typical language development, empirical evidence in favor of this claim is lacking. We aim to investigate whether augmentative signs facilitate word learning for DHH children, children with SLI, and typically developing (TD) children. Whereas previous studies taught children new labels for familiar objects, the present study taught new labels for new objects. In our word learning experiment children were presented with pictures of imaginary creatures and pseudo words. Half of the words were accompanied by an augmentative pseudo sign. The children were tested for their receptive word knowledge. The DHH children benefitted significantly from augmentative signs, but the children with SLI and TD age-matched peers did not score significantly differently on words from either the sign or no-sign condition. These results suggest that using Sign-Supported speech in classrooms of bimodal bilingual DHH children may support their spoken language development. The difference between earlier research findings and the present results may be caused by a difference in methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Australian Aboriginal Deaf People and Aboriginal Sign Language

    ERIC Educational Resources Information Center

    Power, Des

    2013-01-01

    Many Australian Aboriginal people use a sign language ("hand talk") that mirrors their local spoken language and is used both in culturally appropriate settings when speech is taboo or counterindicated and for community communication. The characteristics of these languages are described, and early European settlers' reports of deaf…

  5. Sign Language and Pantomime Production Differentially Engage Frontal and Parietal Cortices

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Ponto, Laura L. B.; Grabowski, Thomas J.

    2011-01-01

    We investigated the functional organisation of neural systems supporting language production when the primary language articulators are also used for meaningful, but nonlinguistic, expression such as pantomime. Fourteen hearing nonsigners and 10 deaf native users of American Sign Language (ASL) participated in an H[subscript 2][superscript…

  6. Uncovering Translingual Practices in Teaching Parents Classical ASL Varieties

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2017-01-01

    The view of sign languages as bounded systems is often important for deaf community empowerment and for pedagogical practice in terms of supporting deaf children's language acquisition and second language learners' communicative competence. Conversely, the notion of translanguaging in the American Sign Language (ASL) community highlights a number…

  7. The neural correlates of highly iconic structures and topographic discourse in French Sign Language as observed in six hearing native signers.

    PubMed

    Courtin, C; Hervé, P-Y; Petit, L; Zago, L; Vigneau, M; Beaucousin, V; Jobard, G; Mazoyer, B; Mellet, E; Tzourio-Mazoyer, N

    2010-09-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language-specific manner via the use of signing space and spatial-classifier signs. We used functional magnetic resonance imaging (fMRI) to compare the neural correlates of topographic discourse and highly iconic structures in French Sign Language (LSF) in six hearing native signers, children of deaf adults (CODAs), and six LSF-naïve monolinguals. LSF materials consisted of videos of a lecture excerpt signed without spatially organized discourse or highly iconic structures (Lect LSF), a tale signed using highly iconic structures (Tale LSF), and a topographical description using a diagrammatic format and spatial-classifier signs (Topo LSF). We also presented texts in spoken French (Lect French, Tale French, Topo French) to all participants. With both languages, the Topo texts activated several different regions that are involved in mental navigation and spatial working memory. No specific correlate of LSF spatial discourse was observed. The same regions were more activated during Tale LSF than Lect LSF in CODAs, but not in monolinguals, in line with the presence of signing-space structure in both conditions. Motion processing areas and parts of the fusiform gyrus and precuneus were more active during Tale LSF in CODAs; no such effect was observed with French or in LSF-naïve monolinguals. These effects may be associated with perspective-taking and acting during personal transfers. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Early Sign Language Experience Goes along with an Increased Cross-Modal Gain for Affective Prosodic Recognition in Congenitally Deaf CI Users

    ERIC Educational Resources Information Center

    Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte

    2018-01-01

    It is yet unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history in sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and…

  9. Visual Iconicity Across Sign Languages: Large-Scale Automated Video Analysis of Iconic Articulators and Locations

    PubMed Central

    Östling, Robert; Börstell, Carl; Courtaux, Servane

    2018-01-01

    We use automatic processing of 120,000 sign videos in 31 different sign languages to show a cross-linguistic pattern for two types of iconic form–meaning relationships in the visual modality. First, we demonstrate that the degree of inherent plurality of concepts, based on individual ratings by non-signers, strongly correlates with the number of hands used in the sign forms encoding the same concepts across sign languages. Second, we show that certain concepts are iconically articulated around specific parts of the body, as predicted by the associational intuitions of non-signers. The implications of our results are both theoretical and methodological. With regard to theoretical implications, we corroborate previous research by demonstrating and quantifying, using a much larger body of material than previously available, the iconic nature of languages in the visual modality. As for the methodological implications, we show how automatic methods are, in fact, useful for performing large-scale analysis of sign language data, to a high level of accuracy, as indicated by our manual error analysis. PMID:29867684
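
    The first result is a rating–form correlation across concepts; a toy version of that computation (invented numbers, not the study's data) looks like this:

```python
import numpy as np

# Hypothetical per-concept values: non-signers' plurality ratings and the
# mean number of hands used for the concept across sign languages.
plurality_rating = np.array([1.2, 2.0, 3.5, 4.1, 5.6, 6.3])
mean_num_hands = np.array([1.0, 1.1, 1.4, 1.5, 1.8, 1.9])

# Pearson correlation via the off-diagonal of the 2x2 correlation matrix.
r = np.corrcoef(plurality_rating, mean_num_hands)[0, 1]
print(f"r = {r:.3f}")
```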

  10. Ada Compiler Validation Summary Report: Certificate Number: 880318W1.09041, International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM 4381 under VM/HPO, Host and Target

    DTIC Science & Technology

    1988-03-28

    International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM 4381 under VM/HPO (host and target). The accompanying declaration states that International Business Machines Corporation is the owner of record of the object code of the compiler.

  11. Ada Compiler Validation Summary Report: Certificate Number 890420W1.10073: International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, IBM 3083 (Host and Target)

    DTIC Science & Technology

    1989-04-20

    Certificate Number 890420W1.10073: International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, IBM 3083 (host and target); tested at Wright-Patterson AFB. The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation team, with all default option settings except for the…

  12. Ada Compiler Validation Summary Report: Certificate Number: 890420W1.10075, International Business Machines Corporation, IBM Development System for the Ada Language, CMS/MVS Ada Cross Compiler, Version 2.1.1, IBM 3083 Host and IBM 4381 Target

    DTIC Science & Technology

    1989-04-20

    Certificate Number 890420W1.10075: International Business Machines Corporation, IBM Development System for the Ada Language, CMS/MVS Ada Cross Compiler, Version 2.1.1, IBM 3083 host and IBM 4381 target; tested at Wright-Patterson AFB. The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation team, with all default…

  13. When does a system become phonological? Handshape production in gesturers, signers, and homesigners

    PubMed Central

    Coppola, Marie; Mazzoni, Laura; Goldin-Meadow, Susan

    2013-01-01

    Sign languages display remarkable crosslinguistic consistencies in the use of handshapes. In particular, handshapes used in classifier predicates display a consistent pattern in finger complexity: classifier handshapes representing objects display more finger complexity than those representing how objects are handled. Here we explore the conditions under which this morphophonological phenomenon arises. In Study 1, we ask whether hearing individuals in Italy and the United States, asked to communicate using only their hands, show the same pattern of finger complexity found in the classifier handshapes of two sign languages: Italian Sign Language (LIS) and American Sign Language (ASL). We find that they do not: gesturers display more finger complexity in handling handshapes than in object handshapes. The morphophonological pattern found in conventional sign languages is therefore not a codified version of the pattern invented by hearing individuals on the spot. In Study 2, we ask whether continued use of gesture as a primary communication system results in a pattern that is more similar to the morphophonological pattern found in conventional sign languages or to the pattern found in gesturers. Homesigners have not acquired a signed or spoken language and instead use a self-generated gesture system to communicate with their hearing family members and friends. We find that homesigners pattern more like signers than like gesturers: their finger complexity in object handshapes is higher than that of gesturers (indeed as high as signers); and their finger complexity in handling handshapes is lower than that of gesturers (but not quite as low as signers). Generally, our findings indicate two markers of the phonologization of handshape in sign languages: increasing finger complexity in object handshapes, and decreasing finger complexity in handling handshapes. 
These first indicators of phonology appear to be present in individuals developing a gesture system without benefit of a linguistic community. Finally, we propose that iconicity, morphology and phonology each play an important role in the system of sign language classifiers to create the earliest markers of phonology at the morphophonological interface. PMID:23723534

  14. Syntactic priming in American Sign Language.

    PubMed

    Hall, Matthew L; Ferreira, Victor S; Mayberry, Rachel I

    2015-01-01

    Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL). Experiment 1 shows that second language (L2) signers with normal hearing exhibit syntactic priming in ASL and that priming is stronger when the head noun is repeated between prime and target (the lexical boost effect). Experiment 2 shows that syntactic priming is equally strong among deaf native L1 signers, deaf late L1 learners, and hearing L2 signers. Experiment 2 also tested for, but did not find evidence of, phonological or semantic boosts to syntactic priming in ASL. These results show that despite the profound differences between spoken and signed languages in terms of how they are produced and perceived, the psychological representation of sentence structure (as assessed by syntactic priming) operates similarly in sign and speech.

  15. Space languages

    NASA Technical Reports Server (NTRS)

    Hays, Dan

    1987-01-01

    Applications of linguistic principles to potential problems of human and machine communication in space settings are discussed. Variations in language among speakers of different backgrounds and change in language forms resulting from new experiences or reduced contact with other groups need to be considered in the design of intelligent machine systems.

  16. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.
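
    The core idea, mapping the elements of a known-shape array onto processing elements and evaluating the data-parallel body independently per element, can be sketched in Python (the cyclic layout and body function are invented for illustration; the actual compiler targets Sisal source and Paris output):

```python
import numpy as np

n, P = 12, 4                      # array size and number of processing elements
elements = np.arange(n)
processor = elements % P          # a cyclic layout, fixed at "compile time"
values = elements * elements + 1  # data-parallel body f(i) = i*i + 1, evaluated for all i

# Each processing element owns a disjoint subset of the result.
for p in range(P):
    owned = elements[processor == p]
    print(f"PE {p}: elements {owned.tolist()} -> {values[owned].tolist()}")
```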

  17. "We Communicated That Way for a Reason": Language Practices and Language Ideologies among Hearing Adults Whose Parents Are Deaf

    ERIC Educational Resources Information Center

    Pizer, Ginger; Walters, Keith; Meier, Richard P.

    2013-01-01

    Families with deaf parents and hearing children are often bilingual and bimodal, with both a spoken language and a signed one in regular use among family members. When interviewed, 13 American hearing adults with deaf parents reported widely varying language practices, sign language abilities, and social affiliations with Deaf and Hearing…

  18. Psychometric properties of a sign language version of the Mini International Neuropsychiatric Interview (MINI).

    PubMed

    Øhre, Beate; Saltnes, Hege; von Tetzchner, Stephen; Falkum, Erik

    2014-05-22

    There is a need for psychiatric assessment instruments that enable reliable diagnoses in persons with hearing loss who have sign language as their primary language. The objective of this study was to assess the validity of the Norwegian Sign Language (NSL) version of the Mini International Neuropsychiatric Interview (MINI). The MINI was translated into NSL. Forty-one signing patients consecutively referred to two specialised psychiatric units were assessed with a diagnostic interview by clinical experts and with the MINI. Inter-rater reliability was assessed with Cohen's kappa and "observed agreement". There was 65% agreement between MINI diagnoses and clinical expert diagnoses. Kappa values indicated fair to moderate agreement, and observed agreement was above 76% for all diagnoses. The MINI diagnosed more co-morbid conditions than did the clinical expert interview (mean diagnoses: 1.9 versus 1.2). Kappa values indicated moderate to substantial agreement, and "observed agreement" was above 88%. The NSL version performs similarly to other MINI versions and demonstrates adequate reliability and validity as a diagnostic instrument for assessing mental disorders in persons who have sign language as their primary and preferred language.
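
    Cohen's kappa, the agreement statistic used in this validation study, corrects observed agreement for the agreement expected by chance; a minimal implementation with invented toy ratings:

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected inter-rater agreement: (po - pe) / (1 - pe)."""
    a, b = np.asarray(a), np.asarray(b)
    labels = np.union1d(a, b)
    po = np.mean(a == b)                                         # observed agreement
    pe = sum(np.mean(a == l) * np.mean(b == l) for l in labels)  # chance agreement
    return (po - pe) / (1 - pe)

# Toy ratings: MINI vs. clinical expert, 1 = diagnosis present (invented data).
mini = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
expert = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]
print(round(cohens_kappa(mini, expert), 2))  # 0.62, despite 80% observed agreement
```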

  19. The Beneficial Role of L1 Spoken Language Skills on Initial L2 Sign Language Learning: Cognitive and Linguistic Predictors of M2L2 Acquisition

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Darcy, Isabelle; Newman, Sharlene D.

    2017-01-01

    Understanding how language modality (i.e., signed vs. spoken) affects second language outcomes in hearing adults is important both theoretically and pedagogically, as it can determine the specificity of second language (L2) theory and inform how best to teach a language that uses a new modality. The present study investigated which…

  20. Combining Machine Learning and Natural Language Processing to Assess Literary Text Comprehension

    ERIC Educational Resources Information Center

    Balyan, Renu; McCarthy, Kathryn S.; McNamara, Danielle S.

    2017-01-01

    This study examined how machine learning and natural language processing (NLP) techniques can be leveraged to assess the interpretive behavior that is required for successful literary text comprehension. We compared the accuracy of seven different machine learning classification algorithms in predicting human ratings of student essays about…
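
    The abstract is truncated above, but the general protocol, training several classifiers on NLP-derived feature vectors and comparing their accuracy against human ratings, can be sketched with two simple stand-in classifiers on synthetic data (all features and numbers invented; the study's seven algorithms are not specified here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for NLP feature vectors of essays with binary human ratings.
X = np.vstack([rng.normal(0.0, 1.0, (40, 5)),   # rating-0 essays
               rng.normal(1.0, 1.0, (40, 5))])  # rating-1 essays
y = np.array([0] * 40 + [1] * 40)

def nearest_centroid(Xtr, ytr, Xte):
    cents = np.stack([Xtr[ytr == c].mean(axis=0) for c in (0, 1)])
    return ((Xte[:, None, :] - cents[None]) ** 2).sum(-1).argmin(axis=1)

def one_nn(Xtr, ytr, Xte):
    return ytr[((Xte[:, None, :] - Xtr[None]) ** 2).sum(-1).argmin(axis=1)]

# Holdout comparison: first 30 essays of each rating train, last 10 test.
tr, te = np.r_[0:30, 40:70], np.r_[30:40, 70:80]
for name, clf in [("nearest centroid", nearest_centroid), ("1-NN", one_nn)]:
    acc = np.mean(clf(X[tr], y[tr], X[te]) == y[te])
    print(f"{name}: accuracy {acc:.2f}")
```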

  1. Cerebral organization of oral and signed language responses: case study evidence from amytal and cortical stimulation studies.

    PubMed

    Mateer, C A; Rapport, R L; Kettrick, C

    1984-01-01

    A normally hearing left-handed patient familiar with American Sign Language (ASL) was assessed under sodium amytal conditions and with left cortical stimulation in both oral speech and signed English. Lateralization was mixed but complementary in each language mode: the right hemisphere perfusion severely disrupted motoric aspects of both types of language expression, the left hemisphere perfusion specifically disrupted features of grammatical and semantic usage in each mode of expression. Both semantic and syntactic aspects of oral and signed responses were altered during left posterior temporal-parietal stimulation. Findings are discussed in terms of the neurological organization of ASL and linguistic organization in cases of early left hemisphere damage.

  2. Hierarchically Structured Non-Intrusive Sign Language Recognition. Chapter 2

    NASA Technical Reports Server (NTRS)

    Zieren, Jorg; Kraiss, Karl-Friedrich

    2007-01-01

    This work presents a hierarchically structured approach to the nonintrusive recognition of sign language from a monocular frontal view. Robustness is achieved through sophisticated localization and tracking methods, including a combined EM/CAMSHIFT overlap resolution procedure and the parallel pursuit of multiple hypotheses about the hands' position and movement. This allows the handling of ambiguities and automatically corrects tracking errors. High-level knowledge is represented by a biomechanical skeleton model and dynamic motion prediction using Kalman filters. Classification is performed by Hidden Markov Models. 152 signs from German Sign Language were recognized with an accuracy of 97.6%.
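
    The motion-prediction component mentioned here is a standard Kalman filter; a minimal constant-velocity sketch for one hand coordinate (all noise parameters invented) looks like this:

```python
import numpy as np

# Constant-velocity Kalman filter for one hand coordinate per video frame.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # state transition: position += velocity
H = np.array([[1.0, 0.0]])       # only position is measured
Q = np.eye(2) * 1e-3             # process noise (invented)
R = np.array([[0.25]])           # measurement noise (invented)

x = np.array([0.0, 0.0])         # state: [position, velocity]
P = np.eye(2)                    # state covariance

for z in [1.0, 2.1, 2.9, 4.2, 5.0]:                  # noisy detections per frame
    x, P = F @ x, F @ P @ F.T + Q                    # predict next state
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)              # correct with the measurement
    P = (np.eye(2) - K @ H) @ P

print(np.round(x, 2))  # estimated [position, velocity] after five frames
```

    The prediction step supplies a prior for the next frame's search region; a multiple-hypothesis tracker can fall back on this prediction when detection fails.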

  3. The history of sign language and deaf education in Turkey.

    PubMed

    Kemaloğlu, Yusuf Kemal; Kemaloğlu, Pınar Yaprak

    2012-01-01

    Sign language is the natural language of prelingually deaf people, particularly those without hearing-speech rehabilitation. Otorhinolaryngologists, while regarding health as complete physical, mental, and psychosocial well-being, aim at hearing by diagnosing deafness as a deviance from normality. This perception, however, conflicts with practice that does not serve the mental and social well-being of the individual, and so contradicts the definition above. This article investigates, through statistical data, scientific publications, and historical documents, the effects on the Turkish population of a hearing-speech target that ignores sign language, examines its consistency with history, and supports a critical perspective on the issue. The results showed that at most 50% of deaf people benefited from hearing-speech programs in the 60 years before hearing screening programs; nevertheless, education systems that include sign language were not established. In the light of these data, it is clear that an approach that ignores sign language, particularly before the development of screening programs, is not reasonable. In addition, considering that sign language has been part of Anatolian history from the Hittites to the Ottomans, it remains an open question why evaluation, habilitation, and education systems that exclude sign language are still the only choice for deaf individuals in Turkey. Despite legislative amendments in the last 6-7 years, the primary cause of their failure to come into force is probably an inadequate grasp of the content and importance of the issue, as well as limited effort by academicians and authorized politicians to offer solutions. Within this context, this paper aims to have a positive effect on this issue by offering a review for medical staff, particularly otorhinolaryngologists and audiologists.

  4. Language Justice for Sign Language Peoples: The UN Convention on the Rights of Persons with Disabilities

    ERIC Educational Resources Information Center

    Batterbury, Sarah C. E.

    2012-01-01

    Sign Language Peoples (SLPs) across the world have developed their own languages and visuo-gestural-tactile cultures embodying their collective sense of Deafhood (Ladd 2003). Despite this, most nation-states treat their respective SLPs as disabled individuals, favoring disability benefits, cochlear implants, and mainstream education over language…

  5. Visual Sonority Modulates Infants' Attraction to Sign Language

    ERIC Educational Resources Information Center

    Stone, Adam; Petitto, Laura-Ann; Bosworth, Rain

    2018-01-01

    The infant brain may be predisposed to identify perceptually salient cues that are common to both signed and spoken languages. Recent theory based on spoken languages has advanced sonority as one of these potential language acquisition cues. Using a preferential looking paradigm with an infrared eye tracker, we explored visual attention of hearing…

  6. Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production

    PubMed Central

    Gutierrez-Sigut, Eva; Daws, Richard; Payne, Heather; Blott, Jonathan; Marshall, Chloë; MacSweeney, Mairéad

    2016-01-01

    Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL). Participants exhibited stronger lateralization during BSL than English production. In Experiment 2 we investigated whether this increased lateralization index could be due exclusively to the higher motoric demands of sign production. Sign naïve participants performed a phonological fluency task in English and a non-sign repetition task. Participants were left lateralized in the phonological fluency task but there was no consistent pattern of lateralization for the non-sign repetition in these hearing non-signers. The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands. PMID:26605960

  7. The Use of Sign Language Pronouns by Native-Signing Children with Autism

    ERIC Educational Resources Information Center

    Shield, Aaron; Meier, Richard P.; Tager-Flusberg, Helen

    2015-01-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are…

  8. Signing Science! Andy And Tonya Are Just Like Me! They Wear Hearing Aids And Know My Language!?

    ERIC Educational Resources Information Center

    Vesel, Judy

    2005-01-01

    Are these students talking about their classmates? No, they are describing the Signing Avatar characters--3-D figures who appear on the EnViSci Network Web site and sign the resources and activities in American Sign Language (ASL) or Signed English (SE). During the 2003?04 school year, students in schools for the deaf and hard of hearing…

  9. Signs in Which Handshape and Hand Orientation Are either Not Visible or Are Only Partially Visible: What Is the Consequence for Lexical Recognition?

    ERIC Educational Resources Information Center

    ten Holt, G. A.; van Doorn, A. J.; de Ridder, H.; Reinders, M. J. T.; Hendriks, E. A.

    2009-01-01

    We present the results of an experiment on lexical recognition of human sign language signs in which the available perceptual information about handshape and hand orientation was manipulated. Stimuli were videos of signs from Sign Language of the Netherlands (SLN). The videos were processed to create four conditions: (1) one in which neither…

  10. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder

    PubMed Central

    Shield, Aaron; Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Results Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Conclusions Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD. PMID:28586822

  11. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder.

    PubMed

    Shield, Aaron; Cooley, Frances; Meier, Richard P

    2017-06-10

    We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD.

  12. Aphasia in a prelingually deaf woman.

    PubMed

    Chiarello, C; Knight, R; Mandel, M

    1982-03-01

    A left parietal infarct in a prelingually deaf person resulted in an aphasia for both American Sign Language (ASL) and written and finger-spelled English. Originally the patient had a nearly global aphasia affecting all language systems. By five to seven weeks post-onset her symptoms resembled those of hearing aphasics with posterior lesions: fluent but paraphasic signing, anomia, impaired comprehension and repetition, alexia, and agraphia with elements of neologistic jargon. In addition, there was a pronounced sequential movement copying disorder, reduced short-term verbal memory and acalculia. In general, the patient's sign errors showed a consistent disruption in the structure of ASL signs which parallels the speech errors of oral aphasic patients. We conclude that most aphasic symptoms are not modality-dependent, but rather reflect a disruption of linguistic processes common to all human languages. This case confirms the importance of the left hemisphere in the processing of sign language. Furthermore, the results indicate that the left supramarginal and angular gyri are necessary substrates for the comprehension of visual/gestural languages.

  13. Sign Language Acquisition and Use by Single-Generation Deaf Adults in Australia Who Attended Specific Educational Settings for Deaf and Hard of Hearing Children

    ERIC Educational Resources Information Center

    Winn, Stephen

    2007-01-01

    This article examines the acquisition and use of Australian Sign Language (Auslan) by 53 profoundly deaf adults (31 male, 22 female) who attended educational units for deaf and hard of hearing children. The results indicate that, regardless of age, the acquisition of sign language, particularly Auslan, by deaf people occurred primarily through…

  14. Content Questions In American Sign Language: An RRG Analysis

    DTIC Science & Technology

    2004-12-08

    To establish a temporal framework, someone might sign (29) DURING FIVE YEAR YONDER GALLAUDET, 'During my five years at Gallaudet…'

  15. American Sign Language Syntactic and Narrative Comprehension in Skilled and Less Skilled Readers: Bilingual and Bimodal Evidence for the Linguistic Basis of Reading

    ERIC Educational Resources Information Center

    Chamberlain, Charlene; Mayberry, Rachel I.

    2008-01-01

    We tested the hypothesis that syntactic and narrative comprehension of a natural sign language can serve as the linguistic basis for skilled reading. Thirty-one adults who were deaf from birth and used American Sign Language (ASL) were classified as skilled or less skilled readers using an eighth-grade criterion. Proficiency with ASL syntax, and…

  16. Kinematic Parameters of Signed Verbs

    ERIC Educational Resources Information Center

    Malaia, Evie; Wilbur, Ronnie B.; Milkovic, Marina

    2013-01-01

    Purpose: Sign language users recruit physical properties of visual motion to convey linguistic information. Research on American Sign Language (ASL) indicates that signers systematically use kinematic features (e.g., velocity, deceleration) of dominant hand motion for distinguishing specific semantic properties of verb classes in production…

  17. Language and literacy development of deaf and hard-of-hearing children: successes and challenges.

    PubMed

    Lederberg, Amy R; Schick, Brenda; Spencer, Patricia E

    2013-01-01

    Childhood hearing loss presents challenges to language development, especially spoken language. In this article, we review existing literature on deaf and hard-of-hearing (DHH) children's patterns and trajectories of language as well as development of theory of mind and literacy. Individual trajectories vary significantly, reflecting access to early identification/intervention, advanced technologies (e.g., cochlear implants), and perceptually accessible language models. DHH children develop sign language in a similar manner as hearing children develop spoken language, provided they are in a language-rich environment. This occurs naturally for DHH children of deaf parents, who constitute 5% of the deaf population. For DHH children of hearing parents, sign language development depends on the age that they are exposed to a perceptually accessible 1st language as well as the richness of input. Most DHH children are born to hearing families who have spoken language as a goal, and such development is now feasible for many children. Some DHH children develop spoken language in bilingual (sign-spoken language) contexts. For the majority of DHH children, spoken language development occurs in either auditory-only contexts or with sign supports. Although developmental trajectories of DHH children with hearing parents have improved with early identification and appropriate interventions, the majority of children are still delayed compared with hearing children. These DHH children show particular weaknesses in the development of grammar. Language deficits and differences have cascading effects in language-related areas of development, such as theory of mind and literacy development.

  18. Child Modifiability as a Predictor of Language Abilities in Deaf Children Who Use American Sign Language.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2015-08-01

    This research explored the use of dynamic assessment (DA) for language-learning abilities in signing deaf children from deaf and hearing families. Thirty-seven deaf children, aged 6 to 11 years, were identified as either stronger (n = 26) or weaker (n = 11) language learners according to teacher or speech-language pathologist report. All children received 2 scripted, mediated learning experience sessions targeting vocabulary knowledge—specifically, the use of semantic categories that were carried out in American Sign Language. Participant responses to learning were measured in terms of an index of child modifiability. This index was determined separately at the end of the 2 individual sessions. It combined ratings reflecting each child's learning abilities and responses to mediation, including social-emotional behavior, cognitive arousal, and cognitive elaboration. Group results showed that modifiability ratings were significantly better for stronger language learners than for weaker language learners. The strongest predictors of language ability were cognitive arousal and cognitive elaboration. Mediator ratings of child modifiability (i.e., combined score of social-emotional factors and cognitive factors) are highly sensitive to language-learning abilities in deaf children who use sign language as their primary mode of communication. This method can be used to design targeted interventions.

  19. Graph theoretical analysis of functional network for comprehension of sign language.

    PubMed

    Liu, Lanfang; Yan, Xin; Liu, Jin; Xia, Mingrui; Lu, Chunming; Emmorey, Karen; Chu, Mingyuan; Ding, Guosheng

    2017-09-15

    Signed languages are natural human languages using the visual-motor modality. Previous neuroimaging studies based on univariate activation analysis show that a widely overlapping cortical network is recruited regardless of whether the sign language is comprehended (for signers) or not (for non-signers). Here we move beyond previous studies by examining whether the functional connectivity profiles and the underlying organizational structure of the overlapped neural network may differ between signers and non-signers when watching sign language. Using graph theoretical analysis (GTA) and fMRI, we compared the large-scale functional network organization in hearing signers with non-signers during the observation of sentences in Chinese Sign Language. We found that signed sentences elicited highly similar cortical activations in the two groups of participants, with slightly larger responses within the left frontal and left temporal gyrus in signers than in non-signers. Crucially, further GTA revealed substantial group differences in the topologies of this activation network. Globally, the network engaged by signers showed higher local efficiency (t(24) = 2.379, p = 0.026), small-worldness (t(24) = 2.604, p = 0.016) and modularity (t(24) = 3.513, p = 0.002), and exhibited different modular structures, compared to the network engaged by non-signers. Locally, the left ventral pars opercularis served as a network hub in the signer group but not in the non-signer group. These findings suggest that, despite overlap in cortical activation, the neural substrates underlying sign language comprehension are distinguishable at the network level from those for the processing of gestural action. Copyright © 2017 Elsevier B.V. All rights reserved.
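    The graph-theoretic measures cited in this abstract (local efficiency, modularity) can be computed directly from a binary adjacency matrix. The NumPy sketch below builds a synthetic two-module network and computes both measures; the network, the Floyd-Warshall shortest paths, and Newman's spectral bipartition are illustrative assumptions, not the authors' actual pipeline.

    ```python
    import numpy as np

    def shortest_path_lengths(adj):
        """All-pairs shortest path lengths (Floyd-Warshall) on a binary graph."""
        D = np.where(adj > 0, 1.0, np.inf)
        np.fill_diagonal(D, 0.0)
        for k in range(adj.shape[0]):
            D = np.minimum(D, D[:, [k]] + D[[k], :])
        return D

    def global_efficiency(adj):
        """Mean inverse shortest path length over all node pairs."""
        n = adj.shape[0]
        D = shortest_path_lengths(adj)
        inv = np.where(np.isfinite(D) & (D > 0), 1.0 / D, 0.0)
        return inv.sum() / (n * (n - 1))

    def local_efficiency(adj):
        """Mean global efficiency of each node's neighbourhood subgraph."""
        effs = []
        for i in range(adj.shape[0]):
            nb = np.flatnonzero(adj[i])
            effs.append(global_efficiency(adj[np.ix_(nb, nb)]) if len(nb) > 1 else 0.0)
        return float(np.mean(effs))

    def spectral_modularity(adj):
        """Modularity Q of Newman's leading-eigenvector bipartition."""
        k = adj.sum(axis=1)
        two_m = k.sum()
        B = adj - np.outer(k, k) / two_m             # modularity matrix
        _, vecs = np.linalg.eigh(B)                  # eigenvalues ascending
        s = np.where(vecs[:, -1] >= 0, 1.0, -1.0)    # split by leading eigenvector
        return float(s @ B @ s / (2.0 * two_m))

    # synthetic network: two dense 10-node modules joined by two bridge edges
    rng = np.random.default_rng(0)
    adj = np.zeros((20, 20))
    for a in (0, 10):
        block = (rng.random((10, 10)) < 0.6).astype(float)
        adj[a:a + 10, a:a + 10] = np.triu(block, 1)
    adj[0, 10] = adj[5, 15] = 1.0
    adj = adj + adj.T

    eff = local_efficiency(adj)   # high for densely connected neighbourhoods
    Q = spectral_modularity(adj)  # clearly positive for a modular graph
    ```

    In studies like this one, such metrics would be computed per participant from a thresholded functional connectivity matrix and then compared across groups.
    
    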

  20. Case Studies of Multilingual/Multicultural Asian Deaf Adults: Strategies for Success.

    PubMed

    Wang, Qiuying; Andrews, Jean; Liu, Hsiu Tan; Liu, Chun Jung

    2016-01-01

    Case studies of adult d/Deaf or Hard of Hearing Multilingual Learners (DMLs) are few, especially studies of DMLs who learn more than one sign language and read logographic and alphabetic scripts. To reduce this paucity, two descriptive case studies are presented. Written questionnaires, face-to-face interviews, and self-appraisals of language-use rubrics were used to explore (a) the language and literacy histories of two adult Asian DMLs who had learned multiple languages: Chinese (spoken/written), English (written), Chinese Sign Language, and American Sign Language; and (b) how each language was used in different cultural communities with diverse conversational partners. Home literacy environment, family support, visual access to languages, peer and sibling support, role models, encouragement, perseverance, and Deaf identity all played vital roles in the participants' academic success. The findings provide insights into the acquisition of multiple languages and bi-literacy through social communication and academic content.

  1. Language Promotes False-Belief Understanding

    PubMed Central

    Pyers, Jennie E.; Senghas, Ann

    2010-01-01

    Developmental studies have identified a strong correlation in the timing of language development and false-belief understanding. However, the nature of this relationship remains unresolved. Does language promote false-belief understanding, or does it merely facilitate development that could occur independently, albeit on a delayed timescale? We examined language development and false-belief understanding in deaf learners of an emerging sign language in Nicaragua. The use of mental-state vocabulary and performance on a low-verbal false-belief task were assessed, over 2 years, in adult and adolescent users of Nicaraguan Sign Language. Results show that those adults who acquired a nascent form of the language during childhood produce few mental-state signs and fail to exhibit false-belief understanding. Furthermore, those whose language developed over the period of the study correspondingly developed in false-belief understanding. Thus, language learning, over and above social experience, drives the development of a mature theory of mind. PMID:19515119

  2. Children creating language: how Nicaraguan sign language acquired a spatial grammar.

    PubMed

    Senghas, A; Coppola, M

    2001-07-01

    It has long been postulated that language is not purely learned, but arises from an interaction between environmental exposure and innate abilities. The innate component becomes more evident in rare situations in which the environment is markedly impoverished. The present study investigated the language production of a generation of deaf Nicaraguans who had not been exposed to a developed language. We examined the changing use of early linguistic structures (specifically, spatial modulations) in a sign language that has emerged since the Nicaraguan group first came together: in under two decades, sequential cohorts of learners systematized the grammar of this new sign language. We examined whether the systematicity being added to the language stems from children or adults; our results indicate that such changes originate in children aged 10 and younger. Thus, sequential cohorts of interacting young children collectively possess the capacity not only to learn, but also to create, language.

  3. The Impact of Input Quality on Early Sign Development in Native and Non-Native Language Learners

    ERIC Educational Resources Information Center

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-01-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the…

  4. Identifying Specific Language Impairment in Deaf Children Acquiring British Sign Language: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Mason, Kathryn; Rowley, Katherine; Marshall, Chloe R.; Atkinson, Joanna R.; Herman, Rosalind; Woll, Bencie; Morgan, Gary

    2010-01-01

    This paper presents the first ever group study of specific language impairment (SLI) in users of sign language. A group of 50 children were referred to the study by teachers and speech and language therapists. Individuals who fitted pre-determined criteria for SLI were then systematically assessed. Here, we describe in detail the performance of 13…

  5. MACHINE TRANSLATION RESEARCH DURING THE PAST TWO YEARS.

    ERIC Educational Resources Information Center

    LEHMANN, W.P.

    The author recounts the rise in importance of machine translation, which together with language learning and teaching comprises the major fields of applied linguistics. Much of the recent theoretical work on language deals with the problem of the relationship between the surface syntactic structure of language and the underlying structure. The…

  6. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language.

    PubMed

    Williams, Joshua T; Newman, Sharlene D

    2016-04-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately matched, especially when the sign contained a marked handshape. In Experiment 2, learners produced these familiar signs in addition to novel signs, which differed based on sonority and markedness. Results from a key-release reaction time reproduction task showed that learners tended to produce high sonority signs much more quickly than low sonority signs, especially when the sign contained an unmarked handshape. This effect was only present in familiar signs. Sign production accuracy rates revealed that high sonority signs were more accurate than low sonority signs. Similarly, signs with unmarked handshapes were produced more accurately than those with marked handshapes. Together, results from Experiments 1 and 2 suggested that signs that contain high sonority movements are more easily processed, both perceptually and productively, and handshape markedness plays a differential role in perception and production. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Extricating Manual and Non-Manual Features for Subunit Level Medical Sign Modelling in Automatic Sign Language Classification and Recognition.

    PubMed

    R, Elakkiya; K, Selvamani

    2017-09-22

    Subunit segmentation and modelling in medical sign language is one of the important studies in linguistic-oriented and vision-based Sign Language Recognition (SLR). Many previous efforts approached functional subunits from the viewpoint of linguistic syllables, but such syllable-based subunit extraction is not feasible with real-world computer vision techniques. Moreover, present recognition systems are designed to detect signer-dependent actions only under restricted laboratory conditions. This research paper aims at solving these two important issues: (1) subunit extraction and (2) signer-independent action in visual sign language recognition. Subunit extraction involves the sequential and parallel breakdown of sign gestures without any prior knowledge of syllables or the number of subunits. A novel Bayesian Parallel Hidden Markov Model (BPaHMM) is introduced for subunit extraction, combining the features of manual and non-manual parameters to yield better results in the classification and recognition of signs. Signer-independent action involves using a single web camera to capture different signer behaviour patterns and to perform cross-signer validation. Experimental results show that the proposed signer-independent subunit-level modelling for sign language classification and recognition improves upon other existing work.
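    The HMM machinery that recognizers of this kind build on can be illustrated with the scaled forward algorithm: each candidate sign gets its own model, and a feature sequence is assigned to the sign with the highest log-likelihood. This is a generic sketch, not the paper's BPaHMM; the toy transition and emission matrices are invented for illustration.

    ```python
    import numpy as np

    def forward_loglik(obs, pi, A, B):
        """Scaled forward algorithm: log P(observation sequence | HMM)."""
        alpha = pi * B[:, obs[0]]
        logp = np.log(alpha.sum())
        alpha = alpha / alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]   # propagate and weight by emission
            s = alpha.sum()
            logp += np.log(s)
            alpha = alpha / s               # rescale to avoid underflow
        return logp

    # shared toy dynamics: 2 hidden states, 3 quantized observation symbols
    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3],
                  [0.4, 0.6]])
    # per-sign emission models: "sign 1" favours symbol 0, "sign 2" symbol 2
    B_sign1 = np.array([[0.8, 0.1, 0.1],
                        [0.6, 0.3, 0.1]])
    B_sign2 = np.array([[0.1, 0.1, 0.8],
                        [0.1, 0.3, 0.6]])

    obs = [0, 0, 1, 0]                      # quantized feature sequence
    ll_sign1 = forward_loglik(obs, pi, A, B_sign1)
    ll_sign2 = forward_loglik(obs, pi, A, B_sign2)
    best = "sign 1" if ll_sign1 > ll_sign2 else "sign 2"
    ```

    In a parallel-channel setting like the paper's, one common simplification is to score the manual and non-manual feature streams with separate HMMs and sum their log-likelihoods under an independence assumption.
    
    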

  8. Segmentation of British Sign Language (BSL): mind the gap!

    PubMed

    Orfanidou, Eleni; McQueen, James M; Adam, Robert; Morgan, Gary

    2015-01-01

    This study asks how users of British Sign Language (BSL) recognize individual signs in connected sign sequences. We examined whether this is achieved through modality-specific or modality-general segmentation procedures. A modality-specific feature of signed languages is that, during continuous signing, there are salient transitions between sign locations. We used the sign-spotting task to ask if and how BSL signers use these transitions in segmentation. A total of 96 real BSL signs were preceded by nonsense signs which were produced in either the target location or another location (with a small or large transition). Half of the transitions were within the same major body area (e.g., head) and half were across body areas (e.g., chest to hand). Deaf adult BSL users (a group of natives and early learners, and a group of late learners) spotted target signs best when there was a minimal transition and worst when there was a large transition. When location changes were present, both groups performed better when transitions were to a different body area than when they were within the same area. These findings suggest that transitions do not provide explicit sign-boundary cues in a modality-specific fashion. Instead, we argue that smaller transitions help recognition in a modality-general way by limiting lexical search to signs within location neighbourhoods, and that transitions across body areas also aid segmentation in a modality-general way, by providing a phonotactic cue to a sign boundary. We propose that sign segmentation is based on modality-general procedures which are core language-processing mechanisms.

  9. Kinematic parameters of signed verbs.

    PubMed

    Malaia, Evie; Wilbur, Ronnie B; Milkovic, Marina

    2013-10-01

    Sign language users recruit physical properties of visual motion to convey linguistic information. Research on American Sign Language (ASL) indicates that signers systematically use kinematic features (e.g., velocity, deceleration) of dominant hand motion for distinguishing specific semantic properties of verb classes in production ( Malaia & Wilbur, 2012a) and process these distinctions as part of the phonological structure of these verb classes in comprehension ( Malaia, Ranaweera, Wilbur, & Talavage, 2012). These studies are driven by the event visibility hypothesis by Wilbur (2003), who proposed that such use of kinematic features should be universal to sign language (SL) by the grammaticalization of physics and geometry for linguistic purposes. In a prior motion capture study, Malaia and Wilbur (2012a) lent support for the event visibility hypothesis in ASL, but there has not been quantitative data from other SLs to test the generalization to other languages. The authors investigated the kinematic parameters of predicates in Croatian Sign Language ( Hrvatskom Znakovnom Jeziku [HZJ]). Kinematic features of verb signs were affected both by event structure of the predicate (semantics) and phrase position within the sentence (prosody). The data demonstrate that kinematic features of motion in HZJ verb signs are recruited to convey morphological and prosodic information. This is the first crosslinguistic motion capture confirmation that specific kinematic properties of articulator motion are grammaticalized in other SLs to express linguistic features.
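    The kinematic features discussed in this abstract (velocity, deceleration) are typically derived from sampled hand positions by numerical differentiation. The NumPy sketch below uses an invented trajectory and sampling rate, not the authors' motion-capture data.

    ```python
    import numpy as np

    fs = 120.0                                  # sampling rate in Hz (assumption)
    t = np.arange(0, 1, 1 / fs)
    # toy dominant-hand trajectory: oscillation in x, drift in y, in metres
    pos = np.stack([np.sin(2 * np.pi * t), 0.1 * t, np.zeros_like(t)], axis=1)

    vel = np.gradient(pos, 1 / fs, axis=0)      # per-axis velocity (m/s)
    speed = np.linalg.norm(vel, axis=1)         # tangential speed
    accel = np.gradient(speed, 1 / fs)          # signed acceleration (m/s^2)
    peak_deceleration = accel.min()             # most negative acceleration
    ```

    Summary statistics of `speed` and `peak_deceleration` per sign are the kind of kinematic parameters that can then be compared across verb classes or phrase positions.
    
    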

  10. Dissociating linguistic and non-linguistic gesture processing: electrophysiological evidence from American Sign Language.

    PubMed

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-04-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a "frame" (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a "last item" belonging to one of four categories: a high-cloze-probability sign (a "semantically reasonable" completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a "semantically odd" completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Generation of Signs Within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language.

    PubMed

    Beal-Alvarez, Jennifer S; Figueroa, Daileen M

    2017-04-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks, including animals, foods, and specific handshapes. We investigated the effects of chronological age, age of sign language acquisition/years at school site, gender, presence of a disability, and geographical location (i.e., USA and Puerto Rico) on participants' performance and relations among tasks. In general, the phonological task appeared more difficult than the semantic tasks; students generated more animals than foods; age and semantic performance were correlated for the larger sample of U.S. students; and geographical variation included the use of fingerspelling and specific signs. Compared to their peers, deaf students with disabilities generated fewer semantic items. These results provide an initial snapshot of students' semantic and phonological sign generation. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Dissociating linguistic and non-linguistic gesture processing: Electrophysiological evidence from American Sign Language

    PubMed Central

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-01-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a “frame” (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a “last item” belonging to one of four categories: a high-cloze-probability sign (a “semantically reasonable” completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a “semantically odd” completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. PMID:22341555

  13. The Role of Automata and Machine Theory in School and College Mathematics Syllabuses.

    ERIC Educational Resources Information Center

    Holcombe, M.

    1981-01-01

    The introduction of certain topics in the theory of machines and languages into school and college mathematics courses in place of the more usual discussion of groups and formal logic is proposed. Examples of machines and languages and their interconnections suitable for such courses are outlined. (MP)

  14. Sign Language Studies with Chimpanzees and Children.

    ERIC Educational Resources Information Center

    Van Cantfort, Thomas E.; Rimpau, James B.

    1982-01-01

    Reviews methodologies of sign language studies with chimpanzees and compares major findings of those studies with studies of human children. Considers relevance of input conditions for language acquisition, evidence used to demonstrate linguistic achievements, and application of rigorous testing procedures in developmental psycholinguistics.…

  15. Ada Compiler Validation Summary Report. Certificate Number: 891129W1. 10198 International Business Machines Corporation, the IBM Development System for the Ada Language AIX/RT Follow-on, Version 1.1 IBM RT Follow-on. Completion of On-Site Testing: 29 November 1989

    DTIC Science & Technology

    1989-11-29

    International Business Machines Corporation Wright-Patterson AFB, The IBM Development System for the Ada Language AIX/RT follow-on, Version 1.1...Certificate Number: 891129W1.10198 International Business Machines Corporation The IBM Development System for the Ada Language AIX/RT Follow-on, Version 1.1 IBM...scripts provided by International Business Machines Corporation and reviewed by the validation team. The compiler was tested using all the following

  16. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  17. Language Policies in Uruguay and Uruguayan Sign Language (LSU)

    ERIC Educational Resources Information Center

    Behares, Luis Ernesto; Brovetto, Claudia; Crespi, Leonardo Peluso

    2012-01-01

    In the first part of this article the authors consider the policies that apply to Uruguayan Sign Language (Lengua de Senas Uruguaya; hereafter LSU) and the Uruguayan Deaf community within the general framework of language policies in Uruguay. By analyzing them succinctly and as a whole, the authors then explain twenty-first-century innovations.…

  18. How Deaf American Sign Language/English Bilingual Children Become Proficient Readers: An Emic Perspective

    ERIC Educational Resources Information Center

    Mounty, Judith L.; Pucci, Concetta T.; Harmon, Kristen C.

    2014-01-01

    A primary tenet underlying American Sign Language/English bilingual education for deaf students is that early access to a visual language, developed in conjunction with language planning principles, provides a foundation for literacy in English. The goal of this study is to obtain an emic perspective on bilingual deaf readers transitioning from…

  19. The Language Development of a Deaf Child with a Cochlear Implant

    ERIC Educational Resources Information Center

    Mouvet, Kimberley; Matthijs, Liesbeth; Loots, Gerrit; Taverniers, Miriam; Van Herreweghe, Mieke

    2013-01-01

    Hearing parents of deaf or partially deaf infants are confronted with the complex question of communication with their child. This question is complicated further by conflicting advice on how to address the child: in spoken language only, in spoken language supported by signs, or in signed language. This paper studies the linguistic environment…

  20. Second Language Acquisition across Modalities: Production Variability in Adult L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Hilger, Allison I.; Loucks, Torrey M. J.; Quinto-Pozos, David; Dye, Matthew W. G.

    2015-01-01

    A study was conducted to examine production variability in American Sign Language (ASL) in order to gain insight into the development of motor control in a language produced in another modality. Production variability was characterized through the spatiotemporal index (STI), which represents production stability in whole utterances and is a…
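    The spatiotemporal index (STI) named in this abstract is conventionally computed by time-normalizing repeated productions of the same utterance, normalizing their amplitude, and summing the across-trial standard deviations at each normalized time point. The sketch below is one plausible NumPy implementation; the 50-point linear resampling and z-score amplitude normalization are simplifying assumptions rather than the authors' exact procedure.

    ```python
    import numpy as np

    def spatiotemporal_index(trials, n_points=50):
        """STI: sum of across-trial SDs of time- and amplitude-normalized signals."""
        resampled = []
        for y in trials:
            y = np.asarray(y, dtype=float)
            x_old = np.linspace(0.0, 1.0, len(y))
            x_new = np.linspace(0.0, 1.0, n_points)
            y_rs = np.interp(x_new, x_old, y)          # linear time-normalization
            y_rs = (y_rs - y_rs.mean()) / y_rs.std()   # amplitude normalization
            resampled.append(y_rs)
        return float(np.std(np.vstack(resampled), axis=0).sum())

    # repeated productions of one movement, with jitter (synthetic data)
    rng = np.random.default_rng(2)
    trials = [np.sin(np.linspace(0, np.pi, n)) + rng.normal(0, 0.05, n)
              for n in (90, 100, 110)]
    sti = spatiotemporal_index(trials)   # > 0: some production variability
    ```

    Perfectly repeated productions yield an STI of zero; higher values indicate less stable motor control across repetitions.
    
    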

  1. Observations on Word Order in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Sprenger, Kristen; Mathur, Gaurav

    2012-01-01

    This article focuses on the syntactic level of the grammar of Saudi Arabian Sign Language by exploring some word orders that occur in personal narratives in the language. Word order is one of the main ways in which languages indicate the main syntactic roles of subjects, verbs, and objects; others are verbal agreement and nominal case morphology.…

  2. When does Iconicity in Sign Language Matter?

    PubMed Central

    Baus, Cristina; Carreiras, Manuel; Emmorey, Karen

    2012-01-01

    We examined whether iconicity in American Sign Language (ASL) enhances translation performance for new learners and proficient signers. Fifteen hearing nonsigners and 15 proficient ASL-English bilinguals performed a translation recognition task and a production translation task. Nonsigners were taught 28 ASL verbs (14 iconic; 14 non-iconic) prior to performing these tasks. Only new learners benefited from sign iconicity, recognizing iconic translations faster and more accurately and exhibiting faster forward (English-ASL) and backward (ASL-English) translation times for iconic signs. In contrast, proficient ASL-English bilinguals exhibited slower recognition and translation times for iconic signs. We suggest iconicity aids memorization in the early stages of adult sign language learning, but for fluent L2 signers, iconicity interacts with other variables that slow translation (specifically, the iconic signs had more translation equivalents than the non-iconic signs). Iconicity may also have slowed translation performance by forcing conceptual mediation for iconic signs, which is slower than translating via direct lexical links. PMID:23543899

  3. Using the Hands to Represent Objects in Space: Gesture as a Substrate for Signed Language Acquisition.

    PubMed

    Janke, Vikki; Marshall, Chloë R

    2017-01-01

    An ongoing issue of interest in second language research concerns what transfers from a speaker's first language to their second. For learners of a sign language, gesture is a potential substrate for transfer. Our study provides a novel test of gestural production by eliciting silent gesture from novices in a controlled environment. We focus on spatial relationships, which in sign languages are represented in a very iconic way using the hands, and which one might therefore predict to be easy for adult learners to acquire. However, a previous study by Marshall and Morgan (2015) revealed that this was only partly the case: in a task that required them to express the relative locations of objects, hearing adult learners of British Sign Language (BSL) could represent objects' locations and orientations correctly, but had difficulty selecting the correct handshapes to represent the objects themselves. If hearing adults are indeed drawing upon their gestural resources when learning sign languages, then their difficulties may have stemmed from their having in manual gesture only a limited repertoire of handshapes to draw upon, or, alternatively, from having too broad a repertoire. If the first hypothesis is correct, the challenge for learners is to extend their handshape repertoire, but if the second is correct, the challenge is instead to narrow down to the handshapes appropriate for that particular sign language. 30 sign-naïve hearing adults were tested on Marshall and Morgan's task. All used some handshapes that were different from those used by native BSL signers and learners, and the set of handshapes used by the group as a whole was larger than that employed by native signers and learners. Our findings suggest that a key challenge when learning to express locative relations might be reducing from a very large set of gestural resources, rather than supplementing a restricted one, in order to converge on the conventionalized classifier system that forms part of the grammar of the language being learned.

  4. Sign language processing and the mirror neuron system.

    PubMed

    Corina, David P; Knapp, Heather

    2006-05-01

    In this paper we review evidence for frontal and parietal lobe involvement in sign language comprehension and production, and evaluate the extent to which these data can be interpreted within the context of a mirror neuron system for human action observation and execution. We present data from three literatures--aphasia, cortical stimulation, and functional neuroimaging. Generally, we find support for the idea that sign language comprehension and production can be viewed in the context of a broadly-construed frontal-parietal human action observation/execution system. However, sign language data cannot be fully accounted for under a strict interpretation of the mirror neuron system. Additionally, we raise a number of issues concerning the lack of specificity in current accounts of the human action observation/execution system.

  5. Static hand gesture recognition from a video

    NASA Astrophysics Data System (ADS)

    Rokade, Rajeshree S.; Doye, Dharmpal

    2011-10-01

    A sign language (also signed language) is a language which, instead of acoustically conveyed sound patterns, uses visually transmitted sign patterns to convey meaning: "simultaneously combining hand shapes, orientation and movement of the hands". Sign languages commonly develop in deaf communities, which can include interpreters, friends and families of deaf people as well as people who are deaf or hard of hearing themselves. In this paper, we propose a novel system for the recognition of static hand gestures from video, based on a Kohonen neural network. We propose an algorithm to separate out key frames, which contain the correct gestures, from a video sequence. We segment hand images from complex, non-uniform backgrounds. Features are extracted by applying the Kohonen network to the key frames, and recognition is then performed.
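
    The abstract does not give the network's architecture or parameters, so the following is only a minimal sketch of the core technique it names: a 1-D Kohonen self-organizing map that clusters hand-image feature vectors, with gesture labels then assigned to map units. All dimensions and hyperparameters here are illustrative assumptions, not values from the paper.

```python
import numpy as np

def train_som(features, n_units=10, epochs=50, lr0=0.5, radius0=3.0, seed=0):
    """Train a 1-D Kohonen self-organizing map on feature vectors.

    `features` is an (n_samples, n_dims) array; returns the trained
    unit-weight matrix of shape (n_units, n_dims).
    """
    rng = np.random.default_rng(seed)
    dim = features.shape[1]
    weights = rng.random((n_units, dim))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)  # shrinking neighbourhood
        for x in features:
            # best-matching unit: the weight vector closest to the input
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            # Gaussian neighbourhood: units near the BMU move more
            dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(dist ** 2) / (2 * radius ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

def classify(weights, x):
    """Map an input vector to its best-matching unit index."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```

    In a recognition pipeline, each map unit would be labeled by majority vote over the training gestures it wins, and a new frame is assigned the label of its best-matching unit.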

  6. Conventions for sign and speech transcription of child bimodal bilingual corpora in ELAN.

    PubMed

    Chen Pichler, Deborah; Hochgesang, Julie A; Lillo-Martin, Diane; de Quadros, Ronice Müller

    2010-01-01

    This article extends current methodologies for the linguistic analysis of sign language acquisition to cases of bimodal bilingual acquisition. Using ELAN, we are transcribing longitudinal spontaneous production data from hearing children of Deaf parents who are learning either American Sign Language (ASL) and American English (AE), or Brazilian Sign Language (Libras, also referred to as Língua de Sinais Brasileira/LSB in some texts) and Brazilian Portuguese (BP). Our goal is to construct corpora that can be mined for a wide range of investigations on various topics in acquisition. Thus, it is important that we maintain consistency in transcription for both signed and spoken languages. This article documents our transcription conventions, including the principles behind our approach. Using this document, other researchers can choose to follow similar conventions or develop new ones using our suggestions as a starting point.

  7. Conventions for sign and speech transcription of child bimodal bilingual corpora in ELAN

    PubMed Central

    Chen Pichler, Deborah; Hochgesang, Julie A.; Lillo-Martin, Diane; de Quadros, Ronice Müller

    2011-01-01

    This article extends current methodologies for the linguistic analysis of sign language acquisition to cases of bimodal bilingual acquisition. Using ELAN, we are transcribing longitudinal spontaneous production data from hearing children of Deaf parents who are learning either American Sign Language (ASL) and American English (AE), or Brazilian Sign Language (Libras, also referred to as Língua de Sinais Brasileira/LSB in some texts) and Brazilian Portuguese (BP). Our goal is to construct corpora that can be mined for a wide range of investigations on various topics in acquisition. Thus, it is important that we maintain consistency in transcription for both signed and spoken languages. This article documents our transcription conventions, including the principles behind our approach. Using this document, other researchers can choose to follow similar conventions or develop new ones using our suggestions as a starting point. PMID:21625371

  8. An Investigation into the Relationship of Foreign Language Learning Motivation and Sign Language Use among Deaf and Hard of Hearing Hungarians

    ERIC Educational Resources Information Center

    Kontra, Edit H.; Csizer, Kata

    2013-01-01

    The aim of this study is to point out the relationship between foreign language learning motivation and sign language use among hearing impaired Hungarians. In the article we concentrate on two main issues: first, to what extent hearing impaired people are motivated to learn foreign languages in a European context; second, to what extent sign…

  9. Language abnormality in deaf people with schizophrenia: a problem with classifiers.

    PubMed

    Chatzidamianos, G; McCarthy, R A; Du Feu, M; Rosselló, J; McKenna, P J

    2018-06-05

    Although there is evidence for language abnormality in schizophrenia, few studies have examined sign language in deaf patients with the disorder. This is of potential interest because a hallmark of sign languages is their use of classifiers (semantic or entity classifiers), a reference-tracking device with few if any parallels in spoken languages. This study aimed to examine classifier production and comprehension in deaf signing adults with schizophrenia. Fourteen profoundly deaf signing adults with schizophrenia and 35 age- and IQ-matched deaf healthy controls completed a battery of tests assessing classifier and noun comprehension and production. The patients showed poorer performance than the healthy controls on comprehension and production of both nouns and entity classifiers, with the deficit being most marked in the production of classifiers. Classifier production errors affected handshape rather than other parameters such as movement and location. The findings suggest that schizophrenia affects language production in deaf patients in a unique way not seen in hearing patients.

  10. Neural organization of linguistic short-term memory is sensory modality-dependent: evidence from signed and spoken language.

    PubMed

    Pa, Judy; Wilson, Stephen M; Pickell, Herbert; Bellugi, Ursula; Hickok, Gregory

    2008-12-01

    Despite decades of research, there is still disagreement regarding the nature of the information that is maintained in linguistic short-term memory (STM). Some authors argue for abstract phonological codes, whereas others argue for more general sensory traces. We assess these possibilities by investigating linguistic STM in two distinct sensory-motor modalities, spoken and signed language. Hearing bilingual participants (native in English and American Sign Language) performed equivalent STM tasks in both languages during functional magnetic resonance imaging. Distinct, sensory-specific activations were seen during the maintenance phase of the task for spoken versus signed language. These regions have been previously shown to respond to nonlinguistic sensory stimulation, suggesting that linguistic STM tasks recruit sensory-specific networks. However, maintenance-phase activations common to the two languages were also observed, implying some form of common process. We conclude that linguistic STM involves sensory-dependent neural networks, but suggest that sensory-independent neural networks may also exist.

  11. Language as a multimodal phenomenon: implications for language learning, processing and evolution

    PubMed Central

    Vigliocco, Gabriella; Perniss, Pamela; Vinson, David

    2014-01-01

    Our understanding of the cognitive and neural underpinnings of language has traditionally been firmly based on spoken Indo-European languages and on language studied as speech or text. However, in face-to-face communication, language is multimodal: speech signals are invariably accompanied by visual information on the face and in manual gestures, and sign languages deploy multiple channels (hands, face and body) in utterance construction. Moreover, the narrow focus on spoken Indo-European languages has entrenched the assumption that language is comprised wholly by an arbitrary system of symbols and rules. However, iconicity (i.e. resemblance between aspects of communicative form and meaning) is also present: speakers use iconic gestures when they speak; many non-Indo-European spoken languages exhibit a substantial amount of iconicity in word forms and, finally, iconicity is the norm, rather than the exception in sign languages. This introduction provides the motivation for taking a multimodal approach to the study of language learning, processing and evolution, and discusses the broad implications of shifting our current dominant approaches and assumptions to encompass multimodal expression in both signed and spoken languages. PMID:25092660

  12. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic Mexican sign language and digit recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured by a digital IP camera, with four LED reflectors and a green background used to reduce computational costs and avoid the need for special gloves. Forty-two normalized central moments are computed per frame and used in a multi-layer perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican sign language and digits, respectively.
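
    The normalized central moments used as features here follow a standard definition that can be computed directly from image intensities: eta_pq = mu_pq / mu_00^(1 + (p+q)/2), which is invariant to translation and scale. The sketch below implements that definition; the particular set of 42 moments used in the paper is not spelled out in the abstract, so `max_order` is an illustrative choice.

```python
import numpy as np

def normalized_central_moments(img, max_order=3):
    """Normalized central moments eta_pq of a 2-D intensity image.

    Returns a dict keyed by (p, q) for 2 <= p + q <= max_order.
    Central moments are taken about the image centroid (translation
    invariance); dividing by mu_00**(1 + (p+q)/2) adds scale invariance.
    """
    img = np.asarray(img, dtype=float)
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xbar = (xs * img).sum() / m00          # centroid
    ybar = (ys * img).sum() / m00
    eta = {}
    for p in range(max_order + 1):
        for q in range(max_order + 1):
            if 2 <= p + q <= max_order:
                mu = (((xs - xbar) ** p) * ((ys - ybar) ** q) * img).sum()
                eta[(p, q)] = mu / m00 ** (1 + (p + q) / 2)
    return eta
```

    A feature vector built from these values (one per frame, as in the paper) can be fed straight to a multi-layer perceptron.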

  13. The question of sign-language and the utility of signs in the instruction of the deaf: two papers by Alexander Graham Bell (1898).

    PubMed

    Bell, Alexander Graham

    2005-01-01

    Alexander Graham Bell is often portrayed as either hero or villain of deaf individuals and the Deaf community. His writings, however, indicate that he was neither, and was not as clearly definite in his beliefs about language as is often supposed. The following two articles, reprinted from The Educator (1898), Vol. V, pp. 3-4 and pp. 38-44, capture Bell's thinking about sign language and its use in the classroom. Contrary to frequent claims, Bell does not demand "oral" training for all deaf children--even if he thinks it is the superior alternative--but does advocate for it for "the semi-deaf" and "the semi-mute." "In regard to the others," he writes, "I am not so sure." Although he clearly voices his support for oral methods and fingerspelling (the Rochester method) over sign language, Bell acknowledges the use and utility of signing in a carefully-crafted discussion that includes both linguistics and educational philosophy. In separating the language used at home from that in school and on the playground, Bell reveals a far more complex view of language learning by deaf children than he is often granted. (M. Marschark).

  14. V2S: Voice to Sign Language Translation System for Malaysian Deaf People

    NASA Astrophysics Data System (ADS)

    Mean Foong, Oi; Low, Tang Jung; La, Wai Wan

    The process of learning and understanding sign language may be cumbersome to some; therefore, this paper proposes a solution to this problem: a voice (English language) to sign language translation system using speech and image processing techniques. Speech processing, which includes speech recognition, is the study of recognizing the words being spoken, regardless of who the speaker is. This project uses template-based recognition as its main approach, in which the V2S system first needs to be trained with speech patterns based on a generic spectral parameter set. These spectral parameter sets are then stored as templates in a database. The system performs recognition by matching the parameter set of the input speech against the stored templates, finally displaying the sign language in video format. Empirical results show that the system has an 80.3% recognition rate.
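
    The abstract describes matching input spectral parameters against stored templates but does not name the distance measure. A common choice for template-based speech recognition is dynamic time warping (DTW), which aligns sequences of different lengths; the sketch below uses it as an assumed stand-in, with toy one-dimensional "spectral" features.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two feature sequences.

    Each sequence is an (n_frames, n_params) array of spectral
    parameters; DTW aligns them despite differing speaking rates.
    """
    a, b = np.asarray(seq_a, float), np.asarray(seq_b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def recognize(utterance, templates):
    """Return the label of the stored template closest to the input."""
    return min(templates, key=lambda lbl: dtw_distance(utterance, templates[lbl]))
```

    In a V2S-style pipeline the winning label would then index the corresponding sign language video for display.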

  15. Examining the contribution of motor movement and language dominance to increased left lateralization during sign generation in native signers.

    PubMed

    Gutierrez-Sigut, Eva; Payne, Heather; MacSweeney, Mairéad

    2016-08-01

    The neural systems supporting speech and sign processing are very similar, although not identical. In a previous fTCD study of hearing native signers (Gutierrez-Sigut, Daws, et al., 2015) we found stronger left lateralization for sign than speech. Given that this increased lateralization could not be explained by hand movement alone, the contribution of motor movement versus 'linguistic' processes to the strength of hemispheric lateralization during sign production remains unclear. Here we directly contrast lateralization strength of covert versus overt signing during phonological and semantic fluency tasks. To address the possibility that hearing native signers' elevated lateralization indices (LIs) were due to performing a task in their less dominant language, here we test deaf native signers, whose dominant language is British Sign Language (BSL). Signers were more strongly left lateralized for overt than covert sign generation. However, the strength of lateralization was not correlated with the amount of time producing movements of the right hand. Comparisons with previous data from hearing native English speakers suggest stronger laterality indices for sign than speech in both covert and overt tasks. This increased left lateralization may be driven by specific properties of sign production such as the increased use of self-monitoring mechanisms or the nature of phonological encoding of signs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Iconicity and Sign Lexical Acquisition: A Review

    PubMed Central

    Ortega, Gerardo

    2017-01-01

    The study of iconicity, defined as the direct relationship between a linguistic form and its referent, has gained momentum in recent years across a wide range of disciplines. In the spoken modality, there is abundant evidence showing that iconicity is a key factor that facilitates language acquisition. However, when we look at sign languages, which excel in the prevalence of iconic structures, there is a more mixed picture, with some studies showing a positive effect and others showing a null or negative effect. In an attempt to reconcile the existing evidence, the present review presents a critical overview of the literature on the acquisition of a sign language as a first (L1) and second (L2) language and points at some factors that may be the source of disagreement. Regarding sign L1 acquisition, the contradictory findings may relate to iconicity being defined in a very broad sense when a more fine-grained operationalisation might reveal an effect in sign learning. Regarding sign L2 acquisition, evidence shows that there is a clear dissociation in the effect of iconicity in that it facilitates conceptual-semantic aspects of sign learning but hinders the acquisition of the exact phonological form of signs. It will be argued that when we consider the gradient nature of iconicity and that signs consist of a phonological form attached to a meaning we can discern how iconicity impacts sign learning in positive and negative ways. PMID:28824480

  17. The Signer and the Sign: Cortical Correlates of Person Identity and Language Processing from Point-Light Displays

    ERIC Educational Resources Information Center

    Campbell, Ruth; Capek, Cheryl M.; Gazarian, Karine; MacSweeney, Mairead; Woll, Bencie; David, Anthony S.; McGuire, Philip K.; Brammer, Michael J.

    2011-01-01

    In this study, the first to explore the cortical correlates of signed language (SL) processing under point-light display conditions, the observer identified either a signer or a lexical sign from a display in which different signers were seen producing a number of different individual signs. Many of the regions activated by point-light under these…

  18. Methodological Note: Analyzing Signs for Recognition & Feature Salience.

    ERIC Educational Resources Information Center

    Shyan, Melissa R.

    1985-01-01

    Presents a method to determine how signs in American Sign Language are recognized by signers. The method uses natural settings and avoids common artificialities found in prior work. A pilot study is described involving language research with Atlantic Bottlenose Dolphins in which the method was successfully used. (SED)

  19. Study of Man-Machine Communications Systems for the Handicapped. Volume III. Final Report.

    ERIC Educational Resources Information Center

    Kafafian, Haig

    The report describes a series of studies conducted to determine the extent to which severely handicapped students who were able to comprehend language and language structure but who were not able to write or type could communicate using various man-machine systems. Included among the systems tested were specialized electric typewriting machines, a…

  20. The influence of visual feedback and register changes on sign language production: A kinematic study with deaf signers

    PubMed Central

    EMMOREY, KAREN; GERTSBERG, NELLY; KORPICS, FRANCO; WRIGHT, CHARLES E.

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign Language (ASL) signs within a carrier phrase under five conditions: blindfolded, wearing tunnel-vision goggles, normal (citation) signing, shouting, and informal signing. Three-dimensional movement trajectories were obtained using an Optotrak Certus system. Informally produced signs were shorter with less vertical movement. Shouted signs were displaced forward and to the right and were produced within a larger volume of signing space, with greater velocity, greater distance traveled, and a longer duration. Tunnel vision caused signers to produce less movement within the vertical dimension of signing space, but blind and citation signing did not differ significantly on any measure, except duration. Thus, signers do not “sign louder” when they cannot see themselves, but they do alter their sign production when vision is restricted. We hypothesize that visual feedback serves primarily to fine-tune the size of signing space rather than as input to a comprehension-based monitor. PMID:20046943

  1. The influence of visual feedback and register changes on sign language production: A kinematic study with deaf signers.

    PubMed

    Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign Language (ASL) signs within a carrier phrase under five conditions: blindfolded, wearing tunnel-vision goggles, normal (citation) signing, shouting, and informal signing. Three-dimensional movement trajectories were obtained using an Optotrak Certus system. Informally produced signs were shorter with less vertical movement. Shouted signs were displaced forward and to the right and were produced within a larger volume of signing space, with greater velocity, greater distance traveled, and a longer duration. Tunnel vision caused signers to produce less movement within the vertical dimension of signing space, but blind and citation signing did not differ significantly on any measure, except duration. Thus, signers do not "sign louder" when they cannot see themselves, but they do alter their sign production when vision is restricted. We hypothesize that visual feedback serves primarily to fine-tune the size of signing space rather than as input to a comprehension-based monitor.

  2. The Relationship between American Sign Language Vocabulary and the Development of Language-Based Reasoning Skills in Deaf Children

    ERIC Educational Resources Information Center

    Henner, Jonathan

    2016-01-01

    The language-based analogical reasoning abilities of Deaf children are a controversial topic. Researchers lack agreement about whether Deaf children possess the ability to reason using language-based analogies, or whether this ability is limited by a lack of access to vocabulary, both written and signed. This dissertation examines factors that…

  3. Using the "Common European Framework of Reference for Languages" to Teach Sign Language to Parents of Deaf Children

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2015-01-01

    No formal Canadian curriculum presently exists for teaching American Sign Language (ASL) as a second language to parents of deaf and hard of hearing children. However, this group of ASL learners is in need of more comprehensive, research-based support, given the rapid expansion in Canada of universal neonatal hearing screening and the…

  4. The Subsystem of Numerals in Catalan Sign Language: Description and Examples from a Psycholinguistic Study

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Tolchinsky, Liliana

    2004-01-01

    Linguistic descriptions of sign languages are important to the recognition of their linguistic status. These languages are an essential part of the cultural heritage of the communities that create and use them and vital in the education of deaf children. They are also the reference point in language acquisition studies. Ours is exploratory…

  5. Mental Disorders in Deaf and Hard of Hearing Adult Outpatients: A Comparison of Linguistic Subgroups.

    PubMed

    Øhre, Beate; Volden, Maj; Falkum, Erik; von Tetzchner, Stephen

    2017-01-01

    Deaf and hard of hearing (DHH) individuals who use signed language and those who use spoken language face different challenges and stressors. Accordingly, the profile of their mental problems may also differ. However, studies of mental disorders in this population have seldom differentiated between linguistic groups. Our study compares demographics, mental disorders, and levels of distress and functioning in 40 patients using Norwegian Sign Language (NSL) and 36 patients using spoken language. Assessment instruments were translated into NSL. More signers were deaf than hard of hearing, did not share a common language with their childhood caregivers, and had attended schools for DHH children. More Norwegian-speaking than signing patients reported medical comorbidity, whereas the distribution of mental disorders, symptoms of anxiety and depression, and daily functioning did not differ significantly. Somatic complaints and greater perceived social isolation indicate higher stress levels in DHH patients using spoken language than in those using sign language. Therefore, preventive interventions are necessary, as well as larger epidemiological and clinical studies concerning the mental health of all language groups within the DHH population. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Understanding 'not': neuropsychological dissociations between hand and head markers of negation in BSL.

    PubMed

    Atkinson, Jo; Campbell, Ruth; Marshall, Jane; Thacker, Alice; Woll, Bencie

    2004-01-01

    Simple negation in natural languages represents a complex interrelationship of syntax, prosody, semantics and pragmatics, and may be realised in various ways: lexically, morphologically and prosodically. In almost all spoken languages, the first two of these are the primary realisations of syntactic negation. In contrast, in many signed languages negation can occur without lexical or morphological marking. Thus, in British Sign Language (BSL), negation is obligatorily expressed using face-head actions alone (facial negation) with the option of articulating a manual form alongside the required face-head actions (lexical negation). What are the processes underlying facial negation? Here, we explore this question neuropsychologically. If facial negation reflects lexico-syntactic processing in BSL, it may be relatively spared in people with unilateral right hemisphere (RH) lesions, as has been suggested for other 'grammatical facial actions' [Language and Speech 42 (1999) 307; Emmorey, K. (2002). Language, cognition and the brain: Insights from sign language research. Mahwah, NJ: Erlbaum (Lawrence)]. Three BSL users with RH lesions were specifically impaired in perceiving facial compared with manual (lexical and morphological) negation. This dissociation was absent in three users of BSL with left hemisphere lesions and different degrees of language disorder, who also showed relative sparing of negation comprehension. We conclude that, in contrast to some analyses [Applied Psycholinguistics 18 (1997) 411; Emmorey, K. (2002). Language, cognition and the brain: Insights from sign language research. Mahwah, NJ: Erlbaum (Lawrence); Archives of Neurology 36 (1979) 837], non-manual negation in sign may not be a direct surface realisation of syntax [Language and Speech 42 (1999) 143; Language and Speech 42 (1999) 127]. 
Difficulties with facial negation in the RH-lesion group were associated with specific impairments in processing facial images, including facial expressions. However, they did not reflect generalised 'face-blindness', since the reading of (English) speech patterns from faces was spared in this group. We propose that some aspects of the linguistic analysis of sign language are achieved by prosodic analysis systems (analysis of face and head gestures), which are lateralised to the minor hemisphere.

  7. Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production.

    PubMed

    Gutierrez-Sigut, Eva; Daws, Richard; Payne, Heather; Blott, Jonathan; Marshall, Chloë; MacSweeney, Mairéad

    2015-12-01

    Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL). Participants exhibited stronger lateralization during BSL than English production. In Experiment 2 we investigated whether this increased lateralization index could be due exclusively to the higher motoric demands of sign production. Sign naïve participants performed a phonological fluency task in English and a non-sign repetition task. Participants were left lateralized in the phonological fluency task but there was no consistent pattern of lateralization for the non-sign repetition in these hearing non-signers. The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Information status and word order in Croatian Sign Language.

    PubMed

    Milkovic, Marina; Bradaric-Joncic, Sandra; Wilbur, Ronnie B

    2007-01-01

    This paper presents the results of research on information structure and word order in narrative sentences taken from signed short stories in Croatian Sign Language (HZJ). The basic word order in HZJ is SVO. Factors that result in other word orders include: reversible arguments, verb categories, locative constructions, contrastive focus, and prior context. Word order in context depends on communication rules, based on the relationship between old (theme) and new (rheme) information, which is predicated of the theme. In accordance with Grice's Maxim of Quantity, HZJ has a tendency to omit old information, or to reduce it to pronominal status. If old information is overtly signed in non-pronominal form, it precedes the rheme. We have observed a variety of sign language mechanisms that are used to show items of reduced contextual significance: use of assigned spatial location for previously introduced referents; eyegaze to indicate spatial location of previously introduced referents; use of the non-dominant hand for backgrounded information; use of a special category of signs known as classifiers as pronominal indicators of previously introduced referents; and complex noun phrases that allow a single occurrence of a noun to simultaneously serve multiple functions. These devices permit information to be conveyed without the need for separate signs for every referent, which would create longer constructions that could be taxing to both production and perception. The results of this research are compatible with well-known word order generalizations - HZJ has its own grammar, independent of spoken language, like any other sign language.

  9. Sign Language Planning in the Netherlands between 1980 and 2010

    ERIC Educational Resources Information Center

    Schermer, Trude

    2012-01-01

    This article discusses several aspects of language planning with respect to Sign Language of the Netherlands, or Nederlandse Gebarentaal (NGT). For nearly thirty years members of the Deaf community, the Dutch Deaf Council (Dovenschap) have been working together with researchers, several organizations in deaf education, and the organization of…

  10. Investigating Black ASL: A Systematic Review

    ERIC Educational Resources Information Center

    Toliver-Smith, Andrea; Gentry, Betholyn

    2017-01-01

    The authors reviewed the literature regarding linguistic variations seen in American Sign Language. These variations are influenced by region and culture. Features of spoken languages have also influenced sign languages as they intersected, e.g., Black ASL has been influenced by African American English. A literature review was conducted to…

  11. Categorical Coding of Manual & English Alphabet Characters by Beginning Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Koenig, Teresa J.

    1990-01-01

    Analysis of the performance of beginning American Sign Language students, who had only recently learned the manual alphabet, on a task in which proactive interference would build up rapidly on successive trials, supported the view that different languages have separate memory stores. (Author/CB)

  12. A Preliminary Study on Interpreting for Emergent Signers

    ERIC Educational Resources Information Center

    Smith, Caitlin; Dicus, Danica

    2015-01-01

    Sign language interpreters work with a variety of consumer populations throughout their careers. One such population, referred to as "emergent signers," consists of consumers who are in the process of learning American Sign Language, and who rely on interpreters during their language acquisition period. A gap in the research is revealed…

  13. Ideologies and Attitudes toward Sign Languages: An Approximation

    ERIC Educational Resources Information Center

    Krausneker, Verena

    2015-01-01

    Attitudes are complex and little research in the field of linguistics has focused on language attitudes. This article deals with attitudes toward sign languages and those who use them--attitudes that are influenced by ideological constructions. The article reviews five categories of such constructions and discusses examples in each one.

  14. Recognition of Indian Sign Language in Live Video

    NASA Astrophysics Data System (ADS)

    Singha, Joyeeta; Das, Karen

    2013-05-01

    Sign language recognition has emerged as one of the important areas of research in computer vision. The difficulty faced by researchers is that instances of signs vary in both motion and appearance. Thus, in this paper a novel approach for recognizing various alphabets of Indian Sign Language is proposed, in which continuous video sequences of the signs are considered. The proposed system comprises three stages: preprocessing, feature extraction and classification. The preprocessing stage includes skin filtering and histogram matching. Eigenvalues and eigenvectors are used in the feature extraction stage, and finally an eigenvalue-weighted Euclidean distance is used to recognize the sign. The system deals with bare hands, thus allowing the user to interact with it in a natural way. We considered 24 different alphabets in the video sequences and attained a success rate of 96.25%.
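
    One plausible reading of "eigenvalue-weighted Euclidean distance" is to project hand images onto principal components and weight each axis of the Euclidean distance by its eigenvalue. The sketch below implements that reading; it is an assumption on our part, since the abstract does not spell out the formula, and the image sizes and component count are illustrative.

```python
import numpy as np

def pca_features(images, k=5):
    """PCA over flattened training images: returns the mean, the top-k
    eigenvectors of the covariance matrix, and their eigenvalues."""
    X = np.asarray(images, float).reshape(len(images), -1)
    mean = X.mean(axis=0)
    cov = np.cov((X - mean).T)               # (n_pixels, n_pixels) covariance
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:k]       # largest eigenvalues first
    return mean, vecs[:, order], vals[order]

def project(x, mean, vecs):
    """Project one image into the k-dimensional eigenspace."""
    return (np.asarray(x, float).ravel() - mean) @ vecs

def eigen_weighted_distance(p, q, eigvals):
    """Euclidean distance between projections, each axis weighted by
    its eigenvalue (an assumed form of the paper's distance measure)."""
    return float(np.sqrt(np.sum(eigvals * (p - q) ** 2)))
```

    Classification would then assign a test frame the label of the nearest training projection under this weighted distance.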

  15. Functional language and data flow architectures

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Patel, D. R.; Lang, T.

    1983-01-01

    This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages and on sequencing models such as data-flow, demand-driven, and reduction, which are essential at the machine-organization level. Several examples of highly concurrent machines are described.
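    Demand-driven sequencing, one of the models the tutorial covers, can be illustrated with a lazy producer: values are computed only when the consumer requests them, as in reduction and demand-driven machines.

```python
# Demand-driven (lazy) evaluation sketch: the infinite stream of squares is
# never materialized; each value is computed only when the consumer asks.
def squares():
    n = 0
    while True:
        yield n * n
        n += 1

stream = squares()
first_three = [next(stream) for _ in range(3)]
print(first_three)   # → [0, 1, 4]
```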

  16. Web-Based Machine Translation as a Tool for Promoting Electronic Literacy and Language Awareness

    ERIC Educational Resources Information Center

    Williams, Lawrence

    2006-01-01

    This article addresses a pervasive problem of concern to teachers of many foreign languages: the use of Web-Based Machine Translation (WBMT) by students who do not understand the complexities of this relatively new tool. Although networked technologies have greatly increased access to many language and communication tools, WBMT is still…

  17. Ausdruckskraft und Regelmaessigkeit: Was Esperanto fuer automatische Uebersetzung geeignet macht (Expressiveness and Formal Regularity: What Makes Esperanto Suitable for Machine Translation).

    ERIC Educational Resources Information Center

    Schubert, Klaus

    1988-01-01

    Describes DLT, the multilingual machine translation system that uses Esperanto as an intermediate language in which substantial portions of the translation subprocesses are carried out. The criteria for choosing an intermediate language and the reasons for preferring Esperanto over other languages are explained. (Author/DJD)

  18. Text generation from Taiwanese Sign Language using a PST-based language model for augmentative communication.

    PubMed

    Wu, Chung-Hsien; Chiu, Yu-Hsien; Guo, Chi-Shiang

    2004-12-01

    This paper proposes a novel approach to the generation of Chinese sentences from ill-formed Taiwanese Sign Language (TSL) for people with hearing impairments. First, a sign icon-based virtual keyboard is constructed to provide a visualized interface to retrieve sign icons from a sign database. A proposed language model (LM), based on a predictive sentence template (PST) tree, integrates a statistical variable n-gram LM and linguistic constraints to deal with the translation problem from ill-formed sign sequences to grammatical written sentences. The PST tree trained by a corpus collected from the deaf schools was used to model the correspondence between signed and written Chinese. In addition, a set of phrase formation rules, based on trigger pair category, was derived for sentence pattern expansion. These approaches improved the efficiency of text generation and the accuracy of word prediction and, therefore, improved the input rate. For the assessment of practical communication aids, a reading-comprehension training program with ten profoundly deaf students was undertaken in a deaf school in Tainan, Taiwan. Evaluation results show that the literacy aptitude test and subjective satisfactory level are significantly improved.
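    A hedged sketch of the statistical half of such a system: an add-one-smoothed bigram language model ranking candidate expansions of an ill-formed sign sequence. The mini-corpus and English glosses are invented for illustration; the actual system models written Chinese with a PST tree and linguistic constraints.

```python
import math
from collections import Counter

# Hypothetical mini-corpus standing in for the written-sentence training data.
corpus = [
    "i want to drink water",
    "i want to eat rice",
    "you want to drink tea",
]

unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    words = ["<s>"] + sent.split() + ["</s>"]
    unigrams.update(words)
    bigrams.update(zip(words, words[1:]))

V = len(unigrams)   # vocabulary size for add-one smoothing

def logprob(sentence):
    # Add-one-smoothed bigram log-probability of a candidate sentence.
    words = ["<s>"] + sentence.split() + ["</s>"]
    return sum(
        math.log((bigrams[(a, b)] + 1) / (unigrams[a] + V))
        for a, b in zip(words, words[1:])
    )

# Rank candidate expansions of an ill-formed sign-order sequence.
candidates = ["i want to drink water", "water drink i want to"]
best = max(candidates, key=logprob)
print(best)   # → "i want to drink water"
```

    The grammatical ordering scores higher because all of its bigrams were seen in training, while the sign-order candidate is penalized by smoothing.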

  19. Signing Earth Science: Accommodations for Students Who Are Deaf or Hard of Hearing and Whose First Language Is Sign

    NASA Astrophysics Data System (ADS)

    Vesel, J.; Hurdich, J.

    2014-12-01

    TERC and Vcom3D used the SigningAvatar® accessibility software to research and develop a Signing Earth Science Dictionary (SESD) of approximately 750 standards-based Earth science terms for high school students who are deaf and hard of hearing and whose first language is sign. The partners also evaluated the extent to which use of the SESD furthers understanding of Earth science content, command of the language of Earth science, and the ability to study Earth science independently. Disseminated as a Web-based version and App, the SESD is intended to serve the ~36,000 grade 9-12 students who are deaf or hard of hearing and whose first language is sign, the majority of whom leave high school reading at the fifth grade or below. It is also intended for teachers and interpreters who interact with members of this population and professionals working with Earth science education programs during field trips, internships etc. The signed SESD terms have been incorporated into a Mobile Communication App (MCA). This App for Androids is intended to facilitate communication between English speakers and persons who communicate in American Sign Language (ASL) or Signed English. It can translate words, phrases, or whole sentences from written or spoken English to animated signing. It can also fingerspell proper names and other words for which there are no signs. For our presentation, we will demonstrate the interactive features of the SigningAvatar® accessibility software that support the three principles of Universal Design for Learning (UDL) and have been incorporated into the SESD and MCA. Results from national field-tests will provide insight into the SESD's and MCA's potential applicability beyond grade 12 as accommodations that can be used for accessing the vocabulary deaf and hard of hearing students need for study of the geosciences and for facilitating communication about content. This work was funded in part by grants from NSF and the U.S. Department of Education.

  20. Exploring the use of dynamic language assessment with deaf children, who use American Sign Language: Two case studies.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2014-01-01

    We describe a model for assessment of lexical-semantic organization skills in American Sign Language (ASL) within the framework of dynamic vocabulary assessment and discuss the applicability and validity of the use of mediated learning experiences (MLE) with deaf signing children. Two elementary students (ages 7;6 and 8;4) completed a set of four vocabulary tasks and received two 30-minute mediations in ASL. Each session consisted of several scripted activities focusing on the use of categorization. Both had experienced difficulties in providing categorically related responses in one of the vocabulary tasks used previously. Results showed that the two students exhibited notable differences with regard to their learning pace, information uptake, and effort required by the mediator. Furthermore, we observed signs of a shift in strategic behavior by the lower performing student during the second mediation. Results suggest that the use of dynamic assessment procedures in a vocabulary context was helpful in understanding children's strategies as related to learning potential. These results are discussed in terms of deaf children's cognitive modifiability, with implications for planning instruction and how MLE can be used with a population that uses ASL. The reader will (1) recognize the challenges in appropriate language assessment of deaf signing children; (2) recall the three areas explored to investigate whether a dynamic assessment approach is sensitive to differences in deaf signing children's language learning profiles; and (3) discuss how dynamic assessment procedures can make deaf signing children's individual language learning differences visible. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Designing an American Sign Language Avatar for Learning Computer Science Concepts for Deaf or Hard-of-Hearing Students and Deaf Interpreters

    ERIC Educational Resources Information Center

    Andrei, Stefan; Osborne, Lawrence; Smith, Zanthia

    2013-01-01

    The current learning process of Deaf or Hard of Hearing (D/HH) students taking Science, Technology, Engineering, and Mathematics (STEM) courses needs, in general, a sign interpreter for the translation of English text into American Sign Language (ASL) signs. This method is at best impractical due to the lack of availability of a specialized sign…

  2. Quantifying the Efficiency of a Translator: The Effect of Syntactical and Literal Written Translations on Language Comprehension Using the Machine Translation System FALCon (Foreign Area Language Converter)

    ERIC Educational Resources Information Center

    McCulloh, Ian A.; Morton, Jillian; Jantzi, Jennifer K.; Rodriguez, Amy M.; Graham, John

    2008-01-01

    The purpose of this study is to introduce a new method of evaluating human comprehension in the context of machine translation using a language translation program known as the FALCon (Forward Area Language Converter). The FALCon works by converting documents into digital images via scanner, and then converting those images to electronic text by…

  3. Mastering the Pressures of Variation: A Cognitive Linguistic Examination of Advanced Hearing ASL L2 Signers

    ERIC Educational Resources Information Center

    Nadolske, Marie Anne

    2009-01-01

    Despite the fact that American Sign Language (ASL) courses at the college-level have been increasing in frequency, little is understood about the capabilities of hearing individuals learning a sign language as a second language. This study aims to begin assessing the language skills of advanced L2 learners of ASL by comparing L2 signer productions…

  4. Runtime Verification of C Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2008-01-01

    We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language, RCAT, recently developed at the Jet Propulsion Laboratory as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such events. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
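    The core idea of monitoring an execution trace against a state machine can be sketched in a few lines. The states and events below are illustrative only, not RMOR's actual textual syntax, which is woven into C code through CIL.

```python
# Hypothetical monitor: a file must be opened before writes and not used
# after closing. Transitions map (state, event) -> next state; any event
# with no transition from the current state is a property violation.
transitions = {
    ("closed", "open"): "opened",
    ("opened", "write"): "opened",
    ("opened", "close"): "closed",
}

def monitor(trace, state="closed"):
    for event in trace:
        nxt = transitions.get((state, event))
        if nxt is None:
            return f"violation: '{event}' in state '{state}'"
        state = nxt
    return "ok"

print(monitor(["open", "write", "close"]))   # → ok
print(monitor(["open", "close", "write"]))   # → violation: 'write' in state 'closed'
```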

  5. The Nature of Object Marking in American Sign Language

    ERIC Educational Resources Information Center

    Gokgoz, Kadir

    2013-01-01

    In this dissertation, I examine the nature of object marking in American Sign Language (ASL). I investigate object marking by means of directionality (the movement of the verb towards a certain location in signing space) and by means of handling classifiers (certain handshapes accompanying the verb). I propose that object marking in ASL is…

  6. Teaching Children with Language Delays to Say or Sign "More": Promises and Potential Pitfalls

    ERIC Educational Resources Information Center

    Lederer, Susan Hendler

    2018-01-01

    Teaching young children with language delays to say or sign the word "more" has had strong support from the literature since the 1970s (Bloom & Lahey, 1978; Holland, 1975; Lahey & Bloom, 1977; Lederer, 2002). Semantically, teaching children the word/sign "more" is supported by research on early vocabulary development…

  7. Information Status and Word Order in Croatian Sign Language

    ERIC Educational Resources Information Center

    Milkovic, Marina; Bradaric-Joncic, Sandra; Wilbur, Ronnie B.

    2007-01-01

    This paper presents the results of research on information structure and word order in narrative sentences taken from signed short stories in Croatian Sign Language (HZJ). The basic word order in HZJ is SVO. Factors that result in other word orders include: reversible arguments, verb categories, locative constructions, contrastive focus, and prior…

  8. Selected Lexical Patterns in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Young, Lesa; Palmer, Jeffrey Levi; Reynolds, Wanette

    2012-01-01

    This combined paper will focus on the description of two selected lexical patterns in Saudi Arabian Sign Language (SASL): metaphor and metonymy in emotion-related signs (Young) and lexicalization patterns of objects and their derivational roots (Palmer and Reynolds). The overarching methodology used by both studies is detailed in Stephen and…

  9. Writing Signed Languages: What for? What Form? A Response

    ERIC Educational Resources Information Center

    Moores, Donald F.

    2017-01-01

    In his article in an "American Annals of the Deaf" special issue that also includes the present article, Grushkin (EJ1174123) divides his discussion of a written sign system into three basic parts. The first presents arguments against the development of a written form of American Sign Language; the second provides a rationale…

  10. Spanish Sign in the Americas.

    ERIC Educational Resources Information Center

    Schein, Jerome D.

    1995-01-01

    Spanish Sign Language (SSL) is now the second most used sign language. This article introduces resources for the study of SSL, including three SSL dictionaries--two from Argentina and one from Puerto Rico. Differences in SSL between and within the two countries are noted. Implications for deaf educators in North America are drawn. (Author/DB)

  11. A Sensitive Period for the Acquisition of Complex Morphology: Evidence from American Sign Language.

    ERIC Educational Resources Information Center

    Galvan, Dennis

    A study investigated the acquisition of three independent yet simultaneously produced morphological systems in American Sign Language (ASL): the linguistic use of space, the use of classifiers, and inflections for aspect, all information incorporated into the production of a sign. Subjects were 30 deaf children with severe or profound prelingual hearing…

  12. Cross-Modal Bilingualism: Language Contact as Evidence of Linguistic Transfer in Sign Bilingual Education

    ERIC Educational Resources Information Center

    Menendez, Bruno

    2010-01-01

    New positive attitudes towards language interaction in the realm of bilingualism open new horizons for sign bilingual education. Plaza-Pust and Morales-Lopez have innovatively reconceptualised a new cross-disciplinary approach to sign bilingualism, based on both sociolinguistics and psycholinguistics. According to this framework, cross-modal…

  13. An Interactive Astronaut-Robot System with Gesture Control

    PubMed Central

    Liu, Jinguo; Luo, Yifan; Ju, Zhaojie

    2016-01-01

    Human-robot interaction (HRI) plays an important role in future planetary exploration missions, where astronauts with extravehicular activities (EVA) have to communicate with robot assistants by speech-type or gesture-type user interfaces embedded in their space suits. This paper presents an interactive astronaut-robot system integrating a data-glove with a space suit for the astronaut to use hand gestures to control a snake-like robot. A support vector machine (SVM) is employed to recognize hand gestures, and a particle swarm optimization (PSO) algorithm is used to optimize the parameters of the SVM to further improve its recognition accuracy. Various hand gestures from American Sign Language (ASL) have been selected and used to test and validate the performance of the proposed system. PMID:27190503
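    A minimal sketch of the PSO half of the pipeline, minimizing a toy objective that stands in for SVM cross-validation error over its hyperparameters. The inertia and acceleration constants are generic textbook values, not the paper's tuned settings.

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, lo=-5.0, hi=5.0):
    # Standard global-best PSO: each particle tracks its personal best and is
    # pulled toward both it and the swarm's global best.
    rng = random.Random(0)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for "SVM validation error as a function of (C, gamma)".
best, val = pso(lambda p: sum(x * x for x in p))
print(val < 0.05)   # True once the swarm has converged near the optimum
```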

  14. The impact of time on predicate forms in the manual modality: Signers, homesigners, and silent gesturers*

    PubMed Central

    Goldin-Meadow, Susan

    2014-01-01

    It is difficult to create spoken forms that can be understood on the spot. But the manual modality, in large part because of its iconic potential, allows us to construct forms that are immediately understood, thus requiring essentially no time to develop. This paper contrasts manual forms for actions produced over 3 time spans—by silent gesturers who are asked to invent gestures on the spot; by homesigners who have created gesture systems over their lifespans; and by signers who have learned a conventional sign language from other signers—and finds that properties of the predicate differ across these time spans. Silent gesturers use location to establish co-reference in the way established sign languages do, but show little evidence of the segmentation sign languages display in motion forms for manner and path, and little evidence of the finger complexity sign languages display in handshapes in predicates representing events. Homesigners, in contrast, not only use location to establish co-reference, but also display segmentation in their motion forms for manner and path and finger complexity in their object handshapes, although they have not yet decreased finger complexity to the levels found in sign languages in their handling handshapes. The manual modality thus allows us to watch language as it grows, offering insight into factors that may have shaped and may continue to shape human language. PMID:25329421

  15. Deaf children attending different school environments: sign language abilities and theory of mind.

    PubMed

    Tomasuolo, Elena; Valeri, Giovanni; Di Renzo, Alessio; Pasqualetti, Patrizio; Volterra, Virginia

    2013-01-01

    The present study examined whether full access to sign language as a medium for instruction could influence performance in Theory of Mind (ToM) tasks. Three groups of Italian participants (age range: 6-14 years) participated in the study: Two groups of deaf signing children and one group of hearing-speaking children. The two groups of deaf children differed only in their school environment: One group attended a school with a teaching assistant (TA; Sign Language is offered only by the TA to a single deaf child), and the other group attended a bilingual program (Italian Sign Language and Italian). Linguistic abilities and understanding of false belief were assessed using similar materials and procedures in spoken Italian with hearing children and in Italian Sign Language with deaf children. Deaf children attending the bilingual school performed significantly better than deaf children attending school with the TA in tasks assessing lexical comprehension and ToM, whereas the performance of hearing children was in between that of the two deaf groups. As for lexical production, deaf children attending the bilingual school performed significantly better than the two other groups. No significant differences were found between early and late signers or between children with deaf and hearing parents.

  16. Language as a multimodal phenomenon: implications for language learning, processing and evolution.

    PubMed

    Vigliocco, Gabriella; Perniss, Pamela; Vinson, David

    2014-09-19

    Our understanding of the cognitive and neural underpinnings of language has traditionally been firmly based on spoken Indo-European languages and on language studied as speech or text. However, in face-to-face communication, language is multimodal: speech signals are invariably accompanied by visual information on the face and in manual gestures, and sign languages deploy multiple channels (hands, face and body) in utterance construction. Moreover, the narrow focus on spoken Indo-European languages has entrenched the assumption that language is comprised wholly by an arbitrary system of symbols and rules. However, iconicity (i.e. resemblance between aspects of communicative form and meaning) is also present: speakers use iconic gestures when they speak; many non-Indo-European spoken languages exhibit a substantial amount of iconicity in word forms and, finally, iconicity is the norm, rather than the exception in sign languages. This introduction provides the motivation for taking a multimodal approach to the study of language learning, processing and evolution, and discusses the broad implications of shifting our current dominant approaches and assumptions to encompass multimodal expression in both signed and spoken languages. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  17. Ada (Tradename) Compiler Validation Summary Report. International Business Machines Corporation. IBM Development System for the Ada Language for MVS, Version 1.0. IBM 4381 (IBM System/370) under MVS.

    DTIC Science & Technology

    1986-05-05

    AVF-VSR-36.0187 Ada COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation, IBM Development System for the Ada Language for… withdrawn from ACVC Version 1.7 were not run. The compiler was tested using command scripts provided by International Business Machines Corporation. These… APPENDIX A COMPLIANCE STATEMENT: International Business Machines Corporation has submitted the following compliance statement concerning the IBM…

  18. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    PubMed

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  19. The Sociolinguistics of Sign Languages.

    ERIC Educational Resources Information Center

    Lucas, Ceil, Ed.

    This collection of papers examines how sign languages are distributed around the world; what occurs when they come in contact with spoken and written languages, and how signers use them in a variety of situations. Each chapter introduces the key issues in a particular area of inquiry and provides a comprehensive review of the literature. The seven…

  20. The Psychotherapist and the Sign Language Interpreter

    ERIC Educational Resources Information Center

    de Bruin, Ed; Brugmans, Petra

    2006-01-01

    Specialized psychotherapy for deaf people in the Dutch and Western European mental health systems is still a rather young specialism. A key policy principle in Dutch mental health care for the deaf is that they should receive treatment in the language most accessible to them, which is usually Dutch Sign Language (Nederlandse Gebarentaal or NGT).…

  1. Why American Sign Language Gloss Must Matter

    ERIC Educational Resources Information Center

    Supalla, Samuel J.; Cripps, Jody H.; Byrne, Andrew P. J.

    2017-01-01

    Responding to an article by Grushkin (EJ1174123) on how deaf children best learn to read, published, along with the present article, in an "American Annals of the Deaf" special issue, the authors review American Sign Language gloss. Topics include how ASL gloss enables deaf children to learn to read in their own language and…

  2. Language Interdependence between American Sign Language and English: A Review of Empirical Studies

    ERIC Educational Resources Information Center

    Rusher, Melissa Ausbrooks

    2012-01-01

    This study provides a contemporary definition of American Sign Language/English bilingual education (AEBE) and outlines an essential theoretical framework. Included is a history and evolution of the methodology. The author also summarizes the general findings of twenty-six (26) empirical studies conducted in the United States that directly or…

  3. Sign language aphasia from a neurodegenerative disease.

    PubMed

    Falchook, Adam D; Mayberry, Rachel I; Poizner, Howard; Burtis, David Brandon; Doty, Leilani; Heilman, Kenneth M

    2013-01-01

    While Alois Alzheimer recognized the effects of the disease he described on speech and language in his original description of the disease in 1907, the effects of Alzheimer's disease (AD) on language in deaf signers has not previously been reported. We evaluated a 55-year-old right-handed congenitally deaf woman with a 2-year history of progressive memory loss and a deterioration of her ability to communicate in American Sign Language, which she learned at the age of eight. Examination revealed that she had impaired episodic memory as well as marked impairments in the production and comprehension of fingerspelling and grammatically complex sentences. She also had signs of anomia as well as an ideomotor apraxia and visual-spatial dysfunction. This report illustrates the challenges in evaluation of a patient for the presence of degenerative dementia when the person is deaf from birth, uses sign language, and has a late age of primary language acquisition. Although our patient could neither speak nor hear, in many respects her cognitive disorders mirror those of patients with AD who had normally learned to speak.

  4. Sign language indexation within the MPEG-7 framework

    NASA Astrophysics Data System (ADS)

    Zaharia, Titus; Preda, Marius; Preteux, Francoise J.

    1999-06-01

    In this paper, we address the issue of sign language indexation/recognition. Existing tools, like on-line Web dictionaries or other educationally oriented applications, make exclusive use of textual annotations. However, keyword indexing schemes have strong limitations due to the ambiguity of natural language and to the huge effort needed to manually annotate a large amount of data. In order to overcome these drawbacks, we tackle the sign language indexation issue within the MPEG-7 framework and propose an approach based on linguistic properties and characteristics of sign language. The method developed introduces the concept of over-time-stable hand configurations instantiated on natural or synthetic prototypes. The prototypes are indexed by means of a shape descriptor defined as a translation-, rotation-, and scale-invariant Hough transform. A very compact representation is obtained by considering the Fourier transform of the Hough coefficients. This approach has been applied to two data sets consisting of 'Letters' and 'Words', respectively. The accuracy and robustness of the results are discussed, and a complete sign language description schema is proposed.

  5. Neural networks for sign language translation

    NASA Astrophysics Data System (ADS)

    Wilson, Beth J.; Anspach, Gretel

    1993-09-01

    A neural network is used to extract relevant features of sign language from video images of a person communicating in American Sign Language or Signed English. The key features are hand motion, hand location with respect to the body, and handshape. A modular hybrid design is under way to apply various techniques, including neural networks, in the development of a translation system that will facilitate communication between deaf and hearing people. One of the neural networks described here is used to classify video images of handshapes into their linguistic counterpart in American Sign Language. The video image is preprocessed to yield Fourier descriptors that encode the shape of the hand silhouette. These descriptors are then used as inputs to a neural network that classifies their shapes. The network is trained with various examples from different signers and is tested with new images from new signers. The results have shown that for coarse handshape classes, the network is invariant to the type of camera used to film the various signers and to the segmentation technique.
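    The Fourier-descriptor preprocessing can be sketched as follows. A toy elliptical "silhouette" stands in for a real hand contour; after dropping the DC term, taking magnitudes, and normalizing by the first harmonic, the descriptors come out invariant to translation, rotation, and scale.

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    # contour: complex points z = x + iy sampled along a closed silhouette.
    Z = np.fft.fft(contour)
    Z[0] = 0                  # drop DC term  -> translation invariance
    mags = np.abs(Z)          # magnitudes    -> rotation invariance
    mags /= mags[1]           # first-harmonic normalization -> scale invariance
    return mags[1:k + 1]

# Toy "hand silhouette": 64 points on an ellipse.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
shape = 2 * np.cos(t) + 1j * np.sin(t)

d1 = fourier_descriptors(shape)
# The same shape scaled by 3, rotated by 0.7 rad, and shifted by (5, 2).
d2 = fourier_descriptors(3 * shape * np.exp(1j * 0.7) + (5 + 2j))
print(np.allclose(d1, d2))   # → True: descriptors unchanged by the transform
```

    Because the descriptor depends only on normalized harmonic magnitudes, the downstream classifier sees the same input regardless of where the hand sits in the frame.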

  6. Detecting Dementia Through Interactive Computer Avatars

    PubMed Central

    Adachi, Hiroyoshi; Ukita, Norimichi; Ikeda, Manabu; Kazui, Hiroaki; Kudo, Takashi; Nakamura, Satoshi

    2017-01-01

    This paper proposes a new approach to automatically detect dementia. Even though some works have detected dementia from speech and language attributes, most have applied detection using picture descriptions, narratives, and cognitive tasks. In this paper, we propose a new computer avatar with spoken dialog functionalities that produces spoken queries based on the mini-mental state examination, the Wechsler memory scale-revised, and other related neuropsychological questions. We recorded the interactive data of spoken dialogues from 29 participants (14 dementia and 15 healthy controls) and extracted various audiovisual features. We tried to predict dementia using audiovisual features and two machine learning algorithms (support vector machines and logistic regression). Here, we show that the support vector machines outperformed logistic regression, and by using the extracted features they classified the participants into two groups with 0.93 detection performance, as measured by the area under the receiver operating characteristic curve. We also newly identified some contributing features, e.g., gap before speaking, the variations of fundamental frequency, voice quality, and the ratio of smiling. We concluded that our system has the potential to detect dementia through spoken dialog systems and that the system can assist health care workers. In addition, these findings could help medical personnel detect signs of dementia. PMID:29018636
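    The reported metric, area under the ROC curve, equals the probability that a randomly chosen positive case is scored above a randomly chosen negative one (ties counting half). A minimal sketch, with invented labels and classifier scores:

```python
def roc_auc(labels, scores):
    # AUC as the fraction of positive/negative pairs ranked correctly.
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]           # 1 = dementia, 0 = control (toy data)
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(round(roc_auc(labels, scores), 3))   # → 0.889
```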

  7. Our Policies, Their Text: German Language Students' Strategies with and Beliefs about Web-Based Machine Translation

    ERIC Educational Resources Information Center

    White, Kelsey D.; Heidrich, Emily

    2013-01-01

    Most educators are aware that some students utilize web-based machine translators for foreign language assignments; however, little research has been done to determine how and why students utilize these programs, or what the implications are for language learning and teaching. In this mixed-methods study we utilized surveys, a translation task,…

  8. Information Transfer Capacity of Articulators in American Sign Language.

    PubMed

    Malaia, Evie; Borneman, Joshua D; Wilbur, Ronnie B

    2018-03-01

    The ability to convey information is a fundamental property of communicative signals. For sign languages, which are overtly produced with multiple, completely visible articulators, the question arises as to how the various channels co-ordinate and interact with each other. We analyze motion capture data of American Sign Language (ASL) narratives, and show that the capacity of information throughput, mathematically defined, is highest on the dominant hand (DH). We further demonstrate that information transfer capacity is also significant for the non-dominant hand (NDH), and the head channel too, as compared to control channels (ankles). We discuss both redundancy and independence in articulator motion in sign language, and argue that the NDH and the head articulators contribute to the overall information transfer capacity, indicating that they are neither completely redundant to, nor completely independent of, the DH.
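    As a crude stand-in for the paper's mathematically defined throughput measure, the Shannon entropy of a quantized motion signal already separates a busy articulator channel from a near-static control channel. The sample sequences below are invented for illustration.

```python
import math
from collections import Counter

def entropy(samples):
    # Shannon entropy (bits) of a quantized motion signal.
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

dominant_hand = [0, 3, 1, 2, 3, 0, 2, 1]   # varied motion  -> high entropy
ankle         = [0, 0, 0, 1, 0, 0, 0, 1]   # near-constant  -> low entropy
print(entropy(dominant_hand) > entropy(ankle))   # → True
```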

  9. Telesign: a videophone system for sign language distant communication

    NASA Astrophysics Data System (ADS)

    Mozelle, Gerard; Preteux, Francoise J.; Viallet, Jean-Emmanuel

    1998-09-01

    This paper presents a low bit rate videophone system for deaf people communicating by means of sign language. Classic video conferencing systems have focused on head-and-shoulders sequences, which are not well suited for sign language video transmission since hearing-impaired people also use their hands and arms to communicate. To address the above-mentioned functionality, we have developed a two-step content-based video coding system based on: (1) a segmentation step, in which four or five video objects (VOs) are extracted using a cooperative approach between color-based and morphological segmentation; and (2) a coding step, in which the VOs are encoded using a standardized MPEG-4 video toolbox. Results of encoded sign language video sequences, presented for three target bit rates (32 kbits/s, 48 kbits/s and 64 kbits/s), demonstrate the efficiency of the approach presented in this paper.

  10. The Biology of Linguistic Expression Impacts Neural Correlates for Spatial Language

    PubMed Central

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Ponto, Laura L. B.; Grabowski, Thomas J.

    2013-01-01

    Biological differences between signed and spoken languages may be most evident in the expression of spatial information. PET was used to investigate the neural substrates supporting the production of spatial language in American Sign Language as expressed by classifier constructions, in which handshape indicates object type and the location/motion of the hand iconically depicts the location/motion of a referent object. Deaf native signers performed a picture description task in which they overtly named objects or produced classifier constructions that varied in location, motion, or object type. In contrast to the expression of location and motion, the production of both lexical signs and object type classifier morphemes engaged left inferior frontal cortex and left inferior temporal cortex, supporting the hypothesis that unlike the location and motion components of a classifier construction, classifier handshapes are categorical morphemes that are retrieved via left hemisphere language regions. In addition, lexical signs engaged the anterior temporal lobes to a greater extent than classifier constructions, which we suggest reflects increased semantic processing required to name individual objects compared with simply indicating the type of object. Both location and motion classifier constructions engaged bilateral superior parietal cortex, with some evidence that the expression of static locations differentially engaged the left intraparietal sulcus. We argue that bilateral parietal activation reflects the biological underpinnings of sign language. To express spatial information, signers must transform visual–spatial representations into a body-centered reference frame and reach toward target locations within signing space. PMID:23249348

  11. A Crosslinguistic, Crosscultural Analysis of Metaphors in Two Italian Sign Language (LIS) Registers

    ERIC Educational Resources Information Center

    Russo, Tommaso

    2005-01-01

    This article deals with two main topics: the interplay of iconicity and metaphors in signed language discourse and the relevance of sociocultural knowledge for a full understanding of LIS metaphors. In metaphors, the iconic features of signs play a role in the creative process of determining a mental fit between two different domains. Iconicity…

  12. Sign Language Subtitling by Highly Comprehensible "Semantroids"

    ERIC Educational Resources Information Center

    Adamo-Villani, Nicolleta; Beni, Gerardo

    2007-01-01

    We introduce a new method of sign language subtitling aimed at young deaf children who have not acquired reading skills yet, and can communicate only via signs. The method is based on: 1) the recently developed concept of "semantroid[TM]" (an animated 3D avatar limited to head and hands); 2) the design, development, and psychophysical evaluation…

  13. An Intelligent Computer-Based System for Sign Language Tutoring

    ERIC Educational Resources Information Center

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  14. An Investigation of the Need for Sign Language Assessment in Deaf Education

    ERIC Educational Resources Information Center

    Mann, Wolfgang; Prinz, Philip M.

    2006-01-01

The attitudes of educators of the deaf and other professionals in deaf education concerning assessment of the use of American Sign Language (ASL) and other sign systems were investigated. A questionnaire was distributed to teachers in a residential school for the deaf in California. In addition to questions regarding the availability of sign…

  15. A Lexical Comparison of Signs from Icelandic and Danish Sign Languages

    ERIC Educational Resources Information Center

    Aldersson, Russell R.; McEntee-Atalianis, Lisa J.

    2008-01-01

    This article reports on a comparison of lexical items in the vocabulary of Icelandic and Danish sign languages prompted by anecdotal reports of similarity and historical records detailing close contact between the two communities. Drawing on previous studies, including Bickford (2005), McKee and Kennedy (1998, 2000a, 2000b) and Parkhurst and…

  16. Sign Language Recognition and Translation: A Multidisciplined Approach from the Field of Artificial Intelligence

    ERIC Educational Resources Information Center

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based…

  17. Relationship between the linguistic environments and early bilingual language development of hearing children in deaf-parented families.

    PubMed

    Kanto, Laura; Huttunen, Kerttu; Laakso, Marja-Leena

    2013-04-01

    We explored variation in the linguistic environments of hearing children of Deaf parents and how it was associated with their early bilingual language development. For that purpose we followed up the children's productive vocabulary (measured with the MCDI; MacArthur Communicative Development Inventory) and syntactic complexity (measured with the MLU10; mean length of the 10 longest utterances the child produced during videorecorded play sessions) in both Finnish Sign Language and spoken Finnish between the ages of 12 and 30 months. Additionally, we developed new methodology for describing the linguistic environments of the children (N = 10). Large variation was uncovered in both the amount and type of language input and language acquisition among the children. Language exposure and increases in productive vocabulary and syntactic complexity were interconnected. Language acquisition was found to be more dependent on the amount of exposure in sign language than in spoken language. This was judged to be related to the status of sign language as a minority language. The results are discussed in terms of parents' language choices, family dynamics in Deaf-parented families and optimal conditions for bilingual development.

  18. Cochlear implantation (CI) for prelingual deafness: the relevance of studies of brain organization and the role of first language acquisition in considering outcome success.

    PubMed

    Campbell, Ruth; MacSweeney, Mairéad; Woll, Bencie

    2014-01-01

Cochlear implantation (CI) for profound congenital hearing impairment, while often successful in restoring hearing to the deaf child, does not always result in effective speech processing. Exposure to non-auditory signals during the pre-implantation period is widely held to be responsible for such failures. Here, we question the inference that such exposure irreparably distorts the function of auditory cortex, negatively impacting the efficacy of CI. Animal studies suggest that in congenital early deafness there is a disconnection between (disordered) activation in primary auditory cortex (A1) and activation in secondary auditory cortex (A2). In humans, one factor contributing to this functional decoupling is assumed to be abnormal activation of A1 by visual projections, including exposure to sign language. In this paper we show that this abnormal activation of A1 does not routinely occur, while A2 functions effectively supramodally and multimodally to deliver spoken language irrespective of hearing status. What, then, is responsible for poor outcomes for some individuals with CI and for apparent abnormalities in cortical organization in these people? Since infancy is a critical period for the acquisition of language, deaf children born to hearing parents are at risk of developing inefficient neural structures to support skilled language processing. A sign language, acquired by a deaf child as a first language in a signing environment, is cortically organized like a heard spoken language in terms of specialization of the dominant perisylvian system. However, very few deaf children are exposed to sign language in early infancy. Moreover, no studies to date have examined sign language proficiency in relation to cortical organization in individuals with CI.
Given the paucity of such relevant findings, we suggest that the best guarantee of good language outcome after CI is the establishment of a secure first language pre-implant-however that may be achieved, and whatever the success of auditory restoration.

  19. Cochlear implantation (CI) for prelingual deafness: the relevance of studies of brain organization and the role of first language acquisition in considering outcome success

    PubMed Central

    Campbell, Ruth; MacSweeney, Mairéad; Woll, Bencie

    2014-01-01

Cochlear implantation (CI) for profound congenital hearing impairment, while often successful in restoring hearing to the deaf child, does not always result in effective speech processing. Exposure to non-auditory signals during the pre-implantation period is widely held to be responsible for such failures. Here, we question the inference that such exposure irreparably distorts the function of auditory cortex, negatively impacting the efficacy of CI. Animal studies suggest that in congenital early deafness there is a disconnection between (disordered) activation in primary auditory cortex (A1) and activation in secondary auditory cortex (A2). In humans, one factor contributing to this functional decoupling is assumed to be abnormal activation of A1 by visual projections, including exposure to sign language. In this paper we show that this abnormal activation of A1 does not routinely occur, while A2 functions effectively supramodally and multimodally to deliver spoken language irrespective of hearing status. What, then, is responsible for poor outcomes for some individuals with CI and for apparent abnormalities in cortical organization in these people? Since infancy is a critical period for the acquisition of language, deaf children born to hearing parents are at risk of developing inefficient neural structures to support skilled language processing. A sign language, acquired by a deaf child as a first language in a signing environment, is cortically organized like a heard spoken language in terms of specialization of the dominant perisylvian system. However, very few deaf children are exposed to sign language in early infancy. Moreover, no studies to date have examined sign language proficiency in relation to cortical organization in individuals with CI.
Given the paucity of such relevant findings, we suggest that the best guarantee of good language outcome after CI is the establishment of a secure first language pre-implant—however that may be achieved, and whatever the success of auditory restoration. PMID:25368567

  20. Sign language in dental education-A new nexus.

    PubMed

    Jones, T; Cumberbatch, K

    2017-08-14

The introduction of the landmark mandatory teaching of sign language to undergraduate dental students at the University of the West Indies (UWI), Mona Campus in Kingston, Jamaica, to bridge the communication gap between dentists and their patients is reviewed. A review of over 90 Doctor of Dental Surgery and Doctor of Dental Medicine curricula in North America, the United Kingdom, parts of Europe and Australia showed no inclusion of sign language as a mandatory component. In Jamaica, the government's training school for dental auxiliaries served as the forerunner to the UWI's introduction of formal sign language training in 2012. Outside of the UWI, a few dental schools offer sign language courses, but none has a mandatory programme like the one at the UWI. Dentists the world over have had to rely on interpreters to sign with their deaf patients. Deaf Jamaicans have resented the fact that dentists cannot sign; feeling insulted, they have visited the dentist only in emergencies. The mandatory inclusion of sign language in the Undergraduate Dental Programme curriculum at The University of the West Indies, Mona Campus, sought to establish a direct communication channel to formally bridge this gap. The programme of two sign language courses and a direct clinical competency requirement was developed during the second year of the first cohort of the newly introduced undergraduate dental programme through a collaborating partnership between two faculties on the Mona Campus. The programme was introduced in 2012 in the third year of the 5-year undergraduate dental programme. To date, two cohorts have completed the programme, and the preliminary findings from an ongoing clinical study have shown a positive impact on dental care access and dental treatment for deaf patients at the UWI Mona Dental Polyclinic.
The development of a direct communication channel between dental students and the deaf that has led to increased dental access and treatment for the deaf can be extended to dentists and to other dental students globally. The vision is that similar courses will be introduced in other health training programmes at the UWI, and conceivably, in other institutions. The small sample size allows for informative, but not definitive, conclusions to be drawn. The mandatory inclusion of sign language and Deaf culture in the dental curricula has not just removed a communication barrier, but has assisted in the empathetic and ethical development of the dental student. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Health Promotion and Healthier Products Increase Vending Purchases: A Randomized Factorial Trial.

    PubMed

    Hua, Sophia V; Kimmel, Lisa; Van Emmenes, Michael; Taherian, Rafi; Remer, Geraldine; Millman, Adam; Ickovics, Jeannette R

    2017-07-01

    The current food environment has a high prevalence of nutrient-sparse foods and beverages, most starkly seen in vending machine offerings. There are currently few studies that explore different interventions that might lead to healthier vending machine purchases. To examine how healthier product availability, price reductions, and/or promotional signs affect sales and revenue of snack and beverage vending machines. A 2×2×2 factorial randomized controlled trial was conducted. Students, staff, and employees on a university campus. All co-located snack and beverage vending machines (n=56, 28 snack and 28 beverage) were randomized into one of eight conditions: availability of healthier products and/or 25% price reduction for healthier items and/or promotional signs on machines. Aggregate sales and revenue data for the 5-month study period (February to June 2015) were compared with data from the same months 1 year prior. Analyses were conducted July 2015. The change in units sold and revenue between February through June 2014 and 2015. Linear regression models (main effects and interaction effects) and t test analyses were performed. The interaction between healthier product guidelines and promotional signs in snack vending machines documented increased revenue (P<0.05). Beverage machines randomized to meet healthier product guidelines documented increased units sold (P<0.05) with no revenue change. Price reductions alone had no effect, nor were there any effects for the three-way interaction of the factors. Examining top-selling products for all vending machines combined, pre- to postintervention, we found an overall shift to healthier purchasing. When healthier vending snacks are available, promotional signs are also important to ensure consumers purchase those items in greater amounts. Mitigating potential loss in profits is essential for sustainability of a healthier food environment. Copyright © 2017 Academy of Nutrition and Dietetics. 
Published by Elsevier Inc. All rights reserved.
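The 2×2×2 factorial assignment described in the abstract above can be sketched as follows. The machine identifiers, seed, and round-robin balancing scheme are illustrative, not taken from the study.

```python
import itertools
import random

FACTORS = ("healthier_items", "price_cut_25pct", "promo_signs")
CELLS = list(itertools.product((False, True), repeat=3))  # 2x2x2 = 8 cells

def randomize(machine_ids, seed=2015):
    """Shuffle machines, then deal them round-robin into the 8 cells
    so that cell sizes stay as balanced as possible."""
    rng = random.Random(seed)
    ids = list(machine_ids)
    rng.shuffle(ids)
    return {m: dict(zip(FACTORS, CELLS[i % len(CELLS)]))
            for i, m in enumerate(ids)}

# 28 hypothetical snack machines spread over the 8 conditions.
snack = randomize(f"snack-{n:02d}" for n in range(28))
```

With 28 machines and 8 cells, each condition receives 3 or 4 machines, matching the kind of balanced allocation a factorial trial needs.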

  2. Preference for language in early infancy: the human language bias is not speech specific.

    PubMed

    Krentz, Ursula C; Corina, David P

    2008-01-01

    Fundamental to infants' acquisition of their native language is an inherent interest in the language spoken around them over non-linguistic environmental sounds. The following studies explored whether the bias for linguistic signals in hearing infants is specific to speech, or reflects a general bias for all human language, spoken and signed. Results indicate that 6-month-old infants prefer an unfamiliar, visual-gestural language (American Sign Language) over non-linguistic pantomime, but 10-month-olds do not. These data provide evidence against a speech-specific bias in early infancy and provide insights into those properties of human languages that may underlie this language-general attentional bias.

  3. Distinctive Feature Extraction for Indian Sign Language (ISL) Gesture using Scale Invariant Feature Transform (SIFT)

    NASA Astrophysics Data System (ADS)

    Patil, Sandeep Baburao; Sinha, G. R.

    2017-02-01

In India, limited awareness of deafness widens the communication gap between the deaf and hard-of-hearing community and the hearing population. Sign language enables deaf and hard-of-hearing people to convey messages by producing distinct sign patterns. The scale invariant feature transform (SIFT) was introduced by David Lowe to perform reliable matching between different images of the same object. This paper implements the various phases of SIFT to extract distinctive features from Indian Sign Language gestures. The experimental results report the time taken by each phase and the number of features extracted for 26 ISL gestures.
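SIFT-based recognition pipelines like the one above typically end with Lowe's ratio test on the extracted descriptors. A self-contained sketch of that matching step, using random 128-D vectors as stand-ins for real descriptors (a real pipeline would compute them from gesture images, e.g. with OpenCV):

```python
import numpy as np

def ratio_match(desc_a, desc_b, ratio=0.75):
    """For each descriptor in desc_a, accept its nearest neighbour in
    desc_b only if it is clearly closer than the second-nearest
    candidate (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j, k = np.argsort(dists)[:2]
        if dists[j] < ratio * dists[k]:
            matches.append((i, int(j)))
    return matches

rng = np.random.default_rng(42)
gallery = rng.normal(size=(40, 128))                  # stand-in descriptors
probe = gallery[:5] + rng.normal(0, 0.01, (5, 128))   # noisy copies of five
```

Each noisy probe descriptor recovers its source in the gallery, while descriptors without a distinctly close neighbour are rejected rather than mismatched.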

  4. Kinect-based sign language recognition of static and dynamic hand movements

    NASA Astrophysics Data System (ADS)

    Dalawis, Rando C.; Olayao, Kenneth Deniel R.; Ramos, Evan Geoffrey I.; Samonte, Mary Jane C.

    2017-02-01

A different approach to sign language recognition of static and dynamic hand movements was developed in this study using a normalized correlation algorithm. The goal of this research was to translate fingerspelled sign language into text using MATLAB and a Microsoft Kinect. Digital input images captured by the Kinect device are matched against template samples stored in a database. This Human-Computer Interaction (HCI) prototype was developed to help people with communication disabilities express their thoughts with ease. Frame segmentation and feature extraction were used to give meaning to the captured images. Sequential and random testing was used to evaluate both static and dynamic fingerspelling gestures. The researchers also discuss factors they encountered that caused some misclassification of signs.
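The core of normalized-correlation template matching can be sketched in pure NumPy. The study's MATLAB/Kinect pipeline is not reproduced here; the image and template below are synthetic stand-ins.

```python
import numpy as np

def ncc(image, template):
    """Slide `template` over `image` and return the (row, col) of the
    window with the highest zero-mean normalized correlation score."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.linalg.norm(t)
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.linalg.norm(wz) * tn
            if denom == 0.0:
                continue  # flat window: correlation undefined, skip it
            score = float((wz * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

rng = np.random.default_rng(1)
img = rng.random((40, 40))
tmpl = img[10:18, 22:30].copy()  # template cut from a known location
```

Because the template is an exact crop, the best score is 1.0 at (10, 22); in practice the score degrades gracefully under the lighting and pose variation a Kinect capture introduces.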

  5. Age of acquisition effects on the functional organization of language in the adult brain.

    PubMed

    Mayberry, Rachel I; Chen, Jen-Kai; Witcher, Pamela; Klein, Denise

    2011-10-01

    Using functional magnetic resonance imaging (fMRI), we neuroimaged deaf adults as they performed two linguistic tasks with sentences in American Sign Language, grammatical judgment and phonemic-hand judgment. Participants' age-onset of sign language acquisition ranged from birth to 14 years; length of sign language experience was substantial and did not vary in relation to age of acquisition. For both tasks, a more left lateralized pattern of activation was observed, with activity for grammatical judgment being more anterior than that observed for phonemic-hand judgment, which was more posterior by comparison. Age of acquisition was linearly and negatively related to activation levels in anterior language regions and positively related to activation levels in posterior visual regions for both tasks. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Increasing Literacy Skills for Students with Intellectual and Developmental Disabilities: Effects of Integrating Comprehensive Reading Instruction with Sign Language

    ERIC Educational Resources Information Center

    Beecher, Larissa; Childre, Amy

    2012-01-01

    This study evaluated the impact of a comprehensive reading program enhanced with sign language on the literacy and language skills of three elementary school students with intellectual and developmental disabilities. Students received individual and small group comprehensive reading instruction for approximately 55 minutes per session. Reading…

  7. Language between Bodies: A Cognitive Approach to Understanding Linguistic Politeness in American Sign Language

    ERIC Educational Resources Information Center

    Roush, Daniel R.

    2011-01-01

    This article proposes an answer to the primary question of how the American Sign Language (ASL) community in the United States conceptualizes (im)politeness and its related notions. It begins with a review of evolving theoretical issues in research on (im)politeness and related methodological problems with studying (im)politeness in natural…

  8. Mobile Sign Language Learning Outside the Classroom

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2012-01-01

    The majority of deaf children in the United States are born to hearing parents with limited prior exposure to American Sign Language (ASL). Our research involves creating and validating a mobile language tool called SMARTSign. The goal is to help hearing parents learn ASL in a way that fits seamlessly into their daily routine. (Contains 3 figures.)

  9. A Barking Dog That Never Bites? The British Sign Language (Scotland) Bill

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2015-01-01

    This article describes and analyses the pathway to the British Sign Language (Scotland) Bill and the strategies used to reach it. Data collection has been done by means of interviews with key players, analysis of official documents, and participant observation. The article discusses the bill in relation to the Gaelic Language (Scotland) Act 2005…

  10. On Selected Morphemes in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Morris, Carla; Schneider, Erin

    2012-01-01

    Following a year of study of Saudi Arabian Sign Language (SASL), we are documenting our findings to provide a grammatical sketch of the language. This paper represents one part of that endeavor and focuses on a description of selected morphemes, both manual and non-manual, that have appeared in the course of data collection. While some of the…

  11. ME . . . ME . . . WASHOE: An Appreciation

    ERIC Educational Resources Information Center

    King, Barbara J.

    2008-01-01

    Washoe, the chimpanzee pioneer who learned aspects of American Sign Language, died in October 2007. In reviewing her life and accomplishments, this article focuses on Washoe's status as an ape and a person, and on the role of emotion in language learning and language use. It argues that Washoe's legacy stems not from the number of ASL signs she…

  12. Type of iconicity matters in the vocabulary development of signing children.

    PubMed

    Ortega, Gerardo; Sümer, Beyza; Özyürek, Aslı

    2017-01-01

    Recent research on signed as well as spoken language shows that the iconic features of the target language might play a role in language development. Here, we ask further whether different types of iconic depictions modulate children's preferences for certain types of sign-referent links during vocabulary development in sign language. Results from a picture description task indicate that lexical signs with 2 possible variants are used in different proportions by deaf signers from different age groups. While preschool and school-age children favored variants representing actions associated with their referent (e.g., a writing hand for the sign PEN), adults preferred variants representing the perceptual features of those objects (e.g., upward index finger representing a thin, elongated object for the sign PEN). Deaf parents interacting with their children, however, used action- and perceptual-based variants in equal proportion and favored action variants more than adults signing to other adults. We propose that when children are confronted with 2 variants for the same concept, they initially prefer action-based variants because they give them the opportunity to link a linguistic label to familiar schemas linked to their action/motor experiences. Our results echo findings showing a bias for action-based depictions in the development of iconic co-speech gestures suggesting a modality bias for such representations during development. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Neural Basis of Action Understanding: Evidence from Sign Language Aphasia.

    PubMed

    Rogalsky, Corianne; Raphel, Kristin; Tomkovicz, Vivian; O'Grady, Lucinda; Damasio, Hanna; Bellugi, Ursula; Hickok, Gregory

    2013-01-01

The neural basis of action understanding is a hotly debated issue. The mirror neuron account holds that motor simulation in fronto-parietal circuits is critical to action understanding including speech comprehension, while others emphasize the ventral stream in the temporal lobe. Evidence from speech strongly supports the ventral stream account, but on the other hand, evidence from manual gesture comprehension (e.g., in limb apraxia) has led to contradictory findings. Here we present a lesion analysis of sign language comprehension. Sign language is an excellent model for studying mirror system function in that it bridges the gap between the visual-manual system in which mirror neurons are best characterized and language systems which have represented a theoretical target of mirror neuron research. Twenty-one lifelong deaf signers with focal cortical lesions performed two tasks: one involving the comprehension of individual signs and the other involving comprehension of signed sentences (commands). Participants' lesions, as indicated on MRI or CT scans, were mapped onto a template brain to explore the relationship between lesion location and sign comprehension measures. Single sign comprehension was not significantly affected by left hemisphere damage. Sentence sign comprehension impairments were associated with left temporal-parietal damage. We found that damage to mirror-system-related regions in the left frontal lobe was not associated with deficits on either of these comprehension tasks. We conclude that the mirror system is not critically involved in action understanding.

  14. Quantifying the Efficiency of a Translator: The Effect of Syntactical and Literal Written Translations on Language Comprehension Using the Machine Translation System FALCon

    ERIC Educational Resources Information Center

    McCulloh, Ian A.; Morton, Jillian; Jantzi, Jennifer K.; Rodriguez, Amy M.; Graham, John

    2008-01-01

    This study introduces a new method of evaluating human comprehension in the context of machine translation using a language translation program known as the FALCon (Forward Area Language Converter). The participants include 48 freshmen from the United States Military Academy enrolled in the General Psychology course, PL100. Results of this study…

  15. Language Acquisition and Machine Learning.

    DTIC Science & Technology

    1986-02-01

machine learning and examine its implications for computational models of language acquisition. As a framework for understanding this research, the authors propose four component tasks involved in learning from experience: aggregation, clustering, characterization, and storage. They then consider four common problems studied by machine learning researchers (learning from examples, heuristic learning, conceptual clustering, and learning macro-operators), describing each in terms of their framework. After this, they turn to the problem of grammar

  16. Family therapy with deaf persons: the systemic utilization of an interpreter.

    PubMed

    Harvey, M A

    1984-06-01

This paper discusses the theory and practice of providing family therapy to families in which there are hearing parents and at least one Deaf child, particularly regarding the optimal utilization of an interpreter. The therapist must be knowledgeable about the psychosocial effects of deafness and the cultural aspects of deafness, and preferably be able to use American Sign Language and Signed English. The therapeutic benefit of utilizing an interpreter extends far beyond simply facilitating communication between family members whose primary language is either spoken English or Sign Language. The presence of an interpreter helps the therapist to modify family rules that deny the implications of deafness and prohibit the use of Sign Language, to modify the balance of power in the family, and to encourage participants to exhibit the ego defense mechanisms of projection and transference. The family therapist can utilize these subtle yet profound influences to therapeutic advantage.

  17. Prediction in a visual language: real-time sentence processing in American Sign Language across development.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Mayberry, Rachel I

    2018-01-01

Prediction during sign language comprehension may enable signers to integrate linguistic and non-linguistic information within the visual modality. In two eyetracking experiments, we investigated American Sign Language (ASL) semantic prediction in deaf adults and children (aged 4-8 years). Participants viewed ASL sentences in a visual world paradigm in which the sentence-initial verb was either neutral or constrained relative to the sentence-final target noun. Adults and children made anticipatory looks to the target picture before the onset of the target noun in the constrained condition only, showing evidence for semantic prediction. Crucially, signers alternated gaze between the stimulus sign and the target picture only when the sentential object could be predicted from the verb. Signers therefore engage in prediction by optimizing visual attention between divided linguistic and referential signals. These patterns suggest that prediction is a modality-independent process, and theoretical implications are discussed.

  18. [Information technology in learning sign language].

    PubMed

    Hernández, Cesar; Pulido, Jose L; Arias, Jorge E

    2015-01-01

To develop a technological tool that improves the initial learning of sign language in hearing-impaired children. The research was conducted in three phases: requirements elicitation; design and development of the proposed device; and validation and evaluation of the device. Through the use of information technology and with the advice of special-education professionals, we developed an electronic device that facilitates the learning of sign language in deaf children. It consists mainly of a graphic touch screen, a voice synthesizer, and a voice recognition system. Validation was performed with deaf children at the Filadelfia School in Bogotá. A learning methodology was established that improves learning times through a small, portable, lightweight educational prototype. Tests showed the effectiveness of the prototype, achieving a 32% reduction in the initial learning time of sign language for deaf children.

  19. Bridge of Signs: Can Sign Language Empower Non-Deaf Children to Triumph over Their Communication Disabilities?

    ERIC Educational Resources Information Center

    Toth, Anne

    2009-01-01

    This pilot research project examined the use of sign language as a communication bridge for non-Deaf children between the ages of 0-6 years who had been diagnosed with, or whose communication difficulties suggested, the presence of such disorders as Autism, Down Syndrome, Fetal Alcohol Spectrum Disorder (FASD), and/or learning disabilities.…

  20. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a base of videos whose aim is to support the development of tools that promote genuine social inclusion of the deaf.
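    A minimal sketch of what one phonology-coded entry in such an XML model might look like, built with Python's standard library. All element and attribute names here (sign, handshape, location, movement, nonmanual), and the example sign, are invented for illustration and are not the authors' actual schema.

    ```python
    # Hypothetical phonology-based XML record for a Libras sign, in the
    # spirit of the model described above. Element/attribute names are
    # assumptions, not the published schema.
    import xml.etree.ElementTree as ET

    def make_sign_entry(gloss, handshape, location, movement, nonmanual):
        """Build one <sign> element coded by its phonological parameters."""
        sign = ET.Element("sign", gloss=gloss)
        ET.SubElement(sign, "handshape").text = handshape
        ET.SubElement(sign, "location").text = location
        ET.SubElement(sign, "movement").text = movement
        ET.SubElement(sign, "nonmanual").text = nonmanual
        return sign

    entry = make_sign_entry("CASA", "B", "neutral-space", "contact-repeated", "neutral")
    xml_text = ET.tostring(entry, encoding="unicode")
    print(xml_text)
    ```

    A video base could then be indexed by these parameters, so tools (e.g. a sign search by handshape) can query the collection structurally rather than by gloss alone.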

  1. Deaf Children Attending Different School Environments: Sign Language Abilities and Theory of Mind

    ERIC Educational Resources Information Center

    Tomasuolo, Elena; Valeri, Giovanni; Di Renzo, Alessio; Pasqualetti, Patrizio; Volterra, Virginia

    2013-01-01

    The present study examined whether full access to sign language as a medium for instruction could influence performance in Theory of Mind (ToM) tasks. Three groups of Italian participants (age range: 6-14 years) participated in the study: Two groups of deaf signing children and one group of hearing-speaking children. The two groups of deaf…

  2. LSE-Sign: A lexical database for Spanish Sign Language.

    PubMed

    Gutierrez-Sigut, Eva; Costello, Brendan; Baus, Cristina; Carreiras, Manuel

    2016-03-01

    The LSE-Sign database is a free online tool for selecting Spanish Sign Language stimulus materials to be used in experiments. It contains 2,400 individual signs taken from a recent standardized LSE dictionary, and a further 2,700 related nonsigns. Each entry is coded for a wide range of grammatical, phonological, and articulatory information, including handshape, location, movement, and non-manual elements. The database is accessible via a graphically based search facility which is highly flexible both in terms of the search options available and the way the results are displayed. LSE-Sign is available at the following website: http://www.bcbl.eu/databases/lse/.

  3. Ada Compiler Validation Summary Report: Certificate Number 880318W1. 09042, International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM 4381 under MVS/XA, Host and Target

    DTIC Science & Technology

    1988-03-28

    International Business Machines Corporation IBM Development System for the Ada Language, Version 2.1.0 IBM 4381 under MVS/XA, host and target Completion...Joint Program Office, AJPO...International Business Machines Corporation...in the compiler listed in this declaration. I declare that International Business Machines Corporation is the owner of record of the object code of

  4. Ada (Tradename) Compiler Validation Summary Report. International Business Machines Corporation. IBM Development System for the Ada Language for VM/CMS, Version 1.0. IBM 4381 (IBM System/370) under VM/CMS.

    DTIC Science & Technology

    1986-04-29

    COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation IBM Development System for the Ada Language for VM/CMS, Version 1.0 IBM 4381...tested using command scripts provided by International Business Machines Corporation. These scripts were reviewed by the validation team. Tests were run...s): IBM 4381 (System/370) Operating System: VM/CMS, release 3.6 International Business Machines Corporation has made no deliberate extensions to the

  5. Modeling the Emergence of Lexicons in Homesign Systems

    PubMed Central

    Richie, Russell; Yang, Charles; Coppola, Marie

    2014-01-01

    It is largely acknowledged that natural languages emerge from not just human brains, but also from rich communities of interacting human brains (Senghas, 2005). Yet the precise role of such communities and such interaction in the emergence of core properties of language has largely gone uninvestigated in naturally emerging systems, leaving the few existing computational investigations of this issue in an artificial setting. Here we take a step towards investigating the precise role of community structure in the emergence of linguistic conventions with both naturalistic empirical data and computational modeling. We first show conventionalization of lexicons in two different classes of naturally emerging signed systems: (1) protolinguistic “homesigns” invented by linguistically isolated Deaf individuals, and (2) a natural sign language emerging in a recently formed rich Deaf community. We find that the latter conventionalized faster than the former. Second, we model conventionalization as a population of interacting individuals who adjust their probability of sign use in response to other individuals' actual sign use, following an independently motivated model of language learning (Yang 2002, 2004). Simulations suggest that a richer social network, like that of natural (signed) languages, conventionalizes faster than a sparser social network, like that of homesign systems. We discuss our behavioral and computational results in light of other work on language emergence, and other work on behavior in complex networks. PMID:24482343
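    The learning dynamic summarized above, agents adjusting their probability of sign use in response to other agents' actual use, can be sketched as a toy simulation over two network topologies. The network sizes, update rule, learning rate, and convergence threshold below are simplifying assumptions for illustration, not the authors' model or parameters.

    ```python
    # Toy conventionalization on a rich vs. sparse social network: each
    # agent holds a probability of producing variant A; listeners shift
    # linearly toward the variant they actually hear (a Yang-style
    # reward scheme, greatly simplified).
    import random

    def simulate(edges, n, rate=0.2, threshold=0.95, max_steps=20000, seed=1):
        """Interactions until every agent uses one variant near-categorically."""
        rng = random.Random(seed)
        p = [0.5] * n  # each agent's probability of producing variant A
        for step in range(1, max_steps + 1):
            speaker, listener = rng.choice(edges)
            heard_a = rng.random() < p[speaker]
            # Linear reward: listener shifts toward the variant heard.
            p[listener] += rate * ((1.0 if heard_a else 0.0) - p[listener])
            if all(q > threshold or q < 1.0 - threshold for q in p):
                return step
        return max_steps

    n = 6
    dense = [(i, j) for i in range(n) for j in range(n) if i != j]  # rich community
    sparse = [(i, i + 1) for i in range(0, n, 2)] + [(i + 1, i) for i in range(0, n, 2)]  # isolated pairs
    steps_dense, steps_sparse = simulate(dense, n), simulate(sparse, n)
    print(steps_dense, steps_sparse)
    ```

    Comparing the step counts across many seeds would be the toy analogue of the paper's dense-versus-sparse comparison.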

  6. Dictionary Based Machine Translation from Kannada to Telugu

    NASA Astrophysics Data System (ADS)

    Sindhu, D. V.; Sagar, B. M.

    2017-08-01

    Machine Translation is the task of translating from one language to another. For languages with limited linguistic resources, such as Kannada and Telugu, a dictionary-based approach is the most practical. This paper focuses on dictionary-based machine translation from Kannada to Telugu. The proposed methodology uses a dictionary to translate word by word, without modeling much of the semantic correlation between words. The dictionary-based machine translation process has the following sub-processes: morph analyzer, dictionary, transliteration, transfer grammar, and morph generator. As part of this work, a bilingual dictionary with 8000 entries was developed and a suffix mapping table at the tag level was built. The system was tested on children's stories. In the near future it can be further improved by defining transfer grammar rules.
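    The pipeline described above (morph analysis, dictionary lookup, transliteration fallback, suffix mapping, morph generation) can be sketched end-to-end. The toy lexicon, suffix table, and romanized word forms below are invented stand-ins; the actual system uses Kannada/Telugu script, an 8000-entry dictionary, and a tag-level suffix table.

    ```python
    # Word-by-word dictionary-based translation, in the spirit of the
    # pipeline described above (all data here is illustrative).
    LEXICON = {"mane": "illu", "huli": "puli"}   # source stem -> target stem
    SUFFIX_MAP = {"alli": "lo", "": ""}          # case-marker mapping (tag level)

    def analyze(word):
        """Crude morph analyzer: split a known suffix off the stem."""
        for suffix in sorted((s for s in SUFFIX_MAP if s), key=len, reverse=True):
            if word.endswith(suffix) and len(word) > len(suffix):
                return word[: -len(suffix)], suffix
        return word, ""

    def translate_word(word):
        stem, suffix = analyze(word)
        # Transliteration fallback: out-of-vocabulary stems pass through.
        target_stem = LEXICON.get(stem, stem)
        return target_stem + SUFFIX_MAP[suffix]  # morph generation

    def translate(sentence):
        """Word-by-word translation with no semantic reordering."""
        return " ".join(translate_word(w) for w in sentence.split())

    print(translate("manealli huli"))
    ```

    Transfer-grammar rules, the improvement the authors defer to future work, would sit between lookup and generation to reorder or adjust agreement.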

  7. Software design and documentation language, revision 1

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1979-01-01

    The Software Design and Documentation Language (SDDL) developed to provide an effective communications medium to support the design and documentation of complex software applications is described. Features of the system include: (1) a processor which can convert design specifications into an intelligible, informative machine-reproducible document; (2) a design and documentation language with forms and syntax that are simple, unrestrictive, and communicative; and (3) methodology for effective use of the language and processor. The SDDL processor is written in the SIMSCRIPT II programming language and is implemented on the UNIVAC 1108, the IBM 360/370, and Control Data machines.

  8. Machine Translation: The Alternative for the 21st Century?

    ERIC Educational Resources Information Center

    Cribb, V. Michael

    2000-01-01

    Outlines a scenario for the future of Teaching English as a Second or Other Language that has seldom, if ever, been considered in academic discussion: that advances in, and the availability of, quality machine translation could reduce the need for English language learning. (Author/VWL)

  9. Building a profile of subjective well-being for social media users.

    PubMed

    Chen, Lushi; Gong, Tao; Kosinski, Michal; Stillwell, David; Davidson, Robert L

    2017-01-01

    Subjective well-being includes 'affect' and 'satisfaction with life' (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users' affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL as identified in early studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36, p < 0.01), indicating that language-based assessment can constitute valid SWL measures; the machine-assessed affect scores resemble those reported in a previous experimental study; and the machine-predicted subjective well-being profile can also reflect other psychological traits like depression (r = 0.24, p < 0.01). This study provides important insights for psychological prediction using multiple, machine-assessed components and longitudinal or dense psychological assessment using social media language.
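    The affect-scoring step described above can be sketched with a minimal lexicon-based sentiment scorer; in the study these scores (with other language features) then feed a random forest that predicts SWL. The word lists below are tiny invented stand-ins for a real sentiment lexicon.

    ```python
    # Lexicon-based affect score per status update, then averaged into a
    # per-user profile (toy version of the paper's first stage).
    POSITIVE = {"happy", "great", "love", "excited", "fun"}
    NEGATIVE = {"sad", "awful", "hate", "tired", "angry"}

    def affect_score(status):
        """(positives - negatives) / total tokens, in [-1, 1]; 0 if empty."""
        tokens = status.lower().split()
        if not tokens:
            return 0.0
        pos = sum(t in POSITIVE for t in tokens)
        neg = sum(t in NEGATIVE for t in tokens)
        return (pos - neg) / len(tokens)

    def user_profile(updates):
        """Average affect over a user's status updates."""
        return sum(affect_score(u) for u in updates) / len(updates)

    print(user_profile(["so happy today", "great fun with friends"]))
    ```

    A regression model would then be trained to map such per-user features to self-reported SWL, which is where the r = 0.36 correlation reported above is measured.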

  10. Building a profile of subjective well-being for social media users

    PubMed Central

    Kosinski, Michal; Stillwell, David; Davidson, Robert L.

    2017-01-01

    Subjective well-being includes ‘affect’ and ‘satisfaction with life’ (SWL). This study proposes a unified approach to construct a profile of subjective well-being based on social media language in Facebook status updates. We apply sentiment analysis to generate users’ affect scores, and train a random forest model to predict SWL using affect scores and other language features of the status updates. Results show that: the computer-selected features resemble the key predictors of SWL as identified in early studies; the machine-predicted SWL is moderately correlated with the self-reported SWL (r = 0.36, p < 0.01), indicating that language-based assessment can constitute valid SWL measures; the machine-assessed affect scores resemble those reported in a previous experimental study; and the machine-predicted subjective well-being profile can also reflect other psychological traits like depression (r = 0.24, p < 0.01). This study provides important insights for psychological prediction using multiple, machine-assessed components and longitudinal or dense psychological assessment using social media language. PMID:29135991

  11. Efficient Embedded Decoding of Neural Network Language Models in a Machine Translation System.

    PubMed

    Zamora-Martinez, Francisco; Castro-Bleda, Maria Jose

    2018-02-22

    Neural Network Language Models (NNLMs) are a successful approach to Natural Language Processing tasks, such as Machine Translation. We introduce in this work a Statistical Machine Translation (SMT) system which fully integrates NNLMs in the decoding stage, breaking with the traditional approach based on n-best list rescoring. The neural net models (both language models (LMs) and translation models) are fully coupled in the decoding stage, allowing them to more strongly influence the translation quality. Computational issues were solved by using a novel idea based on memorization and smoothing of the softmax constants to avoid their computation, which introduces a trade-off between LM quality and computational cost. These ideas were studied in a machine translation task with different combinations of neural networks used both as translation models and as target LMs, comparing phrase-based and n-gram-based systems, showing that the integrated approach seems more promising for n-gram-based systems, even with NNLMs of less than full quality.
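    The softmax-constant idea above can be illustrated in miniature: cache the log-normalizer Z(h) per context h, and fall back to a smoothed (here, mean) constant for unseen contexts so that no full pass over the vocabulary is needed at decode time. The scoring function below is a toy stand-in for a real NNLM output unit, and the mean fallback is one simple choice of smoothing, not necessarily the paper's.

    ```python
    # Memoized + smoothed softmax normalization constants for LM scoring.
    import math

    def score(context, word):
        """Toy stand-in for one NNLM output unit's pre-softmax score."""
        return math.sin((hash((context, word)) % 100) / 10.0)

    class MemoizedSoftmaxLM:
        def __init__(self, vocab):
            self.vocab = vocab
            self.z_cache = {}  # context -> log normalization constant

        def precompute(self, contexts):
            """Exact log-normalizers for selected contexts, computed once."""
            for c in contexts:
                self.z_cache[c] = math.log(
                    sum(math.exp(score(c, w)) for w in self.vocab))

        def log_prob(self, word, context):
            """log P(word | context) via a cached constant when available,
            else a smoothed (mean) constant: the quality/speed trade-off."""
            z = self.z_cache.get(context)
            if z is None:
                z = sum(self.z_cache.values()) / len(self.z_cache)
            return score(context, word) - z

    vocab = ["perro", "gato", "casa"]
    lm = MemoizedSoftmaxLM(vocab)
    lm.precompute(["el"])
    print(lm.log_prob("perro", "el"))
    ```

    For cached contexts the probabilities are exact (they sum to 1); for uncached contexts they are only approximately normalized, which is exactly the LM-quality cost the abstract mentions.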

  12. The relation between working memory and language comprehension in signers and speakers.

    PubMed

    Emmorey, Karen; Giezen, Marcel R; Petrich, Jennifer A F; Spurgeon, Erin; O'Grady Farnady, Lucinda

    2017-06-01

    This study investigated the relation between linguistic and spatial working memory (WM) resources and language comprehension for signed compared to spoken language. Sign languages are both linguistic and visual-spatial, and therefore provide a unique window on modality-specific versus modality-independent contributions of WM resources to language processing. Deaf users of American Sign Language (ASL), hearing monolingual English speakers, and hearing ASL-English bilinguals completed several spatial and linguistic serial recall tasks. Additionally, their comprehension of spatial and non-spatial information in ASL and spoken English narratives was assessed. Results from the linguistic serial recall tasks revealed that the often reported advantage for speakers on linguistic short-term memory tasks does not extend to complex WM tasks with a serial recall component. For English, linguistic WM predicted retention of non-spatial information, and both linguistic and spatial WM predicted retention of spatial information. For ASL, spatial WM predicted retention of spatial (but not non-spatial) information, and linguistic WM did not predict retention of either spatial or non-spatial information. Overall, our findings argue against strong assumptions of independent domain-specific subsystems for the storage and processing of linguistic and spatial information and furthermore suggest a less important role for serial encoding in signed than spoken language comprehension. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Controlled English to facilitate human/machine analytical processing

    NASA Astrophysics Data System (ADS)

    Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien

    2013-06-01

    Controlled English (CE) is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment; especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.
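    The core mechanism, restricted-English sentences parsed into facts, plus inference rules stated over those facts, can be sketched in a few lines. The sentence patterns and the single rule below are invented for illustration and do not reflect the actual CE grammar or the CE Store implementation.

    ```python
    # Toy CE-style pipeline: parse restricted English into facts, then
    # apply one inference rule to derive new facts.
    import re

    KIND = re.compile(r"^(\w+) is a kind of (\w+)\.$")
    FACT = re.compile(r"^(\w+) is a (\w+)\.$")

    def ingest(sentences):
        """Parse sentences into (instance, class) and (subclass, superclass)
        pairs, ignoring anything that matches neither pattern."""
        isa, kinds = set(), set()
        for s in sentences:
            m = KIND.match(s)
            if m:
                kinds.add(m.groups())
                continue
            m = FACT.match(s)
            if m:
                isa.add(m.groups())
        return isa, kinds

    def infer(isa, kinds):
        """Rule: if X is a Y and Y is a kind of Z, then X is a Z."""
        derived = True
        while derived:
            derived = False
            for x, y in list(isa):
                for y2, z in kinds:
                    if y == y2 and (x, z) not in isa:
                        isa.add((x, z))
                        derived = True
        return isa

    facts = ["analyst1 is a human.", "human is a kind of agent."]
    isa, kinds = ingest(facts)
    print(sorted(infer(isa, kinds)))
    ```

    The point of the restricted grammar is that the same sentences remain readable to human analysts while staying trivially machine-parseable.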

  14. YES, #NO, Visibility and Variation in ASL and Tactile ASL

    ERIC Educational Resources Information Center

    Petronio, Karen; Dively, Valerie

    2006-01-01

    In American Sign Language (ASL), a receiver watches the signer and receives language visually. In contrast, when using tactile ASL, a variety of ASL, the deaf-blind receiver receives language by placing a hand on top of the signer's hand. In the study described in this article we compared the functions and frequency of the signs YES and #NO in…

  15. Beware of the Dog! Private Linguistic Landscapes in Two "Hungarian" Villages in South-West Slovakia

    ERIC Educational Resources Information Center

    Laihonen, Petteri

    2016-01-01

    This study demonstrates how a single type of sign can be connected to language policy on a larger scale. Focusing on the relationship between language policy and language ideologies, I investigate the private Linguistic Landscape (LL) of Hungarians living in two villages in Slovakia. Through an examination of "beware of the dog" signs,…

  16. Dissociating Linguistic and Non-Linguistic Gesture Processing: Electrophysiological Evidence from American Sign Language

    ERIC Educational Resources Information Center

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-01-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language…

  17. Response bias reveals enhanced attention to inferior visual field in signers of American Sign Language.

    PubMed

    Dye, Matthew W G; Seymour, Jenessa L; Hauser, Peter C

    2016-04-01

    Deafness results in cross-modal plasticity, whereby visual functions are altered as a consequence of a lack of hearing. Here, we present a reanalysis of data originally reported by Dye et al. (PLoS One 4(5):e5640, 2009) with the aim of testing additional hypotheses concerning the spatial redistribution of visual attention due to deafness and the use of a visuogestural language (American Sign Language). By looking at the spatial distribution of errors made by deaf and hearing participants performing a visuospatial selective attention task, we sought to determine whether there was evidence for (1) a shift in the hemispheric lateralization of visual selective function as a result of deafness, and (2) a shift toward attending to the inferior visual field in users of a signed language. While no evidence was found for or against a shift in lateralization of visual selective attention as a result of deafness, a shift in the allocation of attention from the superior toward the inferior visual field was inferred in native signers of American Sign Language, possibly reflecting an adaptation to the perceptual demands imposed by a visuogestural language.

  18. Strategies, Language Transfer and the Simulation of the Second Language Learner's Mental Operations.

    ERIC Educational Resources Information Center

    Smith, Mike Sharwood

    1979-01-01

    An attempt is made to describe second language behavior and language transfer in cybernetic terms. This should make it possible to translate language into machine language and to clarify psycholinguistic explanations of second language performance. (PMJ)

  19. High-performance Chinese multiclass traffic sign detection via coarse-to-fine cascade and parallel support vector machine detectors

    NASA Astrophysics Data System (ADS)

    Chang, Faliang; Liu, Chunsheng

    2017-09-01

    The high variability of sign colors and shapes in uncontrolled environments has made the detection of traffic signs a challenging problem in computer vision. We propose a traffic sign detection (TSD) method based on coarse-to-fine cascade and parallel support vector machine (SVM) detectors to detect Chinese warning and danger traffic signs. First, a region of interest (ROI) extraction method is proposed to extract ROIs using color contrast features in local regions. The ROI extraction can reduce scanning regions and save detection time. For multiclass TSD, we propose a structure that combines a coarse-to-fine cascaded tree with a parallel structure of histogram of oriented gradients (HOG) + SVM detectors. The cascaded tree is designed to detect different types of traffic signs in a coarse-to-fine process. The parallel HOG + SVM detectors are designed to perform fine detection of different types of traffic signs. The experiments demonstrate that the proposed TSD method can rapidly detect multiclass traffic signs with different colors and shapes with high accuracy.
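    The control flow of such a detector, ROI gating, coarse routing, then parallel per-class fine detectors, can be sketched as follows. The real stages are trained HOG + SVM classifiers over image windows; here each stage is a stand-in predicate over a toy "window" dict, so what this illustrates is the cascade structure, not the features, and the sign classes and thresholds are invented.

    ```python
    # Coarse-to-fine cascade with parallel fine detectors (structural sketch).
    def roi_gate(window):
        """Coarse ROI stage: local color contrast must exceed a threshold."""
        return window["color_contrast"] > 0.3

    def coarse_type(window):
        """Coarse tree node: route by dominant border color."""
        return "warning" if window["border"] == "yellow" else "danger"

    # Parallel fine detectors per coarse branch (stand-ins for HOG + SVM).
    FINE_DETECTORS = {
        "warning": {"pedestrian": lambda w: w["shape"] == "triangle" and w["icon"] == "person",
                    "curve":      lambda w: w["shape"] == "triangle" and w["icon"] == "arrow"},
        "danger":  {"no-entry":   lambda w: w["shape"] == "circle" and w["icon"] == "bar"},
    }

    def detect(window):
        """Full pipeline: ROI gate -> coarse routing -> fine detectors."""
        if not roi_gate(window):
            return None
        branch = coarse_type(window)
        hits = [name for name, clf in FINE_DETECTORS[branch].items() if clf(window)]
        return hits[0] if hits else None

    w = {"color_contrast": 0.8, "border": "yellow", "shape": "triangle", "icon": "person"}
    print(detect(w))
    ```

    The speed advantage comes from the early exits: most windows fail the cheap ROI gate, and only windows routed to a branch ever reach that branch's (expensive) fine detectors.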

  20. Programming in HAL/S

    NASA Technical Reports Server (NTRS)

    Ryer, M. J.

    1978-01-01

    HAL/S is a computer programming language; it is a representation for algorithms which can be interpreted by either a person or a computer. HAL/S compilers transform blocks of HAL/S code into machine language which can then be directly executed by a computer. When the machine language is executed, the algorithm specified by the HAL/S code (source) is performed. This document describes how to read and write HAL/S source.

  1. A qualitative exploration of trial-related terminology in a study involving Deaf British Sign Language users.

    PubMed

    Young, Alys; Oram, Rosemary; Dodds, Claire; Nassimi-Green, Catherine; Belk, Rachel; Rogers, Katherine; Davies, Linda; Lovell, Karina

    2016-04-27

    Internationally, few clinical trials have involved Deaf people who use a signed language, and none have involved BSL (British Sign Language) users. Appropriate terminology in BSL for key concepts in clinical trials that are relevant to recruitment and participant information materials, and that would support informed consent, does not exist. Barriers to conceptual understanding of trial participation and sources of misunderstanding relevant to the Deaf community are undocumented. A qualitative, community-participatory exploration of trial terminology, including conceptual understanding of 'randomisation', 'trial', 'informed choice' and 'consent', was facilitated in BSL with 19 participants in five focus groups. Data were video-recorded and analysed in the source language (BSL) using a phenomenological approach. Six necessary conditions for developing trial information to support comprehension were identified. These included: developing appropriate expressions and terminology from a community basis, rather than testing out previously derived translations from a different language; paying attention to language-specific features which support the best means of expression (in the case of BSL: expectations of specificity, verb directionality, handshape); bilingual influences on comprehension; deliberate orientation of information to avoid misunderstanding, not just to promote accessibility; sensitivity to barriers to discussion about the intelligibility of information that are cultural and social in origin, rather than linguistic; and the importance of using contemporary language-in-use, rather than jargon-free or plain language, to support meaningful understanding. The study reinforces the ethical imperative to ensure trial participants who are Deaf are provided with optimum resources to understand the implications of participation and to make an informed choice.
Results are relevant to the development of trial information in other signed languages as well as in spoken/written languages when participants' language use is different from the dominant language of the country.

  2. To Capture a Face: A Novel Technique for the Analysis and Quantification of Facial Expressions in American Sign Language

    ERIC Educational Resources Information Center

    Grossman, Ruth B.; Kegl, Judy

    2006-01-01

    American Sign Language uses the face to express vital components of grammar in addition to the more universal expressions of emotion. The study of ASL facial expressions has focused mostly on the perception and categorization of various expression types by signing and nonsigning subjects. Only a few studies of the production of ASL facial…

  3. The Relation between the Working Memory Skills of Sign Language Interpreters and the Quality of Their Interpretations

    ERIC Educational Resources Information Center

    Van Dijk, Rick; Christoffels, Ingrid; Postma, Albert; Hermans, Daan

    2012-01-01

    In two experiments we investigated the relationship between the working memory skills of sign language interpreters and the quality of their interpretations. In Experiment 1, we found that scores on 3-back tasks with signs and words were not related to the quality of interpreted narratives. In Experiment 2, we found that memory span scores for…

  4. The Influence of Visual Feedback and Register Changes on Sign Language Production: A Kinematic Study with Deaf Signers

    ERIC Educational Resources Information Center

    Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E.

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign…

  5. Music and Sign Language to Promote Infant and Toddler Communication and Enhance Parent-Child Interaction

    ERIC Educational Resources Information Center

    Colwell, Cynthia; Memmott, Jenny; Meeker-Miller, Anne

    2014-01-01

    The purpose of this study was to determine the efficacy of using music and/or sign language to promote early communication in infants and toddlers (6-20 months) and to enhance parent-child interactions. Three groups used for this study were pairs of participants (care-giver(s) and child) assigned to each group: 1) Music Alone 2) Sign Language…

  6. Where to Look for American Sign Language (ASL) Sublexical Structure in the Visual World: Reply to Salverda (2016)

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2016-01-01

    In this reply to Salverda (2016), we address a critique of the claims made in our recent study of real-time processing of American Sign Language (ASL) signs using a novel visual world eye-tracking paradigm (Lieberman, Borovsky, Hatrak, & Mayberry, 2015). Salverda asserts that our data do not support our conclusion that native signers and…

  7. Deaf children's non-verbal working memory is impacted by their language experience

    PubMed Central

    Marshall, Chloë; Jones, Anna; Denmark, Tanya; Mason, Kathryn; Atkinson, Joanna; Botting, Nicola; Morgan, Gary

    2015-01-01

    Several recent studies have suggested that deaf children perform more poorly on working memory tasks compared to hearing children, but these studies have not been able to determine whether this poorer performance arises directly from deafness itself or from deaf children's reduced language exposure. The issue remains unresolved because findings come mostly from (1) tasks that are verbal as opposed to non-verbal, and (2) involve deaf children who use spoken communication and therefore may have experienced impoverished input and delayed language acquisition. This is in contrast to deaf children who have been exposed to a sign language since birth from Deaf parents (and who therefore have native language-learning opportunities within a normal developmental timeframe for language acquisition). A more direct, and therefore stronger, test of the hypothesis that the type and quality of language exposure impact working memory is to use measures of non-verbal working memory (NVWM) and to compare hearing children with two groups of deaf signing children: those who have had native exposure to a sign language, and those who have experienced delayed acquisition and reduced quality of language input compared to their native-signing peers. In this study we investigated the relationship between NVWM and language in three groups aged 6–11 years: hearing children (n = 28), deaf children who were native users of British Sign Language (BSL; n = 8), and deaf children who used BSL but who were not native signers (n = 19). We administered a battery of non-verbal reasoning, NVWM, and language tasks. We examined whether the groups differed on NVWM scores, and whether scores on language tasks predicted scores on NVWM tasks. For the two executive-loaded NVWM tasks included in our battery, the non-native signers performed less accurately than the native signer and hearing groups (who did not differ from one another). 
Multiple regression analysis revealed that scores on the vocabulary measure predicted scores on those two executive-loaded NVWM tasks (with age and non-verbal reasoning partialled out). Our results suggest that whatever the language modality—spoken or signed—rich language experience from birth, and the good language skills that result from this early age of acquisition, play a critical role in the development of NVWM and in performance on NVWM tasks. PMID:25999875

  8. Sign Language Recognition System using Neural Network for Digital Hardware Implementation

    NASA Astrophysics Data System (ADS)

    Vargas, Lorena P.; Barba, Leiner; Torres, C. O.; Mattos, L.

    2011-01-01

    This work presents an image pattern recognition system using a neural network for the identification of sign language for deaf people. The system stores several images that show specific symbols in this language; these are used to train a multilayer neural network with a backpropagation algorithm. Initially, the images are processed to adapt them and to improve the discriminating performance of the network; this processing includes filtering, noise reduction and elimination, and edge detection algorithms. The system is evaluated using signs whose representation does not include movement.
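    A from-scratch miniature of the classifier described above: a multilayer perceptron trained with backpropagation on (already preprocessed) binary image vectors. The 2x2 "sign images", network sizes, and hyperparameters are toy choices for illustration; the real system works on filtered, denoised, edge-detected images of far higher resolution.

    ```python
    # Tiny MLP + backpropagation on flattened binary "sign" images.
    import math, random

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    class MLP:
        """One-hidden-layer perceptron trained by backpropagation."""
        def __init__(self, n_in, n_hid, seed=0):
            rng = random.Random(seed)
            self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)]  # +1 bias
                       for _ in range(n_hid)]
            self.w2 = [rng.uniform(-1, 1) for _ in range(n_hid + 1)]  # +1 bias

        def forward(self, x):
            xb = list(x) + [1.0]                                      # bias input
            self.h = [sigmoid(sum(w * v for w, v in zip(row, xb)))
                      for row in self.w1] + [1.0]                     # bias unit
            self.y = sigmoid(sum(w * v for w, v in zip(self.w2, self.h)))
            return self.y

        def train(self, data, epochs=2000, lr=1.0):
            for _ in range(epochs):
                for x, target in data:
                    y = self.forward(x)
                    xb = list(x) + [1.0]
                    dy = (y - target) * y * (1 - y)                   # output delta
                    dh = [dy * w * h * (1 - h)                        # hidden deltas
                          for w, h in zip(self.w2[:-1], self.h[:-1])]
                    self.w2 = [w - lr * dy * h for w, h in zip(self.w2, self.h)]
                    self.w1 = [[w - lr * d * v for w, v in zip(row, xb)]
                               for row, d in zip(self.w1, dh)]

    # Two static "signs" as flattened 2x2 binary images (class 0 vs class 1).
    data = [([1, 0, 0, 1], 0.0), ([0, 1, 1, 0], 1.0)]
    net = MLP(n_in=4, n_hid=3)
    net.train(data)
    print([round(net.forward(x)) for x, _ in data])
    ```

    The preprocessing steps the abstract lists (filtering, denoising, edge detection) would all happen before the flattened vectors reach `forward`, which is why they matter so much for the network's discrimination.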

  9. 76 FR 27668 - ASC Machine Tools, Inc., Spokane Valley, WA; Notice of Negative Determination on Reconsideration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-12

    ... DEPARTMENT OF LABOR Employment and Training Administration [TA-W-72,971] ASC Machine Tools, Inc... workers and former workers of ASC Machine Tools, Inc., Spokane Valley, Washington (the subject firm). The... workers of ASC Machine Tools, Inc., Spokane Valley, Washington. Signed in Washington, DC, on this 2nd day...

  10. Effects of Parental Deafness and Early Exposure to Manual Communication on the Cognitive Skills, English Language Skill, and Field Independence of Young Deaf Adults.

    ERIC Educational Resources Information Center

    Parasnis, Ila

    1983-01-01

    Differential effects of parental deafness and early exposure to manual communication were not observed in the cognitive and communication performance of the 38 experimental subjects. Furthermore, the Delayed sign language group performed significantly better than the early American Sign Language group on tests of speech perception and speech…

  11. The influence of communication mode on written language processing and beyond.

    PubMed

    Barca, Laura; Pezzulo, Giovanni

    2017-01-01

    Empirical evidence suggests a broad impact of communication mode on cognition at large, beyond language processing. Using a sign language since infancy might shape the representation of words and other linguistic stimuli - for example, incorporating in it the movements and signs used to express them. Once integrated into linguistic representations, this visuo-motor content can affect deaf signers' linguistic and cognitive processing.

  12. The Effects of American Sign Language as an Assessment Accommodation for Students Who Are Deaf or Hard of Hearing

    ERIC Educational Resources Information Center

    Cawthon, Stephanie W.; Winton, Samantha M.; Garberoglio, Carrie Lou; Gobble, Mark E.

    2011-01-01

    Students who are deaf or hard of hearing (SDHH) often need accommodations to participate in large-scale standardized assessments. One way to bridge the gap between the language of the test (English) and a student's linguistic background (often including American Sign Language [ASL]) is to present test items in ASL. The specific aim of this project…

  13. Praxis language reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, J.H.

    1981-01-01

    This document is a language reference manual for the programming language Praxis. The document contains the specifications that must be met by any compiler for the language. The Praxis language was designed for systems programming in real-time process applications. Goals for the language and its implementations are: (1) highly efficient code generated by the compiler; (2) program portability; (3) completeness, that is, all programming requirements can be met by the language without needing an assembler; and (4) separate compilation to aid in design and management of large systems. The language does not provide any facilities for input/output, stack and queue handling, string operations, parallel processing, or coroutine processing. These features can be implemented as routines in the language, using machine-dependent code to take advantage of facilities in the control environment on different machines.

  14. Real-time lexical comprehension in young children learning American Sign Language.

    PubMed

    MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A; Fernald, Anne

    2018-04-16

    When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality. © 2018 John Wiley & Sons Ltd.

  15. Introduction to the theory of machines and languages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidhaas, P. P.

    1976-04-01

    This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"

  16. What Is a Programming Language?

    ERIC Educational Resources Information Center

    Wold, Allen

    1983-01-01

    Explains what a computer programming language is in general, the differences between machine language, assembler languages, and high-level languages, and the functions of compilers and interpreters. High-level languages mentioned in the article are: BASIC, FORTRAN, COBOL, PILOT, LOGO, LISP, and SMALLTALK. (EAO)
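
    The distinction the article draws between a high-level language and the machine-like instructions an interpreter executes can be made concrete with Python's standard `dis` module. This is a minimal illustration of the general idea, not an example from the article itself:

```python
import dis

def add(a, b):
    # One "high-level" statement...
    return a + b

# ...is translated into a sequence of low-level, stack-machine-style
# instructions, which is what the interpreter actually executes.
instructions = [ins.opname for ins in dis.Bytecode(add)]
print(instructions)  # includes LOAD instructions and RETURN_VALUE
```

    The exact instruction names vary between CPython versions, but the pattern is the same: operand loads, an operation, and a return.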

  17. Productive High Performance Parallel Programming with Auto-tuned Domain-Specific Embedded Languages

    DTIC Science & Technology

    2013-01-02

    Compilation JVM Java Virtual Machine KB Kilobyte KDT Knowledge Discovery Toolbox LAPACK Linear Algebra Package LLVM Low-Level Virtual Machine LOC Lines...different starting points. Leo Meyerovich also helped solidify some of the ideas here in discussions during Par Lab retreats. I would also like to thank...multi-timestep computations by blocking in both time and space. 88 Implementation Output Approx DSL Type Language Language Parallelism LoC Graphite

  18. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  19. Ada Compiler Validation Summary Report: Certificate Number: 880318W1. 09043 International Business Machines Corporation IBM Development System for the Ada Language, Version 2.1.0 IBM 4381 under VM/HPO, Host IBM 4381 under MVS/XA, Target

    DTIC Science & Technology

    1988-03-28

    International Business Machines Corporation IBM Development System for the Ada Language, Version 2.1.0 IBM 4381 under VM/HPO, host IBM 4381 under MVS/XA, target...Program Office, AJPO 20. ABSTRACT (Continue on reverse side if necessary and identify by block number) International Business Machines Corporation, IBM...Standard ANSI/MIL-STD-1815A in the compiler listed in this declaration. I declare that International Business Machines Corporation is the owner of record

  20. To Sign or Not to Sign? The Impact of Encouraging Infants to Gesture on Infant Language and Maternal Mind-Mindedness

    ERIC Educational Resources Information Center

    Kirk, Elizabeth; Howlett, Neil; Pine, Karen J.; Fletcher, Ben C.

    2013-01-01

    Findings are presented from the first randomized control trial of the effects of encouraging symbolic gesture (or "baby sign") on infant language, following 40 infants from age 8 months to 20 months. Half of the mothers were trained to model a target set of gestures to their infants. Frequent measures were taken of infant language…

  1. The English-Language and Reading Achievement of a Cohort of Deaf Students Speaking and Signing Standard English: A Preliminary Study.

    PubMed

    Nielsen, Diane Corcoran; Luetke, Barbara; McLean, Meigan; Stryker, Deborah

    2016-01-01

    Research suggests that English-language proficiency is critical if students who are deaf or hard of hearing (D/HH) are to read as their hearing peers. One explanation for the traditionally reported reading achievement plateau when students are D/HH is the inability to hear insalient English morphology. Signing Exact English can provide visual access to these features. The authors investigated the English morphological and syntactic abilities and reading achievement of elementary and middle school students at a school using simultaneously spoken and signed Standard American English facilitated by intentional listening, speech, and language strategies. A developmental trend (and no plateau) in language and reading achievement was detected; most participants demonstrated average or above-average English. Morphological awareness was prerequisite to high test scores; speech was not significantly correlated with achievement; language proficiency, measured by the Clinical Evaluation of Language Fundamentals-4 (Semel, Wiig, & Secord, 2003), predicted reading achievement.

  2. Natural Language Processing.

    ERIC Educational Resources Information Center

    Chowdhury, Gobinda G.

    2003-01-01

    Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…

  3. UNIVERSAL TRANSLATOR,

    DTIC Science & Technology

    all languages with the aid of electronic machines is being derived to show how easy it would be to decode even 'dead' languages, and languages of the fogginess of Andromeda, if such a language ever existed. (Author)

  4. [What do bimodal bilinguals have to say about bilingual development?]

    PubMed

    de Quadros, Ronice Müller; Lillo-Martin, Diane; Pichler, Deborah Chen

    2013-07-01

    The goal of this work is to present what our research with hearing children of Deaf parents, acquiring Brazilian Sign Language (Libras) and Portuguese, or American Sign Language (ASL) and English (Lillo-Martin et al. 2010), has to say about bilingual development. The data analyzed in this study are part of a database of spontaneous interactions collected longitudinally, alternating contexts of sign and spoken languages. In addition, data from experimental studies with tests in both pairs of languages are incorporated into the present study. A general view of previous studies on bimodal bilingual acquisition in hearing children of Deaf parents is presented. We then show some linguistic aspects of this kind of acquisition found in our study and discuss bilingual acquisition.

  5. Automatic translation among spoken languages

    NASA Technical Reports Server (NTRS)

    Walter, Sharon M.; Costigan, Kelly

    1994-01-01

    The Machine Aided Voice Translation (MAVT) system was developed in response to the shortage of experienced military field interrogators with both foreign language proficiency and interrogation skills. Combining speech recognition, machine translation, and speech generation technologies, the MAVT accepts an interrogator's spoken English question and translates it into spoken Spanish. The spoken Spanish response of the potential informant can then be translated into spoken English. Potential military and civilian applications for automatic spoken language translation technology are discussed in this paper.

  6. Sign Language in Astronomy and Space Sciences

    NASA Astrophysics Data System (ADS)

    Cova, J.; Movilio, V.; Gómez, Y.; Gutiérrez, F.; García, R.; Moreno, H.; González, F.; Díaz, J.; Villarroel, C.; Abreu, E.; Aparicio, D.; Cárdenas, J.; Casneiro, L.; Castillo, N.; Contreras, D.; La Verde, N.; Maita, M.; Martínez, A.; Villahermosa, J.; Quintero, A.

    2009-05-01

    Teaching science to school children with hearing deficiency and impairment can be a rewarding and valuable experience for both teacher and student, and necessary to society as a whole in order to reduce the discriminative policies in the formal educational system. The single most important obstacle to the teaching of science to students with hearing deficiencies and impairments is the lack of vocabulary in sign language to express the precise concepts encountered in scientific endeavor. In a collaborative project between Centro de Investigaciones de Astronomía "Francisco J. Duarte" (CIDA), Universidad Pedagógica Experimental Libertador-Instituto Pedagógico de Maturín (UPEL-IPM) and Unidad Educativa Especial Bolivariana de Maturín (UEEBM) initiated in 2006, we have attempted to fill this gap by developing signs for astronomy and space sciences terminology. During two three-day workshops carried out at CIDA in Mérida in July 2006 and UPEL-IPM in Maturín in March 2007, a total of 112 concepts of astronomy and space sciences were coined in sign language using an interactive method which we describe in the text. The immediate goal of the project is to incorporate these terms into Venezuelan Sign Language (LSV).

  7. Access to Sign Language Interpreters in the Criminal Justice System.

    ERIC Educational Resources Information Center

    Miller, Katrina R.

    2001-01-01

    This study surveyed 46 professional sign language interpreters working in criminal justice settings and evaluated 22 cases to evaluate access issues for individuals with hearing impairments. Recommendations to increase the accessibility of interpreting services included providing ongoing awareness training to criminal justice personnel and…

  8. Sign Language Legislation as a Tool for Sustainability

    ERIC Educational Resources Information Center

    Pabsch, Annika

    2017-01-01

    This article explores three models of sustainability (environmental, economic, and social) and identifies characteristics of a sustainable community necessary to sustain the Deaf community as a whole. It is argued that sign language legislation is a valuable tool for achieving sustainability for the generations to come.

  9. Bag-of-visual-phrases and hierarchical deep models for traffic sign detection and recognition in mobile laser scanning data

    NASA Astrophysics Data System (ADS)

    Yu, Yongtao; Li, Jonathan; Wen, Chenglu; Guan, Haiyan; Luo, Huan; Wang, Cheng

    2016-03-01

    This paper presents a novel algorithm for detection and recognition of traffic signs in mobile laser scanning (MLS) data for intelligent transportation-related applications. The traffic sign detection task is accomplished based on 3-D point clouds by using bag-of-visual-phrases representations; whereas the recognition task is achieved based on 2-D images by using a Gaussian-Bernoulli deep Boltzmann machine-based hierarchical classifier. To exploit high-order feature encodings of feature regions, a deep Boltzmann machine-based feature encoder is constructed. For detecting traffic signs in 3-D point clouds, the proposed algorithm achieves an average recall, precision, quality, and F-score of 0.956, 0.946, 0.907, and 0.951, respectively, on the four selected MLS datasets. For on-image traffic sign recognition, a recognition accuracy of 97.54% is achieved by using the proposed hierarchical classifier. Comparative studies with the existing traffic sign detection and recognition methods demonstrate that our algorithm obtains promising, reliable, and high performance in both detecting traffic signs in 3-D point clouds and recognizing traffic signs on 2-D images.
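
    The bag-of-visual-phrases representation used for detection builds on the simpler bag-of-visual-words encoding: local feature descriptors are quantized against a learned codebook, and each candidate region is summarized as a histogram of codeword counts. The following is a minimal sketch of that general encoding, not the authors' implementation; the descriptor dimensionality, codebook size, and random "descriptors" are illustrative:

```python
import numpy as np

def build_codebook(descriptors, k=8, iters=20, seed=0):
    """Toy k-means: learn k codewords from local feature descriptors."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest codeword.
        dist = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = descriptors[labels == j].mean(axis=0)
    return centers

def encode(descriptors, centers):
    """Represent one region as a normalized histogram over codewords."""
    dist = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
    hist = np.bincount(dist.argmin(axis=1), minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Illustrative random descriptors standing in for local 3-D point features.
rng = np.random.default_rng(1)
train = rng.normal(size=(200, 16))
codebook = build_codebook(train, k=8)
region = encode(rng.normal(size=(50, 16)), codebook)
print(region.shape)  # (8,)
```

    The resulting fixed-length histogram is what a downstream detector or classifier would consume; "visual phrases" extend this by encoding co-occurring codewords rather than single ones.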

  10. Sign Perception and Recognition in Non-Native Signers of ASL

    PubMed Central

    Morford, Jill P.; Carlson, Martina L.

    2011-01-01

    Past research has established that delayed first language exposure is associated with comprehension difficulties in non-native signers of American Sign Language (ASL) relative to native signers. The goal of the current study was to investigate potential explanations of this disparity: do non-native signers have difficulty with all aspects of comprehension, or are their comprehension difficulties restricted to some aspects of processing? We compared the performance of deaf non-native, hearing L2, and deaf native signers on a handshape and location monitoring and a sign recognition task. The results indicate that deaf non-native signers are as rapid and accurate on the monitoring task as native signers, with differences in the pattern of relative performance across handshape and location parameters. By contrast, non-native signers differ significantly from native signers during sign recognition. Hearing L2 signers, who performed almost as well as the two groups of deaf signers on the monitoring task, resembled the deaf native signers more than the deaf non-native signers on the sign recognition task. The combined results indicate that delayed exposure to a signed language leads to an overreliance on handshape during sign recognition. PMID:21686080

  11. Laughter Among Deaf Signers

    PubMed Central

    Provine, Robert R.; Emmorey, Karen

    2008-01-01

    The placement of laughter in the speech of hearing individuals is not random but “punctuates” speech, occurring during pauses and at phrase boundaries where punctuation would be placed in a transcript of a conversation. For speakers, language is dominant in the competition for the vocal tract since laughter seldom interrupts spoken phrases. For users of American Sign Language, however, laughter and language do not compete in the same way for a single output channel. This study investigated whether laughter occurs simultaneously with signing, or punctuates signing, as it does speech, in 11 signed conversations (with two to five participants) that had at least one instance of audible, vocal laughter. Laughter occurred 2.7 times more often during pauses and at phrase boundaries than simultaneously with a signed utterance. Thus, the production of laughter involves higher order cognitive or linguistic processes rather than the low-level regulation of motor processes competing for a single vocal channel. In an examination of other variables, the social dynamics of deaf and hearing people were similar, with “speakers” (those signing) laughing more than their audiences and females laughing more than males. PMID:16891353

  13. Electrophysiological evidence for phonological priming in Spanish Sign Language lexical access.

    PubMed

    Gutiérrez, Eva; Müller, Oliver; Baus, Cristina; Carreiras, Manuel

    2012-06-01

    Interactive activation models of lexical access assume that the presentation of a given word activates not only its lexical representation but also those corresponding to words similar in form. Current theories are based on data from oral and written languages, and therefore signed languages represent a special challenge for existing theories of word recognition and lexical access since they allow us to question what the genuine fundamentals of human language are and what might be modality-specific adaptation. The aim of the present study is to determine the electrophysiological correlates and time course of phonological processing of Spanish Sign Language (LSE). Ten deaf native LSE signers and ten deaf non-native but highly proficient LSE signers participated in the experiment. We used the ERP methodology and form-based priming in the context of a delayed lexical decision task, manipulating phonological overlap (i.e. related prime-target pairs shared either handshape or location parameters). Results showed that both parameters under study modulated brain responses to the stimuli in different time windows. Phonological priming of location resulted in a higher amplitude of the N400 component (300-500 ms window) for signs but not for non-signs. This effect may be explained in terms of initial competition among candidates. Moreover, the fact that a higher amplitude N400 for related pairs was found for signs but not for non-signs points to an effect at the lexical level. Handshape overlap produced a later effect (600-800 ms window). In this window, a more negative-going wave for the related condition than for the unrelated condition was found for non-signs in the native signers group. The findings are discussed in relation to current models of lexical access and word recognition. Finally, differences between native and non-native signers point to a less efficient use of phonological information among the non-native signers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Palm Reversal Errors in Native-Signing Children with Autism

    PubMed Central

    Shield, Aaron; Meier, Richard P.

    2012-01-01

    Children with autism spectrum disorder (ASD) who have native exposure to a sign language such as American Sign Language (ASL) have received almost no scientific attention. This paper reports the first studies on a sample of five native-signing children (four deaf children of deaf parents and one hearing child of deaf parents; ages 4;6 to 7;5) diagnosed with ASD. A domain-general deficit in the ability of children with ASD to replicate the gestures of others is hypothesized to be a source of palm orientation reversal errors in sign. In Study 1, naturalistic language samples were collected from three native-signing children with ASD and were analyzed for errors in handshape, location, movement and palm orientation. In Study 2, four native-signing children with ASD were compared to 12 typically-developing deaf children (ages 3;7 to 6;9, all born to deaf parents) on a fingerspelling task. In both studies children with ASD showed a tendency to reverse palm orientation on signs specified for inward/outward orientation. Typically-developing deaf children did not produce any such errors in palm orientation. We conclude that this kind of palm reversal has a perceptual rather than a motoric source, and is further evidence of a “self-other mapping” deficit in ASD. PMID:22981637

  15. ASL-LEX: A lexical database of American Sign Language.

    PubMed

    Caselli, Naomi K; Sehyr, Zed Sevcikova; Cohen-Goldberg, Ariel M; Emmorey, Karen

    2017-04-01

    ASL-LEX is a lexical database that catalogues information about nearly 1,000 signs in American Sign Language (ASL). It includes the following information: subjective frequency ratings from 25-31 deaf signers, iconicity ratings from 21-37 hearing non-signers, videoclip duration, sign length (onset and offset), grammatical class, and whether the sign is initialized, a fingerspelled loan sign, or a compound. Information about English translations is available for a subset of signs (e.g., alternate translations, translation consistency). In addition, phonological properties (sign type, selected fingers, flexion, major and minor location, and movement) were coded and used to generate sub-lexical frequency and neighborhood density estimates. ASL-LEX is intended for use by researchers, educators, and students who are interested in the properties of the ASL lexicon. An interactive website where the database can be browsed and downloaded is available at http://asl-lex.org.
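
    Neighborhood density, one of the estimates ASL-LEX derives from its phonological coding, is commonly computed as the number of signs differing from a target in at most one coded parameter. The following sketch applies that definition to toy entries; the glosses, parameter values, and the simplified parameter set are invented for illustration and are not ASL-LEX data:

```python
from dataclasses import dataclass

# Simplified parameter set, loosely following the abstract's coding scheme.
PARAMS = ("sign_type", "selected_fingers", "flexion", "major_location", "movement")

@dataclass
class Sign:
    gloss: str
    sign_type: str
    selected_fingers: str
    flexion: str
    major_location: str
    movement: str

def neighbors(target, lexicon):
    """Glosses of signs differing from `target` in at most one parameter."""
    out = []
    for s in lexicon:
        if s.gloss == target.gloss:
            continue
        diffs = sum(getattr(s, p) != getattr(target, p) for p in PARAMS)
        if diffs <= 1:
            out.append(s.gloss)
    return out

lexicon = [
    Sign("MOTHER", "one-handed", "all", "open", "head", "tap"),
    Sign("FATHER", "one-handed", "all", "open", "head", "tap"),   # identical parameters
    Sign("FINE",   "one-handed", "all", "open", "chest", "tap"),  # differs only in location
    Sign("PLAY",   "two-handed", "thumb-pinky", "open", "neutral", "twist"),
]
print(neighbors(lexicon[0], lexicon))  # ['FATHER', 'FINE']
```

    The density of a sign is then simply the length of its neighbor list; denser neighborhoods are associated with greater competition during recognition.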

  17. American Sign Language Teachers: Practices and Perceptions.

    ERIC Educational Resources Information Center

    Newell, William J.

    1995-01-01

    Reports on a survey of 359 teachers of American Sign Language (ASL) conducted in 1993-94. Results found that the ability to apply appropriate methods, professional knowledge of ASL teaching practice, and bilingual skills in ASL and English were considered very important. Knowledge of theoretical issues and classroom management skills were viewed…

  18. Using Signs to Facilitate Vocabulary in Children with Language Delays

    ERIC Educational Resources Information Center

    Lederer, Susan Hendler; Battaglia, Dana

    2015-01-01

    The purpose of this article is to explore recommended practices in choosing and using key word signs (i.e., simple single-word gestures for communication) to facilitate first spoken words in hearing children with language delays. Developmental, theoretical, and empirical supports for this practice are discussed. Practical recommendations for…

  19. Psychological Testing of Sign Language Interpreters

    ERIC Educational Resources Information Center

    Seal, Brenda C.

    2004-01-01

    Twenty-eight sign language interpreters participated in a battery of tests to determine if a profile of cognitive, motor, attention, and personality attributes might distinguish them as a group and at different credential levels. Eight interpreters held Level II and nine held Level III Virginia Quality Assurance Screenings (VQAS); the other 11…

  20. The Validity of the Gallaudet Lecture Films

    ERIC Educational Resources Information Center

    Supalla, Ted

    2004-01-01

    Despite society's growing understanding of sign languages, particularly American Sign Language (ASL), there is still a profound limitation on the availability of literary, linguistic, historical, and other reference materials related to them because of the lack of a commonly accepted writing system. This article transcribed and analyzed a set…
