Sample records for Japanese Sign Language

  1. Numeral Incorporation in Japanese Sign Language

    ERIC Educational Resources Information Center

    Ktejik, Mish

    2013-01-01

    This article explores the morphological process of numeral incorporation in Japanese Sign Language. Numeral incorporation is defined and the available research on numeral incorporation in signed language is discussed. The numeral signs in Japanese Sign Language are then introduced and followed by an explanation of the numeral morphemes which are…

  2. Sign language comprehension: the case of Spanish sign language.

    PubMed

    Rodríguez Ortiz, I R

    2008-01-01

    This study aims to answer the question of how much of Spanish Sign Language interpreting deaf individuals really understand. Study sampling included 36 deaf people (deafness ranging from severe to profound; variety depending on the age at which they learned sign language) and 36 hearing people who had good knowledge of sign language (most were interpreters). Sign language comprehension was assessed using passages of secondary level. After being exposed to the passages, the participants had to tell what they had understood about them, answer a set of related questions, and offer a title for the passage. Sign language comprehension by deaf participants was quite acceptable but not as good as that of the hearing signers, who, unlike the deaf participants, were not only late learners of sign language as a second language but had also learned it through formal training.

  3. Exploring the Ancestral Roots of American Sign Language: Lexical Borrowing from Cistercian Sign Language and French Sign Language

    ERIC Educational Resources Information Center

    Cagle, Keith Martin

    2010-01-01

    American Sign Language (ASL) is the natural and preferred language of the Deaf community in both the United States and Canada. Woodward (1978) estimated that approximately 60% of the ASL lexicon is derived from early 19th century French Sign Language, which is known as "langue des signes française" (LSF). The lexicon of LSF and ASL may…

  4. Social Interaction Affects Neural Outcomes of Sign Language Learning As a Foreign Language in Adults.

    PubMed

    Yusa, Noriaki; Kim, Jungho; Koizumi, Masatoshi; Sugiura, Motoaki; Kawashima, Ryuta

    2017-01-01

    Children naturally acquire a language in social contexts where they interact with their caregivers. Indeed, research shows that social interaction facilitates lexical and phonological development at the early stages of child language acquisition. It is not clear, however, whether the relationship between social interaction and learning applies to adult second language acquisition of syntactic rules. Does learning second language syntactic rules through social interactions with a native speaker, as opposed to without such interactions, affect behavior and the brain? The current study aims to answer this question. Adult Japanese participants learned a new foreign language, Japanese Sign Language (JSL), either through a native deaf signer or via DVDs. Neural correlates of acquiring new linguistic knowledge were investigated using functional magnetic resonance imaging (fMRI). The participants in each group were indistinguishable in terms of their behavioral data after the instruction. The fMRI data, however, revealed significant differences in the neural activities between the two groups. Significant activations in the left inferior frontal gyrus (IFG) were found for the participants who learned JSL through interactions with the native signer. In contrast, no cortical activation change in the left IFG was found for the group who experienced the same visual input for the same duration via the DVD presentation. Given that the left IFG is involved in the syntactic processing of language, spoken or signed, learning through social interactions resulted in an fMRI signature typical of native speakers: activation of the left IFG. Thus, broadly speaking, availability of communicative interaction is necessary for second language acquisition, and this results in observed changes in the brain.

  5. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language.

    PubMed

    Williams, Joshua T; Newman, Sharlene D

    2017-02-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however, there have been relatively few studies that generalize these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel language activation in M2L2 learners of sign language and to characterize the influence of spoken language and sign language neighborhood density on the activation of ASL signs. A priming paradigm was used in which the neighbors of the sign target were activated with a spoken English word, and the activation of targets in sparse and dense neighborhoods was compared. Neighborhood density effects in the auditory-primed lexical decision task were then compared to previous reports of native deaf signers who were only processing sign language. Results indicated reversed neighborhood density effects in M2L2 learners relative to those in deaf signers, such that there were inhibitory effects of handshape density and facilitatory effects of location density. Additionally, increased inhibition for signs in dense handshape neighborhoods was greater for high-proficiency L2 learners. These findings support recent models of the hearing bimodal bilingual lexicon, which posit lateral links between spoken language and sign language lexical representations.

  6. Planning Sign Languages: Promoting Hearing Hegemony? Conceptualizing Sign Language Standardization

    ERIC Educational Resources Information Center

    Eichmann, Hanna

    2009-01-01

    In light of the absence of a codified standard variety in British Sign Language and German Sign Language ("Deutsche Gebärdensprache") there have been repeated calls for the standardization of both languages, primarily from outside the Deaf community. The paper is based on a recent grounded theory study which explored perspectives on sign…

  7. Signed Language Working Memory Capacity of Signed Language Interpreters and Deaf Signers

    ERIC Educational Resources Information Center

    Wang, Jihong; Napier, Jemina

    2013-01-01

    This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian sign language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an…

  8. Sign language Web pages.

    PubMed

    Fels, Deborah I; Richards, Jan; Hardman, Jim; Lee, Daniel G

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, accessing the Web can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The present article describes a system that allows sign language-only Web pages to be created and linked through a video-based technique called sign-linking. In two studies, 14 Deaf participants examined two iterations of sign-linked Web pages to gauge the usability and learnability of a signing Web page interface. The first study indicated that signing Web pages were usable by sign language users but that some interface features required improvement. The second study showed increased usability for those features; users consequently could navigate sign language information with ease and pleasure.

  9. American Sign Language

    MedlinePlus

    ... Langue des Signes Française). Today's ASL includes some elements of LSF plus the original local sign languages, which over the years ... evolves. It can also be used to model the essential elements and organization of natural language. Another NIDCD-funded research team is ...

  10. Signed language working memory capacity of signed language interpreters and deaf signers.

    PubMed

    Wang, Jihong; Napier, Jemina

    2013-04-01

    This study investigated the effects of hearing status and age of signed language acquisition on signed language working memory capacity. Professional Auslan (Australian sign language)/English interpreters (hearing native signers and hearing nonnative signers) and deaf Auslan signers (deaf native signers and deaf nonnative signers) completed an Auslan working memory (WM) span task. The results revealed that the hearing signers (i.e., the professional interpreters) significantly outperformed the deaf signers on the Auslan WM span task. However, the results showed no significant differences between the native signers and the nonnative signers in their Auslan working memory capacity. Furthermore, there was no significant interaction between hearing status and age of signed language acquisition. Additionally, the study found no significant differences between the deaf native signers (adults) and the deaf nonnative signers (adults) in their Auslan working memory capacity. The findings are discussed in relation to the participants' memory strategies and their early language experience. The findings present challenges for WM theories.

  11. Sociolinguistic Typology and Sign Languages

    PubMed Central

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological ‘complexity’ and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological ‘complexification’), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored. PMID:29515506

  12. Sociolinguistic Typology and Sign Languages.

    PubMed

    Schembri, Adam; Fenlon, Jordan; Cormier, Kearsy; Johnston, Trevor

    2018-01-01

    This paper examines the possible relationship between proposed social determinants of morphological 'complexity' and how this contributes to linguistic diversity, specifically via the typological nature of the sign languages of deaf communities. We sketch how the notion of morphological complexity, as defined by Trudgill (2011), applies to sign languages. Using these criteria, sign languages appear to be languages with low to moderate levels of morphological complexity. This may partly reflect the influence of key social characteristics of communities on the typological nature of languages. Although many deaf communities are relatively small and may involve dense social networks (both social characteristics that Trudgill claimed may lend themselves to morphological 'complexification'), the picture is complicated by the highly variable nature of the sign language acquisition for most deaf people, and the ongoing contact between native signers, hearing non-native signers, and those deaf individuals who only acquire sign languages in later childhood and early adulthood. These are all factors that may work against the emergence of morphological complexification. The relationship between linguistic typology and these key social factors may lead to a better understanding of the nature of sign language grammar. This perspective stands in contrast to other work where sign languages are sometimes presented as having complex morphology despite being young languages (e.g., Aronoff et al., 2005); in some descriptions, the social determinants of morphological complexity have not received much attention, nor has the notion of complexity itself been specifically explored.

  13. Signs of Change: Contemporary Attitudes to Australian Sign Language

    ERIC Educational Resources Information Center

    Slegers, Claudia

    2010-01-01

    This study explores contemporary attitudes to Australian Sign Language (Auslan). Since at least the 1960s, sign languages have been accepted by linguists as natural languages with all of the key ingredients common to spoken languages. However, these visual-spatial languages have historically been subject to ignorance and myth in Australia and…

  14. Standardization of Sign Languages

    ERIC Educational Resources Information Center

    Adam, Robert

    2015-01-01

    Over the years attempts have been made to standardize sign languages. This form of language planning has been tackled by a variety of agents, most notably teachers of Deaf students, social workers, government agencies, and occasionally groups of Deaf people themselves. Their efforts have most often involved the development of sign language books…

  15. Gesture, sign, and language: The coming of age of sign language and gesture studies.

    PubMed

    Goldin-Meadow, Susan; Brentari, Diane

    2017-01-01

    How does sign language compare with gesture, on the one hand, and spoken language on the other? Sign was once viewed as nothing more than a system of pictorial gestures without linguistic structure. More recently, researchers have argued that sign is no different from spoken language, with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the past 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We conclude that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because at present it is difficult to tell where sign stops and gesture begins, we suggest that sign should not be compared with speech alone but should be compared with speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that distinguishing between sign (or speech) and gesture is essential to predict certain types of learning and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture.

  16. Gesture, sign and language: The coming of age of sign language and gesture studies

    PubMed Central

    Goldin-Meadow, Susan; Brentari, Diane

    2016-01-01

    How does sign language compare to gesture, on the one hand, and to spoken language on the other? At one time, sign was viewed as nothing more than a system of pictorial gestures with no linguistic structure. More recently, researchers have argued that sign is no different from spoken language with all of the same linguistic structures. The pendulum is currently swinging back toward the view that sign is gestural, or at least has gestural components. The goal of this review is to elucidate the relationships among sign language, gesture, and spoken language. We do so by taking a close look not only at how sign has been studied over the last 50 years, but also at how the spontaneous gestures that accompany speech have been studied. We come to the conclusion that signers gesture just as speakers do. Both produce imagistic gestures along with more categorical signs or words. Because, at the moment, it is difficult to tell where sign stops and where gesture begins, we suggest that sign should not be compared to speech alone, but should be compared to speech-plus-gesture. Although it might be easier (and, in some cases, preferable) to blur the distinction between sign and gesture, we argue that making a distinction between sign (or speech) and gesture is essential to predict certain types of learning, and allows us to understand the conditions under which gesture takes on properties of sign, and speech takes on properties of gesture. We end by calling for new technology that may help us better calibrate the borders between sign and gesture. PMID:26434499

  17. Writing Signed Languages: What For? What Form?

    PubMed

    Grushkin, Donald A

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed languages, and presents the potential advantages of writing signed languages. Following a history of the development of writing in spoken as well as signed language populations, the effects of orthographic types upon literacy and biliteracy are explored. Attempts at writing signed languages have followed two primary paths: "alphabetic" and "iconographic." It is argued that for greatest congruency and ease in developing biliteracy strategies in societies where an alphabetic script is used for the spoken language, signed language communities within these societies are best served by adoption of an alphabetic script for writing their signed language.

  18. THE PARADOX OF SIGN LANGUAGE MORPHOLOGY

    PubMed Central

    Aronoff, Mark; Meir, Irit; Sandler, Wendy

    2011-01-01

    Sign languages have two strikingly different kinds of morphological structure: sequential and simultaneous. The simultaneous morphology of two unrelated sign languages, American and Israeli Sign Language, is very similar and is largely inflectional, while what little sequential morphology we have found differs significantly and is derivational. We show that at least two pervasive types of inflectional morphology, verb agreement and classifier constructions, are iconically grounded in spatiotemporal cognition, while the sequential patterns can be traced to normal historical development. We attribute the paucity of sequential morphology in sign languages to their youth. This research both brings sign languages much closer to spoken languages in their morphological structure and shows how the medium of communication contributes to the structure of languages. PMID:22223926

  19. Approaching sign language test construction: adaptation of the German sign language receptive skills test.

    PubMed

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired in preschool- and school-aged children (4-8 years old) is urgently needed. Using the British Sign Language Receptive Skills Test, which has been standardized and has sound psychometric properties, as a template for adaptation thus provides a starting point for tests of a sign language that is less documented, such as DGS. This article makes a novel contribution to the field by examining linguistic, cultural, and methodological issues in the process of adapting a test from the source language to the target language. The adapted DGS test has sound psychometric properties and provides the basis for revision prior to standardization. © The Author 2011. Published by Oxford University Press. All rights reserved.

  20. Adapting tests of sign language assessment for other sign languages--a review of linguistic, cultural, and psychometric problems.

    PubMed

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from one natural sign language to another. Two tests which have been adapted for several other sign languages are focused upon: the Test for American Sign Language and the British Sign Language Receptive Skills Test. A brief description is given of each test as well as insights from ongoing adaptations of these tests for other sign languages. The problems reported in these adaptations were found to be grounded in linguistic and cultural differences, which need to be considered for future test adaptations. Other reported shortcomings of test adaptation are related to the question of how well psychometric measures transfer from one instrument to another.

  1. The Legal Recognition of Sign Languages

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2015-01-01

    This article provides an analytical overview of the different types of explicit legal recognition of sign languages. Five categories are distinguished: constitutional recognition, recognition by means of general language legislation, recognition by means of a sign language law or act, recognition by means of a sign language law or act including…

  2. Sign Lowering and Phonetic Reduction in American Sign Language.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2010-04-01

    This study examines sign lowering as a form of phonetic reduction in American Sign Language. Phonetic reduction occurs in the course of normal language production, when instead of producing a carefully articulated form of a word, the language user produces a less clearly articulated form. When signs are produced in context by native signers, they often differ from the citation forms of signs. In some cases, phonetic reduction is manifested as a sign being produced at a lower location than in the citation form. Sign lowering has been documented previously, but this is the first study to examine it in phonetic detail. The data presented here are tokens of the sign WONDER, as produced by six native signers, in two phonetic contexts and at three signing rates, which were captured by optoelectronic motion capture. The results indicate that sign lowering occurred for all signers, according to the factors we manipulated. Sign production was affected by several phonetic factors that also influence speech production, namely, production rate, phonetic context, and position within an utterance. In addition, we have discovered interesting variations in sign production, which could underlie distinctions in signing style, analogous to accent or voice quality in speech.

  3. Sign Language and Hearing Preschoolers.

    ERIC Educational Resources Information Center

    Reynolds, Kate E.

    1995-01-01

    Notes that sign language is the third most used second language in the United States and that early childhood is an ideal language-learning time. Describes the experiences of one preschool where American Sign Language has become an integral part of the curriculum. Includes guiding principles, classroom do's and don'ts, and a resource list of…

  4. Language Policy and Planning: The Case of Italian Sign Language

    ERIC Educational Resources Information Center

    Geraci, Carlo

    2012-01-01

    Italian Sign Language (LIS) is the name of the language used by the Italian Deaf community. The acronym LIS derives from Lingua italiana dei segni ("Italian language of signs"), although nowadays Italians refer to LIS as Lingua dei segni italiana, reflecting the more appropriate phrasing "Italian sign language." Historically,…

  5. Visual cortex entrains to sign language.

    PubMed

    Brookshire, Geoffrey; Lu, Jenny; Nusbaum, Howard C; Goldin-Meadow, Susan; Casasanto, Daniel

    2017-06-13

    Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language below 5 Hz, peaking at ~1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over the auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
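
    As a concrete illustration of the kind of analysis described above, the sketch below computes a simple frame-differencing measure of visual change for a video and then a spectral coherence estimate against a single EEG channel, using NumPy and SciPy. The metric, frame rate, array shapes, and stand-in data are illustrative assumptions, not the authors' actual pipeline.

    ```python
    # Minimal illustrative sketch (not the authors' pipeline): quantify visual
    # change in a sign-language video as mean absolute frame-to-frame pixel
    # difference, then estimate spectral coherence between that signal and one
    # EEG channel. Frame rate, shapes, and the stand-in data are assumptions.
    import numpy as np
    from scipy.signal import coherence

    def visual_change(frames):
        """frames: (n_frames, height, width) grayscale video; returns a 1-D
        signal of mean absolute change between consecutive frames."""
        diffs = np.abs(np.diff(frames.astype(float), axis=0))
        return diffs.mean(axis=(1, 2))

    fs = 30.0                               # assumed video frame rate (Hz)
    frames = np.random.rand(900, 120, 160)  # 30 s of stand-in video
    eeg = np.random.randn(899)              # stand-in EEG, aligned to the diff signal

    vc = visual_change(frames)
    f, cxy = coherence(vc, eeg, fs=fs, nperseg=256)
    band = f < 5.0                          # band where the abstract reports entrainment
    print("peak coherence below 5 Hz:", cxy[band].max())
    ```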

  6. What sign language creation teaches us about language.

    PubMed

    Brentari, Diane; Coppola, Marie

    2013-03-01

    How do languages emerge? What are the necessary ingredients and circumstances that permit new languages to form? Various researchers within the disciplines of primatology, anthropology, psychology, and linguistics have offered different answers to this question depending on their perspective. Language acquisition, language evolution, primate communication, and the study of spoken varieties of pidgin and creoles address these issues, but in this article we describe a relatively new and important area that contributes to our understanding of language creation and emergence. Three types of communication systems that use the hands and body to communicate will be the focus of this article: gesture, homesign systems, and sign languages. The focus of this article is to explain why mapping the path from gesture to homesign to sign language has become an important research topic for understanding language emergence, not only for the field of sign languages, but also for language in general. WIREs Cogn Sci 2013, 4:201-211. doi: 10.1002/wcs.1212 For further resources related to this article, please visit the WIREs website. Copyright © 2012 John Wiley & Sons, Ltd.

  7. Awareness of Deaf Sign Language and Gang Signs.

    ERIC Educational Resources Information Center

    Smith, Cynthia; Morgan, Robert L.

    There have been increasing incidents of innocent people who use American Sign Language (ASL) or another form of sign language being victimized by gang violence due to misinterpretation of ASL hand formations. ASL is familiar to learners with a variety of disabilities, particularly those in the deaf community. The problem is that gang members have…

  8. Adaptation of a Vocabulary Test from British Sign Language to American Sign Language

    ERIC Educational Resources Information Center

    Mann, Wolfgang; Roy, Penny; Morgan, Gary

    2016-01-01

    This study describes the adaptation process of a vocabulary knowledge test for British Sign Language (BSL) into American Sign Language (ASL) and presents results from the first round of pilot testing with 20 deaf native ASL signers. The web-based test assesses the strength of deaf children's vocabulary knowledge by means of different mappings of…

  9. Arabic Sign Language: A Perspective

    ERIC Educational Resources Information Center

    Abdel-Fattah, M. A.

    2005-01-01

    Sign language in the Arab World has been recently recognized and documented. Many efforts have been made to establish the sign language used in individual countries, including Jordan, Egypt, Libya, and the Gulf States, by trying to standardize the language and spread it among members of the Deaf community and those concerned. Such efforts produced…

  10. Kinship in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Geer, Leah

    2011-01-01

    Information and research on Mongolian Sign Language is scant. To date, only one dictionary is available in the United States (Badnaa and Boll 1995), and even that dictionary presents only a subset of the signs employed in Mongolia. The present study describes the kinship system used in Mongolian Sign Language (MSL) based on data elicited from…

  11. Japanese-English language equivalence of the Cognitive Abilities Screening Instrument among Japanese-Americans.

    PubMed

    Gibbons, Laura E; McCurry, Susan; Rhoads, Kristoffer; Masaki, Kamal; White, Lon; Borenstein, Amy R; Larson, Eric B; Crane, Paul K

    2009-02-01

    The Cognitive Abilities Screening Instrument (CASI) was designed for use in cross-cultural studies of Japanese and Japanese-American elderly in Japan and the U.S.A. The measurement equivalence in Japanese and English had not been confirmed in prior studies. We analyzed the 40 CASI items for differential item functioning (DIF) related to test language, as well as self-reported proficiency with written Japanese, age, and educational attainment in two large epidemiologic studies of Japanese-American elderly: the Kame Project (n=1708) and the Honolulu-Asia Aging Study (HAAS; n = 3148). DIF was present if the demographic groups differed in the probability of success on an item, after controlling for their underlying cognitive functioning ability. While seven CASI items had DIF related to language of testing in Kame (registration of one item; recall of one item; similes; judgment; repeating a phrase; reading and performing a command; and following a three-step instruction), the impact of DIF on participants' scores was minimal. Mean scores for Japanese and English speakers in Kame changed by <0.1 SD after accounting for DIF related to test language. In HAAS, insufficient numbers of participants were tested in Japanese to assess DIF related to test language. In both studies, DIF related to written Japanese proficiency, age, and educational attainment had minimal impact. To the extent that DIF could be assessed, the CASI appeared to meet the goal of measuring cognitive function equivalently in Japanese and English. Stratified data collection would be needed to confirm this conclusion. DIF assessment should be used in other studies with multiple language groups to confirm that measures function equivalently or, if not, form scores that account for DIF.
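
    For readers unfamiliar with differential item functioning, the sketch below shows one common logistic-regression DIF screen: for a single item, the probability of a correct response is modeled from an ability estimate plus a test-language group term, and a significant group effect after controlling for ability flags potential DIF. This is a generic stand-in rather than the psychometric procedure reported in the study; all variable names and data are simulated.

    ```python
    # Hedged sketch of a generic logistic-regression DIF screen, NOT the
    # specific procedure used in the CASI study. All data are simulated.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    ability = rng.normal(size=n)            # stand-in overall ability estimate
    group = rng.integers(0, 2, size=n)      # hypothetical coding: 0 = English, 1 = Japanese
    p_correct = 1.0 / (1.0 + np.exp(-(0.9 * ability + 0.4 * group)))
    item = rng.binomial(1, p_correct)       # simulated responses to one item

    # Model item correctness from ability and group; the group coefficient
    # (controlling for ability) is the DIF signal of interest.
    X = sm.add_constant(np.column_stack([ability, group]))
    fit = sm.Logit(item, X).fit(disp=False)
    print("group coefficient:", fit.params[2], "p-value:", fit.pvalues[2])
    ```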

  12. Wavelets for sign language translation

    NASA Astrophysics Data System (ADS)

    Wilson, Beth J.; Anspach, Gretel

    1993-10-01

    Wavelet techniques are applied to help extract the relevant parameters of sign language from video images of a person communicating in American Sign Language or Signed English. The compression and edge detection features of two-dimensional wavelet analysis are exploited to enhance the algorithms under development to classify the hand motion, hand location with respect to the body, and handshape. These three parameters have different processing requirements and complexity issues. The results are described for applying various quadrature mirror filter designs to a filterbank implementation of the desired wavelet transform. The overall project is to develop a system that will translate sign language to English to facilitate communication between deaf and hearing people.
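
    To make the wavelet step concrete, here is a minimal sketch of a one-level two-dimensional wavelet decomposition of a single grayscale frame using PyWavelets. The 'db2' wavelet, the frame size, and the simple edge-energy map are illustrative assumptions rather than the quadrature mirror filter designs evaluated in the project.

    ```python
    # Illustrative sketch only: one-level 2-D wavelet decomposition of a frame.
    # The approximation sub-band gives a compressed view; the detail sub-bands
    # carry edge-like information usable for handshape/location features.
    import numpy as np
    import pywt

    frame = np.random.rand(240, 320)              # stand-in grayscale video frame

    # cA: low-pass approximation; cH, cV, cD: horizontal, vertical, diagonal detail.
    cA, (cH, cV, cD) = pywt.dwt2(frame, 'db2')

    edge_energy = np.sqrt(cH**2 + cV**2 + cD**2)  # crude edge-strength map
    print("approximation:", cA.shape, "edge map:", edge_energy.shape)
    ```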

  13. Approaching Sign Language Test Construction: Adaptation of the German Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias

    2011-01-01

    There is a current need for reliable and valid test instruments in different countries in order to monitor deaf children's sign language acquisition. However, very few tests are commercially available that offer strong evidence for their psychometric properties. A German Sign Language (DGS) test focusing on linguistic structures that are acquired…

  14. Hotel Employees' Japanese Language Experiences: Implications and Suggestions.

    ERIC Educational Resources Information Center

    Makita-Discekici, Yasuko

    1998-01-01

    Analyzes the Japanese language learning experiences of 13 hotel employees in Guam. Results of the study present implications and suggestions for a Japanese language program for the hotel industry. The project began as a result of hotel employees' frustrations when they were unable to communicate effectively with their Japanese guests. (Auth/JL)

  15. The road to language learning is iconic: evidence from British Sign Language.

    PubMed

    Thompson, Robin L; Vinson, David P; Woll, Bencie; Vigliocco, Gabriella

    2012-12-01

    An arbitrary link between linguistic form and meaning is generally considered a universal feature of language. However, iconic (i.e., nonarbitrary) mappings between properties of meaning and features of linguistic form are also widely present across languages, especially signed languages. Although recent research has shown a role for sign iconicity in language processing, research on the role of iconicity in sign-language development has been mixed. In this article, we present clear evidence that iconicity plays a role in sign-language acquisition for both the comprehension and production of signs. Signed languages were taken as a starting point because they tend to encode a higher degree of iconic form-meaning mappings in their lexicons than spoken languages do, but our findings are more broadly applicable: Specifically, we hypothesize that iconicity is fundamental to all languages (signed and spoken) and that it serves to bridge the gap between linguistic form and human experience.

  16. Eye Gaze in Creative Sign Language

    ERIC Educational Resources Information Center

    Kaneko, Michiko; Mesch, Johanna

    2013-01-01

    This article discusses the role of eye gaze in creative sign language. Because eye gaze conveys various types of linguistic and poetic information, it is an intrinsic part of sign language linguistics in general and of creative signing in particular. We discuss various functions of eye gaze in poetic signing and propose a classification of gaze…

  17. Adapting Tests of Sign Language Assessment for Other Sign Languages--A Review of Linguistic, Cultural, and Psychometric Problems

    ERIC Educational Resources Information Center

    Haug, Tobias; Mann, Wolfgang

    2008-01-01

    Given the current lack of appropriate assessment tools for measuring deaf children's sign language skills, many test developers have used existing tests of other sign languages as templates to measure the sign language used by deaf people in their country. This article discusses factors that may influence the adaptation of assessment tests from…

  18. Functional and anatomical correlates of word-, sentence-, and discourse-level integration in sign language

    PubMed Central

    Inubushi, Tomoo; Sakai, Kuniyoshi L.

    2013-01-01

    In both vocal and sign languages, we can distinguish word-, sentence-, and discourse-level integration in terms of hierarchical processes, which integrate various elements into another higher level of constructs. In the present study, we used magnetic resonance imaging and voxel-based morphometry (VBM) to test three language tasks in Japanese Sign Language (JSL): word-level (Word), sentence-level (Sent), and discourse-level (Disc) decision tasks. We analyzed cortical activity and gray matter (GM) volumes of Deaf signers, and clarified three major points. First, we found that the activated regions in the frontal language areas gradually expanded in the dorso-ventral axis, corresponding to a difference in linguistic units for the three tasks. Moreover, the activations in each region of the frontal language areas were incrementally modulated with the level of linguistic integration. These dual mechanisms of the frontal language areas may reflect a basic organization principle of hierarchically integrating linguistic information. Secondly, activations in the lateral premotor cortex and inferior frontal gyrus were left-lateralized. Direct comparisons among the language tasks exhibited more focal activation in these regions, suggesting their functional localization. Thirdly, we found significantly positive correlations between individual task performances and GM volumes in localized regions, even when the ages of acquisition (AOAs) of JSL and Japanese were factored out. More specifically, correlations with the performances of the Word and Sent tasks were found in the left precentral/postcentral gyrus and insula, respectively, while correlations with those of the Disc task were found in the left ventral inferior frontal gyrus and precuneus. The unification of functional and anatomical studies would thus be fruitful for understanding human language systems from the aspects of both universality and individuality. PMID:24155706

  19. Early Sign Language Exposure and Cochlear Implantation Benefits.

    PubMed

    Geers, Ann E; Mitchell, Christine M; Warner-Czyz, Andrea; Wang, Nae-Yuh; Eisenberg, Laurie S

    2017-07-01

    Most children with hearing loss who receive cochlear implants (CI) learn spoken language, and parents must choose early on whether to use sign language to accompany speech at home. We address whether parents' use of sign language before and after CI positively influences auditory-only speech recognition, speech intelligibility, spoken language, and reading outcomes. Three groups of children with CIs from a nationwide database who differed in the duration of early sign language exposure provided in their homes were compared in their progress through elementary grades. The groups did not differ in demographic, auditory, or linguistic characteristics before implantation. Children without early sign language exposure achieved better speech recognition skills over the first 3 years postimplant and exhibited a statistically significant advantage in spoken language and reading near the end of elementary grades over children exposed to sign language. Over 70% of children without sign language exposure achieved age-appropriate spoken language compared with only 39% of those exposed for 3 or more years. Early speech perception predicted speech intelligibility in middle elementary grades. Children without sign language exposure produced speech that was more intelligible (mean = 70%) than those exposed to sign language (mean = 51%). This study provides the most compelling support yet available in CI literature for the benefits of spoken language input for promoting verbal development in children implanted by 3 years of age. Contrary to earlier published assertions, there was no advantage to parents' use of sign language either before or after CI. Copyright © 2017 by the American Academy of Pediatrics.

  20. Sign Language Planning: Pragmatism, Pessimism and Principles

    ERIC Educational Resources Information Center

    Turner, Graham H.

    2009-01-01

    This article introduces the present collection of sign language planning studies. Contextualising the analyses against the backdrop of core issues in the theory of language planning and the evolution of applied sign linguistics, it is argued that--while the sociolinguistic circumstances of signed languages worldwide can, in many respects, be…

  1. Problems for a Sign Language Planning Agency

    ERIC Educational Resources Information Center

    Covington, Virginia

    1977-01-01

    American Sign Language is chiefly untaught and nonstandardized. The Communicative Skills Program of the National Association of the Deaf aims to provide sign language classes for hearing personnel and to increase interpreting services. Programs, funding and aims of the Program are outlined. A government sign language planning agency is proposed.…

  2. One grammar or two? Sign Languages and the Nature of Human Language

    PubMed Central

    Lillo-Martin, Diane C; Gajewski, Jon

    2014-01-01

    Linguistic research has identified abstract properties that seem to be shared by all languages—such properties may be considered defining characteristics. In recent decades, the recognition that human language is found not only in the spoken modality but also in the form of sign languages has led to a reconsideration of some of these potential linguistic universals. In large part, the linguistic analysis of sign languages has led to the conclusion that universal characteristics of language can be stated at an abstract enough level to include languages in both spoken and signed modalities. For example, languages in both modalities display hierarchical structure at sub-lexical and phrasal level, and recursive rule application. However, this does not mean that modality-based differences between signed and spoken languages are trivial. In this article, we consider several candidate domains for modality effects, in light of the overarching question: are signed and spoken languages subject to the same abstract grammatical constraints, or is a substantially different conception of grammar needed for the sign language case? We look at differences between language types based on the use of space, iconicity, and the possibility for simultaneity in linguistic expression. The inclusion of sign languages does support some broadening of the conception of human language—in ways that are applicable for spoken languages as well. Still, the overall conclusion is that one grammar applies for human language, no matter the modality of expression. PMID:25013534

  3. On the System of Person-Denoting Signs in Estonian Sign Language: Estonian Name Signs

    ERIC Educational Resources Information Center

    Paales, Liina

    2010-01-01

    This article discusses Estonian personal name signs. According to the study, there are four personal name sign categories in Estonian Sign Language: (1) arbitrary name signs; (2) descriptive name signs; (3) initialized-descriptive name signs; (4) loan/borrowed name signs. Descriptive and borrowed personal name signs are the most commonly represented among…

  4. Similarities & Differences in Two Brazilian Sign Languages.

    ERIC Educational Resources Information Center

    Ferreira-Brito, Lucinda

    1984-01-01

    Comparison of sign language used by Urubu-Kaapor Indians in the Amazonian jungle (UKSL) and sign language used by deaf people in São Paulo (SPSL). In the former situation, deaf people are more integrated and accepted into their community than in São Paulo, because most hearing individuals are able and willing to use sign language to communicate with…

  5. Dictionaries of African Sign Languages: An Overview

    ERIC Educational Resources Information Center

    Schmaling, Constanze H.

    2012-01-01

    This article gives an overview of dictionaries of African sign languages that have been published to date most of which have not been widely distributed. After an introduction into the field of sign language lexicography and a discussion of some of the obstacles that authors of sign language dictionaries face in general, I will show problems…

  6. Unlocking Australia's Language Potential. Profiles of 9 Key Languages in Australia. Volume 7: Japanese.

    ERIC Educational Resources Information Center

    Marriott, Helen; And Others

    The report on the status of Japanese language teaching in Australia gives a broad view of Japanese study and discusses current educational issues in some detail. An introductory chapter offers a brief overview of the history, objectives, and issues of Japanese language instruction in Australia. The second chapter details features of instructional…

  7. The sign language skills classroom observation: a process for describing sign language proficiency in classroom settings.

    PubMed

    Reeves, J B; Newell, W; Holcomb, B R; Stinson, M

    2000-10-01

    In collaboration with teachers and students at the National Technical Institute for the Deaf (NTID), the Sign Language Skills Classroom Observation (SLSCO) was designed to provide feedback to teachers on their sign language communication skills in the classroom. In the present article, the impetus and rationale for development of the SLSCO is discussed. Previous studies related to classroom signing and observation methodology are reviewed. The procedure for developing the SLSCO is then described. This procedure included (a) interviews with faculty and students at NTID, (b) identification of linguistic features of sign language important for conveying content to deaf students, (c) development of forms for recording observations of classroom signing, (d) analysis of use of the forms, (e) development of a protocol for conducting the SLSCO, and (f) piloting of the SLSCO in classrooms. The results of use of the SLSCO with NTID faculty during a trial year are summarized.

  8. The Use of Sign Language Pronouns by Native-Signing Children with Autism.

    PubMed

    Shield, Aaron; Meier, Richard P; Tager-Flusberg, Helen

    2015-07-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are indexical points to self and other. Despite this transparency, we find evidence from an elicitation task and parental report that signing children with ASD avoid sign pronouns in favor of names. An analysis of spontaneous usage showed that all children demonstrated the ability to point, but only children with better-developed sign language produced pronouns. Differences in language abilities and self-representation may explain these phenomena in sign and speech.

  9. Grammar, Gesture, and Meaning in American Sign Language.

    ERIC Educational Resources Information Center

    Liddell, Scott K.

    In sign languages of the Deaf, now recognized as fully legitimate human languages, some signs can meaningfully point toward things or can be meaningfully placed in the space ahead of the signer. Such spatial uses of sign are an obligatory part of fluent grammatical signing. There is no parallel for this in vocally produced languages. This book…

  10. Syntactic priming in American Sign Language.

    PubMed

    Hall, Matthew L; Ferreira, Victor S; Mayberry, Rachel I

    2015-01-01

    Psycholinguistic studies of sign language processing provide valuable opportunities to assess whether language phenomena, which are primarily studied in spoken language, are fundamentally shaped by peripheral biology. For example, we know that when given a choice between two syntactically permissible ways to express the same proposition, speakers tend to choose structures that were recently used, a phenomenon known as syntactic priming. Here, we report two experiments testing syntactic priming of a noun phrase construction in American Sign Language (ASL). Experiment 1 shows that second language (L2) signers with normal hearing exhibit syntactic priming in ASL and that priming is stronger when the head noun is repeated between prime and target (the lexical boost effect). Experiment 2 shows that syntactic priming is equally strong among deaf native L1 signers, deaf late L1 learners, and hearing L2 signers. Experiment 2 also tested for, but did not find evidence of, phonological or semantic boosts to syntactic priming in ASL. These results show that despite the profound differences between spoken and signed languages in terms of how they are produced and perceived, the psychological representation of sentence structure (as assessed by syntactic priming) operates similarly in sign and speech.

  11. SignMT: An Alternative Language Learning Tool

    ERIC Educational Resources Information Center

    Ditcharoen, Nadh; Naruedomkul, Kanlaya; Cercone, Nick

    2010-01-01

    Learning a second language is very difficult, especially for the disabled; the disability may be a barrier to learning and to utilizing information written in text form. We present SignMT, a Thai sign to Thai machine translation system, which is able to translate from Thai sign language into Thai text. In the translation process, SignMT takes into…

  12. From gesture to sign language: conventionalization of classifier constructions by adult hearing learners of British Sign Language.

    PubMed

    Marshall, Chloë R; Morgan, Gary

    2015-01-01

    There has long been interest in why languages are shaped the way they are, and in the relationship between sign language and gesture. In sign languages, entity classifiers are handshapes that encode how objects move, how they are located relative to one another, and how multiple objects of the same type are distributed in space. Previous studies have shown that hearing adults who are asked to use only manual gestures to describe how objects move in space will use gestures that bear some similarities to classifiers. We investigated how accurately hearing adults, who had been learning British Sign Language (BSL) for 1-3 years, produce and comprehend classifiers in (static) locative and distributive constructions. In a production task, learners of BSL knew that they could use their hands to represent objects, but they had difficulty choosing the same, conventionalized, handshapes as native signers. They were, however, highly accurate at encoding location and orientation information. Learners therefore show the same pattern found in sign-naïve gesturers. In contrast, handshape, orientation, and location were comprehended with equal (high) accuracy, and testing a group of sign-naïve adults showed that they too were able to understand classifiers with higher than chance accuracy. We conclude that adult learners of BSL bring their visuo-spatial knowledge and gestural abilities to the tasks of understanding and producing constructions that contain entity classifiers. We speculate that investigating the time course of adult sign language acquisition might shed light on how gesture became (and, indeed, becomes) conventionalized during the genesis of sign languages. Copyright © 2014 Cognitive Science Society, Inc.

  13. Assessing Japanese Language Needs for Business and Professional Use.

    ERIC Educational Resources Information Center

    Saito, Yoshiko

    A study is reported that aimed to: (1) assess the perceived need for Japanese language among students and business faculty; (2) assess the Japanese language needs of business professionals who work with Japan; (3) determine what language abilities and levels of proficiency are desired; and (4) identify perceived problem areas and ways that they are handled…

  14. Direction Asymmetries in Spoken and Signed Language Interpreting

    ERIC Educational Resources Information Center

    Nicodemus, Brenda; Emmorey, Karen

    2013-01-01

    Spoken language (unimodal) interpreters often prefer to interpret from their non-dominant language (L2) into their native language (L1). Anecdotally, signed language (bimodal) interpreters express the opposite bias, preferring to interpret from L1 (spoken language) into L2 (signed language). We conducted a large survey study ("N" =…

  15. Sentence Repetition in Deaf Children with Specific Language Impairment in British Sign Language

    ERIC Educational Resources Information Center

    Marshall, Chloë; Mason, Kathryn; Rowley, Katherine; Herman, Rosalind; Atkinson, Joanna; Woll, Bencie; Morgan, Gary

    2015-01-01

    Children with specific language impairment (SLI) perform poorly on sentence repetition tasks across different spoken languages, but until now, this methodology has not been investigated in children who have SLI in a signed language. Users of a natural sign language encode different sentence meanings through their choice of signs and by altering…

  16. Are Australian Fans of Anime and Manga Motivated to Learn Japanese Language?

    ERIC Educational Resources Information Center

    Armour, William S.; Iida, Sumiko

    2016-01-01

    Recent research into Japanese as a foreign language education has strongly emphasized the link between Japanese popular culture and learning Japanese. However, these studies have only targeted Japanese language learners in formal education contexts and have largely ignored those who are not studying Japanese or studying Japanese informally. This…

  17. Writing Signed Languages: What for? What Form?

    ERIC Educational Resources Information Center

    Grushkin, Donald A.

    2017-01-01

    Signed languages around the world have tended to maintain an "oral," unwritten status. Despite the advantages of possessing a written form of their language, signed language communities typically resist and reject attempts to create such written forms. The present article addresses many of the arguments against written forms of signed…

  18. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013), pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into account.

  19. Imitation, Sign Language Skill and the Developmental Ease of Language Understanding (D-ELU) Model

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2016-01-01

    Imitation and language processing are closely connected. According to the Ease of Language Understanding (ELU) model (Rönnberg et al., 2013) pre-existing mental representation of lexical items facilitates language understanding. Thus, imitation of manual gestures is likely to be enhanced by experience of sign language. We tested this by eliciting imitation of manual gestures from deaf and hard-of-hearing (DHH) signing and hearing non-signing children at a similar level of language and cognitive development. We predicted that the DHH signing children would be better at imitating gestures lexicalized in their own sign language (Swedish Sign Language, SSL) than unfamiliar British Sign Language (BSL) signs, and that both groups would be better at imitating lexical signs (SSL and BSL) than non-signs. We also predicted that the hearing non-signing children would perform worse than DHH signing children with all types of gestures the first time (T1) we elicited imitation, but that the performance gap between groups would be reduced when imitation was elicited a second time (T2). Finally, we predicted that imitation performance on both occasions would be associated with linguistic skills, especially in the manual modality. A split-plot repeated measures ANOVA demonstrated that DHH signers imitated manual gestures with greater precision than non-signing children when imitation was elicited the second but not the first time. Manual gestures were easier to imitate for both groups when they were lexicalized than when they were not; but there was no difference in performance between familiar and unfamiliar gestures. For both groups, language skills at T1 predicted imitation at T2. Specifically, for DHH children, word reading skills, comprehension and phonological awareness of sign language predicted imitation at T2. For the hearing participants, language comprehension predicted imitation at T2, even after the effects of working memory capacity and motor skills were taken into

  20. Research Ethics in Sign Language Communities

    ERIC Educational Resources Information Center

    Harris, Raychelle; Holmes, Heidi M.; Mertens, Donna M.

    2009-01-01

    Codes of ethics exist for most professional associations whose members do research on, for, or with sign language communities. However, these ethical codes are silent regarding the need to frame research ethics from a cultural standpoint, an issue of particular salience for sign language communities. Scholars who write from the perspective of…

  1. The role of syllables in sign language production

    PubMed Central

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Signed Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production. PMID:25431562

  2. The role of syllables in sign language production.

    PubMed

    Baus, Cristina; Gutiérrez, Eva; Carreiras, Manuel

    2014-01-01

    The aim of the present study was to investigate the functional role of syllables in sign language and how the different phonological combinations influence sign production. Moreover, the influence of age of acquisition was evaluated. Deaf signers (native and non-native) of Catalan Signed Language (LSC) were asked in a picture-sign interference task to sign picture names while ignoring distractor-signs with which they shared two phonological parameters (out of three of the main sign parameters: Location, Movement, and Handshape). The results revealed a different impact of the three phonological combinations. While no effect was observed for the phonological combination Handshape-Location, the combination Handshape-Movement slowed down signing latencies, but only in the non-native group. A facilitatory effect was observed for both groups when pictures and distractors shared Location-Movement. Importantly, linguistic models have considered this phonological combination to be a privileged unit in the composition of signs, as syllables are in spoken languages. Thus, our results support the functional role of syllable units during phonological articulation in sign language production.

  3. Audience Effects in American Sign Language Interpretation

    ERIC Educational Resources Information Center

    Weisenberg, Julia

    2009-01-01

    There is a system of English mouthing during interpretation that appears to be the result of language contact between spoken language and signed language. English mouthing is a voiceless visual representation of words on a signer's lips produced concurrently with manual signs. It is a type of borrowing prevalent among English-dominant…

  4. A Field Guide for Sign Language Research.

    ERIC Educational Resources Information Center

    Stokoe, William; Kuschel, Rolf

    Field researchers of sign language are the target of this methodological guide. The prospective researcher is briefed on the rationale of sign language study as language study and as distinct from the study of kinesics. Subjects covered include problems of translating, use of interpreters, and ethics. Instruments for obtaining social and language…

  5. Spoken Language Activation Alters Subsequent Sign Language Activation in L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2017-01-01

    A large body of literature has characterized unimodal monolingual and bilingual lexicons and how neighborhood density affects lexical access; however there have been relatively fewer studies that generalize these findings to bimodal (M2) second language (L2) learners of sign languages. The goal of the current study was to investigate parallel…

  6. Constraints on Negative Prefixation in Polish Sign Language.

    PubMed

    Tomaszewski, Piotr

    2015-01-01

    The aim of this article is to describe a negative prefix, NEG-, in Polish Sign Language (PJM) which appears to be indigenous to the language. This is of interest given the relative rarity of prefixes in sign languages. Prefixed PJM signs were analyzed on the basis of both a corpus of texts signed by 15 deaf PJM users who are either native or near-native signers, and material including a specified range of prefixed signs as demonstrated by native signers in dictionary form (i.e. signs produced in isolation, not as part of phrases or sentences). In order to define the morphological rules behind prefixation on both the phonological and morphological levels, native PJM users were consulted for their expertise. The research results can enrich models for describing processes of grammaticalization in the context of the visual-gestural modality that forms the basis for sign language structure.

  7. The psychotherapist and the sign language interpreter.

    PubMed

    de Bruin, Ed; Brugmans, Petra

    2006-01-01

    Specialized psychotherapy for deaf people in the Dutch and Western European mental health systems is still a rather young specialism. A key policy principle in Dutch mental health care for the deaf is that they should receive treatment in the language most accessible to them, which is usually Dutch Sign Language (Nederlandse Gebarentaal or NGT). Although psychotherapists for the deaf are trained to use sign language, situations will always arise in which a sign language interpreter is needed. Most psychotherapists have the opinion that working with a sign language interpreter in therapy sessions can be a valuable alternative option but also see it as a second-best solution because of its impact on the therapeutic process. This paper describes our years of collaboration as a therapist and a sign language interpreter. If this collaboration is optimal, it can generate a certain "therapeutic power" in the therapy sessions. Achieving this depends largely on the interplay between the therapist and the interpreter, which in our case is the result of literature research and our experiences during the last 17 years. We analyze this special collaborative relationship, which has several dimensions and recurrent themes such as the role conception of the interpreter, situational interpreting, organizing the interpretation setting, and managing therapeutic phenomena during therapy sessions.

  8. Using Sign Language in Your Classroom.

    ERIC Educational Resources Information Center

    Lawrence, Constance D.

    This paper reviews the research on use of American Sign Language in elementary classes that do not include children with hearing impairment and also reports on the use of the manual sign language alphabet in a primary class learning the phonetic sounds of the alphabet. The research reported is overwhelmingly positive in support of using sign…

  9. Phonological Awareness for American Sign Language

    ERIC Educational Resources Information Center

    Corina, David P.; Hafer, Sarah; Welch, Kearnan

    2014-01-01

    This paper examines the concept of phonological awareness (PA) as it relates to the processing of American Sign Language (ASL). We present data from a recently developed test of PA for ASL and examine whether sign language experience impacts the use of metalinguistic routines necessary for completion of our task. Our data show that deaf signers…

  10. Lexical access in sign language: a computational model.

    PubMed

    Caselli, Naomi K; Cohen-Goldberg, Ariel M

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.
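
    The following is a minimal, illustrative sketch of a spreading-activation lexicon of the kind this abstract describes, in which lexical nodes receive excitation from shared sub-lexical units (handshape, location) and inhibit one another. The toy lexicon and all parameter values (decay, excitation, inhibition) are assumptions for illustration; they are not taken from Caselli and Cohen-Goldberg (2014) or Chen and Mirman (2012).

    ```python
    # Illustrative spreading-activation sketch: lexical nodes are excited in proportion
    # to their sub-lexical overlap with the input sign and inhibited by competitors.
    # The lexicon and all parameters are hypothetical.
    import numpy as np

    # Toy lexicon: each sign is a set of sub-lexical units (handshape H*, location L*).
    LEXICON = {
        "SIGN_A": {"H1", "L1"},
        "SIGN_B": {"H1", "L2"},   # shares handshape with SIGN_A
        "SIGN_C": {"H2", "L1"},   # shares location with SIGN_A
    }

    def simulate(target="SIGN_A", steps=30, decay=0.2, excite=0.5, inhibit=0.3):
        """Run discrete activation-update steps and return final activations."""
        signs = list(LEXICON)
        act = {s: 0.0 for s in signs}
        target_units = LEXICON[target]
        for _ in range(steps):
            new_act = {}
            for s in signs:
                # Bottom-up excitation proportional to sub-lexical overlap with the input.
                overlap = len(LEXICON[s] & target_units) / len(LEXICON[s])
                # Lateral inhibition from all other (competing) lexical nodes.
                competition = sum(act[o] for o in signs if o != s)
                delta = excite * overlap - inhibit * competition - decay * act[s]
                new_act[s] = min(1.0, max(0.0, act[s] + delta))
            act = new_act
        return act

    if __name__ == "__main__":
        for sign, activation in simulate().items():
            print(f"{sign}: {activation:.3f}")
    ```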

  11. Indonesian Sign Language Number Recognition using SIFT Algorithm

    NASA Astrophysics Data System (ADS)

    Mahfudi, Isa; Sarosa, Moechammad; Andrie Asmara, Rosa; Azrino Gustalika, M.

    2018-04-01

    Indonesian Sign Language (ISL) is the primary language of many deaf individuals in Indonesia and comprises two types of action: signs and fingerspelling. However, because most hearing people do not understand sign language, communication with the hearing community is difficult, and this contributes to deaf signers feeling isolated from social life. A solution is needed to help them interact with hearing people, and much research has proposed image-processing methods for sign language recognition. The SIFT (Scale Invariant Feature Transform) algorithm is one method that can be used to identify an object and is claimed to be highly robust to scaling, rotation, illumination changes, and noise. Applying SIFT to Indonesian Sign Language number recognition yielded a recognition rate of 82% on a dataset of 100 images, consisting of 50 training samples and 50 test samples. Changing the threshold value affects the recognition result; the best threshold value was 0.45, with a recognition rate of 94%.
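
    A minimal OpenCV sketch of the SIFT-based matching pipeline described above is given below. The gallery layout, file paths, and the reading of the reported 0.45 threshold as Lowe's ratio-test value are illustrative assumptions, not details taken from the paper.

    ```python
    # Sketch of SIFT-based sign (number) recognition: match query descriptors against a
    # gallery of training images per digit and pick the class with the most good matches.
    import cv2

    RATIO_THRESHOLD = 0.45  # the abstract reports 0.45 as the best-performing threshold

    def sift_descriptors(path):
        """Compute SIFT descriptors for a grayscale image on disk."""
        sift = cv2.SIFT_create()
        image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, descriptors = sift.detectAndCompute(image, None)
        return descriptors

    def count_good_matches(query_desc, train_desc, ratio=RATIO_THRESHOLD):
        """Count descriptor matches that pass the ratio test."""
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = 0
        for pair in matcher.knnMatch(query_desc, train_desc, k=2):
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
                good += 1
        return good

    def classify(query_path, gallery):
        """gallery: dict mapping class label -> list of training image paths."""
        query_desc = sift_descriptors(query_path)
        scores = {label: max(count_good_matches(query_desc, sift_descriptors(p))
                             for p in paths)
                  for label, paths in gallery.items()}
        return max(scores, key=scores.get)

    if __name__ == "__main__":
        gallery = {str(d): [f"train/{d}_{i}.jpg" for i in range(5)] for d in range(10)}
        print("predicted digit:", classify("test/sample.jpg", gallery))
    ```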

  12. Lexical access in sign language: a computational model

    PubMed Central

    Caselli, Naomi K.; Cohen-Goldberg, Ariel M.

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition. PMID:24860539

  13. The Influence of Deaf People's Dual Category Status on Sign Language Planning: The British Sign Language (Scotland) Act (2015)

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    Through the British Sign Language (Scotland) Act, British Sign Language (BSL) was given legal status in Scotland. The main motives for the Act were a desire to put BSL on a similar footing with Gaelic and the fact that in Scotland, BSL signers are the only group whose first language is not English who must rely on disability discrimination…

  14. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language

    PubMed Central

    Ferjan Ramirez, Naja; Leonard, Matthew K.; Davenport, Tristan S.; Torres, Christina; Halgren, Eric; Mayberry, Rachel I.

    2016-01-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772–2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. PMID:25410427

  15. Sociolinguistic Variation and Change in British Sign Language Number Signs: Evidence of Leveling?

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Fenlon, Jordan; Rentelis, Ramas

    2015-01-01

    This article presents findings from the first major study to investigate lexical variation and change in British Sign Language (BSL) number signs. As part of the BSL Corpus Project, number sign variants were elicited from 249 deaf signers from eight sites throughout the UK. Age, school location, and language background were found to be significant…

  16. A Stronger Reason for the Right to Sign Languages

    ERIC Educational Resources Information Center

    Trovato, Sara

    2013-01-01

    Is the right to sign language only the right to a minority language? Holding a capability (not a disability) approach, and building on the psycholinguistic literature on sign language acquisition, I make the point that this right is of a stronger nature, since only sign languages can guarantee that each deaf child will properly develop the…

  17. Comparing Action Gestures and Classifier Verbs of Motion: Evidence from Australian Sign Language, Taiwan Sign Language, and Nonsigners' Gestures without Speech

    ERIC Educational Resources Information Center

    Schembri, Adam; Jones, Caroline; Burnham, Denis

    2005-01-01

    Recent research into signed languages indicates that signs may share some properties with gesture, especially in the use of space in classifier constructions. A prediction of this proposal is that there will be similarities in the representation of motion events by sign-naive gesturers and by native signers of unrelated signed languages. This…

  18. Papers in Linguistics. Volume 16. Studies in Japanese Language Use and Studies in the Languages of the USSR.

    ERIC Educational Resources Information Center

    Miyagawa, Shigeru, Ed.; And Others

    1983-01-01

    A volume combining two special issues of "Papers in Linguistics" contains 10 papers concerning Japanese language use and 12 concerning languages of the U.S.S.R. The papers on Japanese include: "Intrusion in Japanese Conversation," "Japanese Use of English Loans," "Some Discourse Principles and Lengthy Sentences in…

  19. Subarashii: Encounters in Japanese Spoken Language Education.

    ERIC Educational Resources Information Center

    Bernstein, Jared; Najmi, Amir; Ehsani, Farzad

    1999-01-01

    Describes Subarashii, an experimental computer-based interactive spoken-language education system designed to understand what a student is saying in Japanese and respond in a meaningful way in spoken Japanese. Implementation of a preprototype version of the Subarashii system identified strengths and limitations of continuous speech recognition…

  20. Who Is Qualified to Teach American Sign Language?

    ERIC Educational Resources Information Center

    Kanda, Jan; Fleischer, Larry

    1988-01-01

    Teachers of American Sign Language (ASL) can no longer qualify just by being able to sign well or by being deaf. ASL teachers must respect the language and its history, feel comfortable interacting with the deaf community, have completed formal study of language and pedagogy, be familiar with second-language teaching, and engage in personal and…

  1. Memory for Nonsemantic Attributes of American Sign Language Signs and English Words

    ERIC Educational Resources Information Center

    Siple, Patricia

    1977-01-01

    Two recognition memory experiments were used to study the retention of language and modality of input. A bilingual list of American Sign Language signs and English words was presented to two deaf and two hearing groups, one instructed to remember mode of input, and one hearing group. Findings are analyzed. (CHK)

  2. Neural Language Processing in Adolescent First-Language Learners: Longitudinal Case Studies in American Sign Language.

    PubMed

    Ferjan Ramirez, Naja; Leonard, Matthew K; Davenport, Tristan S; Torres, Christina; Halgren, Eric; Mayberry, Rachel I

    2016-03-01

    One key question in neurolinguistics is the extent to which the neural processing system for language requires linguistic experience during early life to develop fully. We conducted a longitudinal anatomically constrained magnetoencephalography (aMEG) analysis of lexico-semantic processing in 2 deaf adolescents who had no sustained language input until 14 years of age, when they became fully immersed in American Sign Language. After 2 to 3 years of language, the adolescents' neural responses to signed words were highly atypical, localizing mainly to right dorsal frontoparietal regions and often responding more strongly to semantically primed words (Ferjan Ramirez N, Leonard MK, Torres C, Hatrak M, Halgren E, Mayberry RI. 2014. Neural language processing in adolescent first-language learners. Cereb Cortex. 24 (10): 2772-2783). Here, we show that after an additional 15 months of language experience, the adolescents' neural responses remained atypical in terms of polarity. While their responses to less familiar signed words still showed atypical localization patterns, the localization of responses to highly familiar signed words became more concentrated in the left perisylvian language network. Our findings suggest that the timing of language experience affects the organization of neural language processing; however, even in adolescence, language representation in the human brain continues to evolve with experience. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Monitoring Different Phonological Parameters of Sign Language Engages the Same Cortical Language Network but Distinctive Perceptual Ones.

    PubMed

    Cardin, Velia; Orfanidou, Eleni; Kästner, Lena; Rönnberg, Jerker; Woll, Bencie; Capek, Cheryl M; Rudner, Mary

    2016-01-01

    The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.

  4. Phonological Similarity in American Sign Language.

    ERIC Educational Resources Information Center

    Hildebrandt, Ursula; Corina, David

    2002-01-01

    Investigates deaf and hearing subjects' ratings of American Sign Language (ASL) signs to assess whether linguistic experience shapes judgments of sign similarity. Findings are consistent with linguistic theories that posit movement and location as core structural elements of syllable structure in ASL. (Author/VWL)

  5. Examination of Sign Language Education According to the Opinions of Members from a Basic Sign Language Certification Program

    ERIC Educational Resources Information Center

    Akmese, Pelin Pistav

    2016-01-01

    Being hearing impaired limits one's ability to communicate in that it affects all areas of development, particularly speech. One of the methods the hearing impaired use to communicate is sign language. This study, a descriptive study, intends to examine the opinions of individuals who had enrolled in a sign language certification program by using…

  6. The Phonetics of Head and Body Movement in the Realization of American Sign Language Signs.

    PubMed

    Tyrone, Martha E; Mauk, Claude E

    2016-01-01

    Because the primary articulators for sign languages are the hands, sign phonology and phonetics have focused mainly on them and treated other articulators as passive targets. However, there is abundant research on the role of nonmanual articulators in sign language grammar and prosody. The current study examines how hand and head/body movements are coordinated to realize phonetic targets. Kinematic data were collected from 5 deaf American Sign Language (ASL) signers to allow the analysis of movements of the hands, head and body during signing. In particular, we examine how the chin, forehead and torso move during the production of ASL signs at those three phonological locations. Our findings suggest that for signs with a lexical movement toward the head, the forehead and chin move to facilitate convergence with the hand. By comparison, the torso does not move to facilitate convergence with the hand for signs located at the torso. These results imply that the nonmanual articulators serve a phonetic as well as a grammatical or prosodic role in sign languages. Future models of sign phonetics and phonology should take into consideration the movements of the nonmanual articulators in the realization of signs. © 2016 S. Karger AG, Basel.

  7. On the Conventionalization of Mouth Actions in Australian Sign Language.

    PubMed

    Johnston, Trevor; van Roekel, Jane; Schembri, Adam

    2016-03-01

    This study investigates the conventionalization of mouth actions in Australian Sign Language. Signed languages were once thought of as simply manual languages because the hands produce the signs which individually and in groups are the symbolic units most easily equated with the words, phrases and clauses of spoken languages. However, it has long been acknowledged that non-manual activity, such as movements of the body, head and the face play a very important role. In this context, mouth actions that occur while communicating in signed languages have posed a number of questions for linguists: are the silent mouthings of spoken language words simply borrowings from the respective majority community spoken language(s)? Are those mouth actions that are not silent mouthings of spoken words conventionalized linguistic units proper to each signed language, culturally linked semi-conventional gestural units shared by signers with members of the majority speaking community, or even gestures and expressions common to all humans? We use a corpus-based approach to gather evidence of the extent of the use of mouth actions in naturalistic Australian Sign Language, making comparisons with other signed languages where data is available, and the form/meaning pairings that these mouth actions instantiate.

  8. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed

    Hofer, Theresia

    2017-05-24

    This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork, in 2007 and 2014, with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model to assess signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between "severely" and "definitely" endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed and certain additions, not considered within the existing assessment model, suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use "pure Tibetan language".

  9. American Sign Language Curricula: A Review

    ERIC Educational Resources Information Center

    Rosen, Russell S.

    2010-01-01

    There has been exponential growth in the number of schools that offer American Sign Language (ASL) for foreign language credit and in the number of ASL curricula that have been published. This study analyzes different curricula in their assumptions regarding language, learning, and the teaching of second languages. It is found that curricula vary in their…

  10. Validity of the American Sign Language Discrimination Test

    ERIC Educational Resources Information Center

    Bochner, Joseph H.; Samar, Vincent J.; Hauser, Peter C.; Garrison, Wayne M.; Searls, J. Matt; Sanders, Cynthia A.

    2016-01-01

    American Sign Language (ASL) is one of the most commonly taught languages in North America. Yet, few assessment instruments for ASL proficiency have been developed, none of which have adequately demonstrated validity. We propose that the American Sign Language Discrimination Test (ASL-DT), a recently developed measure of learners' ability to…

  11. The Mechanics of Fingerspelling: Analyzing Ethiopian Sign Language

    ERIC Educational Resources Information Center

    Duarte, Kyle

    2010-01-01

    Ethiopian Sign Language utilizes a fingerspelling system that represents Amharic orthography. Just as each character of the Amharic abugida encodes a consonant-vowel sound pair, each sign in the Ethiopian Sign Language fingerspelling system uses handshape to encode a base consonant, as well as a combination of timing, placement, and orientation to…

  12. Sign Language and the Brain: A Review

    ERIC Educational Resources Information Center

    Campbell, Ruth; MacSweeney, Mairead; Waters, Dafydd

    2008-01-01

    How are signed languages processed by the brain? This review briefly outlines some basic principles of brain structure and function and the methodological principles and techniques that have been used to investigate this question. We then summarize a number of different studies exploring brain activity associated with sign language processing…

  13. Neural networks for sign language translation

    NASA Astrophysics Data System (ADS)

    Wilson, Beth J.; Anspach, Gretel

    1993-09-01

    A neural network is used to extract relevant features of sign language from video images of a person communicating in American Sign Language or Signed English. The key features are hand motion, hand location with respect to the body, and handshape. A modular hybrid design is under way to apply various techniques, including neural networks, in the development of a translation system that will facilitate communication between deaf and hearing people. One of the neural networks described here is used to classify video images of handshapes into their linguistic counterpart in American Sign Language. The video image is preprocessed to yield Fourier descriptors that encode the shape of the hand silhouette. These descriptors are then used as inputs to a neural network that classifies their shapes. The network is trained with various examples from different signers and is tested with new images from new signers. The results have shown that for coarse handshape classes, the network is invariant to the type of camera used to film the various signers and to the segmentation technique.
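
    Below is a minimal sketch of the pipeline the abstract describes: Fourier descriptors computed from the hand-silhouette contour serve as inputs to a small neural-network handshape classifier. The normalization scheme, descriptor count, and network size are illustrative assumptions rather than the authors' implementation.

    ```python
    # Sketch: Fourier descriptors of a hand silhouette feed a small neural-network
    # handshape classifier. Descriptor count and network size are hypothetical.
    import cv2
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    NUM_DESCRIPTORS = 16  # low-order coefficients capture the coarse hand shape

    def fourier_descriptors(binary_silhouette, n=NUM_DESCRIPTORS):
        """Return translation-, scale- and rotation-invariant Fourier descriptors."""
        contours, _ = cv2.findContours(binary_silhouette, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        contour = max(contours, key=cv2.contourArea).squeeze()
        z = contour[:, 0] + 1j * contour[:, 1]       # contour as a complex signal
        coeffs = np.fft.fft(z)
        coeffs[0] = 0                                # drop DC term -> translation invariance
        mags = np.abs(coeffs)                        # drop phase -> rotation/start-point invariance
        mags /= mags[1] if mags[1] != 0 else 1.0     # normalize by first harmonic -> scale invariance
        return mags[1:n + 1]

    def train_handshape_classifier(silhouettes, labels):
        """silhouettes: list of binary uint8 images; labels: handshape class names."""
        features = np.array([fourier_descriptors(s) for s in silhouettes])
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
        clf.fit(features, labels)
        return clf
    ```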

  14. Equity in Education: Signed Language and the Courts

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2009-01-01

    This article examines several legal cases in Canada, the USA, and Australia involving signed language in education for Deaf students. In all three contexts, signed language rights for Deaf students have been viewed from within a disability legislation framework that either does not extend to recognizing language rights in education or that…

  15. Neural systems underlying lexical retrieval for sign language.

    PubMed

    Emmorey, Karen; Grabowski, Thomas; McCullough, Stephen; Damasio, Hanna; Ponto, Laura L B; Hichwa, Richard D; Bellugi, Ursula

    2003-01-01

    Positron emission tomography was used to investigate whether signed languages exhibit the same neural organization for lexical retrieval within classical and non-classical language areas as has been described for spoken English. Ten deaf native American sign language (ASL) signers were shown pictures of unique entities (famous persons) and non-unique entities (animals) and were asked to name each stimulus with an overt signed response. Proper name signed responses to famous people were fingerspelled, and common noun responses to animals were both fingerspelled and signed with native ASL signs. In general, retrieving ASL signs activated neural sites similar to those activated by hearing subjects retrieving English words. Naming famous persons activated the left temporal pole (TP), whereas naming animals (whether fingerspelled or signed) activated left inferotemporal (IT) cortex. The retrieval of fingerspelled and native signs generally engaged the same cortical regions, but fingerspelled signs in addition activated a premotor region, perhaps due to the increased motor planning and sequencing demanded by fingerspelling. Native signs activated portions of the left supramarginal gyrus (SMG), an area previously implicated in the retrieval of phonological features of ASL signs. Overall, the findings indicate that similar neuroanatomical areas are involved in lexical retrieval for both signs and words. Copyright 2003 Elsevier Science Ltd.

  16. A human mirror neuron system for language: Perspectives from signed languages of the deaf.

    PubMed

    Knapp, Heather Patterson; Corina, David P

    2010-01-01

    Language is proposed to have developed atop the human analog of the macaque mirror neuron system for action perception and production [Arbib M.A. 2005. From monkey-like action recognition to human language: An evolutionary framework for neurolinguistics (with commentaries and author's response). Behavioral and Brain Sciences, 28, 105-167; Arbib M.A. (2008). From grasp to language: Embodied concepts and the challenge of abstraction. Journal de Physiologie Paris 102, 4-20]. Signed languages of the deaf are fully-expressive, natural human languages that are perceived visually and produced manually. We suggest that if a unitary mirror neuron system mediates the observation and production of both language and non-linguistic action, three predictions can be made: (1) damage to the human mirror neuron system should non-selectively disrupt both sign language and non-linguistic action processing; (2) within the domain of sign language, a given mirror neuron locus should mediate both perception and production; and (3) the action-based tuning curves of individual mirror neurons should support the highly circumscribed set of motions that form the "vocabulary of action" for signed languages. In this review we evaluate data from the sign language and mirror neuron literatures and find that these predictions are only partially upheld. © 2009 Elsevier Inc. All rights reserved.

  17. [Information technology in learning sign language].

    PubMed

    Hernández, Cesar; Pulido, Jose L; Arias, Jorge E

    2015-01-01

    To develop a technological tool that improves the initial learning of sign language in hearing-impaired children. The research was conducted in three phases: requirements gathering, design and development of the proposed device, and validation and evaluation of the device. Through the use of information technology and with the advice of special education professionals, we developed an electronic device that facilitates the learning of sign language in deaf children. It consists mainly of a graphic touch screen, a voice synthesizer, and a voice recognition system. Validation was performed with deaf children at the Filadelfia School in Bogotá. A learning methodology was established that improves learning times through a small, portable, lightweight educational technological prototype. Tests showed the effectiveness of this prototype, achieving a 32% reduction in the initial learning time for sign language in deaf children.

  18. Is Lhasa Tibetan Sign Language emerging, endangered, or both?

    PubMed Central

    Hofer, Theresia

    2017-01-01

    This article offers the first overview of the recent emergence of Tibetan Sign Language (TibSL) in Lhasa, capital of the Tibet Autonomous Region (TAR), China. Drawing on short anthropological fieldwork, in 2007 and 2014, with people and organisations involved in the formalisation and promotion of TibSL, the author discusses her findings within the nine-fold UNESCO model for assessing linguistic vitality and endangerment. She follows the adaptation of this model to assess signed languages by the Institute of Sign Languages and Deaf Studies (iSLanDS) at the University of Central Lancashire. The appraisal shows that TibSL appears to be between “severely” and “definitely” endangered, adding to the extant studies on the widespread phenomenon of sign language endangerment. Possible future influences and developments regarding the vitality and use of TibSL in Central Tibet and across the Tibetan plateau are then discussed and certain additions, not considered within the existing assessment model, suggested. In concluding, the article places the situation of TibSL within the wider circumstances of minority (sign) languages in China, Chinese Sign Language (CSL), and the post-2008 movement to promote and use “pure Tibetan language”. PMID:29033477

  19. Historical Development of Hong Kong Sign Language

    ERIC Educational Resources Information Center

    Sze, Felix; Lo, Connie; Lo, Lisa; Chu, Kenny

    2013-01-01

    This article traces the origins of Hong Kong Sign Language (hereafter HKSL) and its subsequent development in relation to the establishment of Deaf education in Hong Kong after World War II. We begin with a detailed description of the history of Deaf education with a particular focus on the role of sign language in such development. We then…

  20. Pinky Extension as a Phonestheme in Mongolian Sign Language

    ERIC Educational Resources Information Center

    Healy, Christina

    2011-01-01

    Mongolian Sign Language (MSL) is a visual-gestural language that developed from multiple languages interacting as a result of both geographic proximity and political relations and of the natural development of a communication system by deaf community members. Similar to the phonological systems of other signed languages, MSL combines handshapes,…

  1. A Kinect-Based Sign Language Hand Gesture Recognition System for Hearing- and Speech-Impaired: A Pilot Study of Pakistani Sign Language.

    PubMed

    Halim, Zahid; Abbas, Ghulam

    2015-01-01

    Sign language provides hearing- and speech-impaired individuals with an interface for communicating with other members of society. Unfortunately, sign language is not understood by most hearing people, so a device based on image processing and pattern recognition can provide vital aid for detecting sign language and translating it into a vocal language. This work presents a system that detects and interprets sign language gestures with a custom-built software tool and then translates each gesture into a vocal language. To recognize a particular gesture, the system employs a Dynamic Time Warping (DTW) algorithm, and an off-the-shelf software tool is used for vocal language generation. A Microsoft® Kinect is the primary sensor used to capture the video stream of a user. The proposed method successfully detects gestures stored in the dictionary with an accuracy of 91%, and the system allows custom gestures to be defined and added. In an experiment in which 10 individuals with impairments used the system to communicate with 5 people with no disability, 87% agreed that the system was useful.
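
    A minimal sketch of the Dynamic Time Warping computation at the core of such a system follows. The feature representation (one vector of Kinect joint coordinates per frame) and the nearest-template classification rule are assumptions for illustration.

    ```python
    # Sketch of DTW-based gesture matching: align a query sequence against stored
    # gesture templates and return the label of the closest one.
    import numpy as np

    def dtw_distance(seq_a, seq_b):
        """DTW distance between two gesture sequences of shape (frames, features)."""
        n, m = len(seq_a), len(seq_b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])   # frame-to-frame distance
                cost[i, j] = d + min(cost[i - 1, j],               # insertion
                                     cost[i, j - 1],               # deletion
                                     cost[i - 1, j - 1])           # match
        return cost[n, m]

    def recognize(gesture, dictionary):
        """Return the label of the stored gesture template closest to the input."""
        return min(dictionary, key=lambda label: dtw_distance(gesture, dictionary[label]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        dictionary = {"HELLO": rng.normal(size=(40, 60)),    # 40 frames x 20 joints x 3 coords
                      "THANKS": rng.normal(size=(55, 60))}
        query = dictionary["HELLO"] + rng.normal(scale=0.05, size=(40, 60))
        print(recognize(query, dictionary))
    ```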

  2. American Sign Language Comprehension Test: A Tool for Sign Language Researchers

    ERIC Educational Resources Information Center

    Hauser, Peter C.; Paludneviciene, Raylene; Riddle, Wanda; Kurz, Kim B.; Emmorey, Karen; Contreras, Jessica

    2016-01-01

    The American Sign Language Comprehension Test (ASL-CT) is a 30-item multiple-choice test that measures ASL receptive skills and is administered through a website. This article describes the development and psychometric properties of the test based on a sample of 80 college students including deaf native signers, hearing native signers, deaf…

  3. Sign Language with Babies: What Difference Does It Make?

    ERIC Educational Resources Information Center

    Barnes, Susan Kubic

    2010-01-01

    Teaching sign language, whether to deaf or other children with special needs or to hearing children with hard-of-hearing family members, is not new. Teaching sign language to typically developing children has become increasingly popular since the publication of "Baby Signs"® (Goodwyn & Acredolo, 1996), now in its third edition. Attention to signing with…

  4. When does Iconicity in Sign Language Matter?

    PubMed Central

    Baus, Cristina; Carreiras, Manuel; Emmorey, Karen

    2012-01-01

    We examined whether iconicity in American Sign Language (ASL) enhances translation performance for new learners and proficient signers. Fifteen hearing nonsigners and 15 proficient ASL-English bilinguals performed a translation recognition task and a production translation task. Nonsigners were taught 28 ASL verbs (14 iconic; 14 non-iconic) prior to performing these tasks. Only new learners benefited from sign iconicity, recognizing iconic translations faster and more accurately and exhibiting faster forward (English-ASL) and backward (ASL-English) translation times for iconic signs. In contrast, proficient ASL-English bilinguals exhibited slower recognition and translation times for iconic signs. We suggest iconicity aids memorization in the early stages of adult sign language learning, but for fluent L2 signers, iconicity interacts with other variables that slow translation (specifically, the iconic signs had more translation equivalents than the non-iconic signs). Iconicity may also have slowed translation performance by forcing conceptual mediation for iconic signs, which is slower than translating via direct lexical links. PMID:23543899

  5. Workplace Concepts in Sign and Text. A Computerized Sign Language Dictionary.

    ERIC Educational Resources Information Center

    Western Pennsylvania School for the Deaf, Pittsburgh.

    This document is a dictionary of essential vocabulary, signs, and illustrations of workplace activities to be used to train deaf or hearing-impaired adults. It contains more than 500 entries with workplace-relevant vocabulary, each including an illustration of the signed word or phrase in American Sign Language, a description of how to make the…

  6. Brain correlates of constituent structure in sign language comprehension.

    PubMed

    Moreno, Antonio; Limousin, Fanny; Dehaene, Stanislas; Pallier, Christophe

    2018-02-15

    During sentence processing, areas of the left superior temporal sulcus, inferior frontal gyrus and left basal ganglia exhibit a systematic increase in brain activity as a function of constituent size, suggesting their involvement in the computation of syntactic and semantic structures. Here, we asked whether these areas play a universal role in language and therefore contribute to the processing of non-spoken sign language. Congenitally deaf adults who acquired French sign language as a first language and written French as a second language were scanned while watching sequences of signs in which the size of syntactic constituents was manipulated. An effect of constituent size was found in the basal ganglia, including the head of the caudate and the putamen. A smaller effect was also detected in temporal and frontal regions previously shown to be sensitive to constituent size in written language in hearing French subjects (Pallier et al., 2011). When the deaf participants read sentences versus word lists, the same network of language areas was observed. While reading and sign language processing yielded identical effects of linguistic structure in the basal ganglia, the effect of structure was stronger in all cortical language areas for written language relative to sign language. Furthermore, cortical activity was partially modulated by age of acquisition and reading proficiency. Our results stress the important role of the basal ganglia, within the language network, in the representation of the constituent structure of language, regardless of the input modality. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. LSE-Sign: A lexical database for Spanish Sign Language.

    PubMed

    Gutierrez-Sigut, Eva; Costello, Brendan; Baus, Cristina; Carreiras, Manuel

    2016-03-01

    The LSE-Sign database is a free online tool for selecting Spanish Sign Language stimulus materials to be used in experiments. It contains 2,400 individual signs taken from a recent standardized LSE dictionary, and a further 2,700 related nonsigns. Each entry is coded for a wide range of grammatical, phonological, and articulatory information, including handshape, location, movement, and non-manual elements. The database is accessible via a graphically based search facility which is highly flexible both in terms of the search options available and the way the results are displayed. LSE-Sign is available at the following website: http://www.bcbl.eu/databases/lse/.
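
    As a hypothetical illustration of how such a database can support stimulus selection, the sketch below filters an exported sign list by phonological parameters. The column names and file name are assumptions; the actual LSE-Sign search interface and export format may differ.

    ```python
    # Hypothetical stimulus-selection sketch over an exported lexical database.
    # Column names ("gloss", "handshape", "location", "is_nonsign") are assumed.
    import pandas as pd

    def select_stimuli(csv_path, handshape, location, include_nonsigns=False):
        """Return entries matching the requested handshape and location."""
        entries = pd.read_csv(csv_path)
        mask = (entries["handshape"] == handshape) & (entries["location"] == location)
        if not include_nonsigns:
            mask &= ~entries["is_nonsign"]   # assumes a boolean nonsign flag
        return entries[mask]

    if __name__ == "__main__":
        stimuli = select_stimuli("lse_sign_export.csv", handshape="5", location="chin")
        print(stimuli[["gloss", "handshape", "location"]].head())
    ```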

  8. Sign Vocabulary in Deaf Toddlers Exposed to Sign Language since Birth

    ERIC Educational Resources Information Center

    Rinaldi, Pasquale; Caselli, Maria Cristina; Di Renzo, Alessio; Gulli, Tiziana; Volterra, Virginia

    2014-01-01

    Lexical comprehension and production is directly evaluated for the first time in deaf signing children below the age of 3 years. A Picture Naming Task was administered to 8 deaf signing toddlers (aged 2-3 years) who were exposed to Sign Language since birth. Results were compared with data of hearing speaking controls. In both deaf and hearing…

  9. Sign language aphasia from a neurodegenerative disease.

    PubMed

    Falchook, Adam D; Mayberry, Rachel I; Poizner, Howard; Burtis, David Brandon; Doty, Leilani; Heilman, Kenneth M

    2013-01-01

    While Alois Alzheimer recognized the effects of the disease he described on speech and language in his original description of the disease in 1907, the effects of Alzheimer's disease (AD) on language in deaf signers has not previously been reported. We evaluated a 55-year-old right-handed congenitally deaf woman with a 2-year history of progressive memory loss and a deterioration of her ability to communicate in American Sign Language, which she learned at the age of eight. Examination revealed that she had impaired episodic memory as well as marked impairments in the production and comprehension of fingerspelling and grammatically complex sentences. She also had signs of anomia as well as an ideomotor apraxia and visual-spatial dysfunction. This report illustrates the challenges in evaluation of a patient for the presence of degenerative dementia when the person is deaf from birth, uses sign language, and has a late age of primary language acquisition. Although our patient could neither speak nor hear, in many respects her cognitive disorders mirror those of patients with AD who had normally learned to speak.

  10. Intercultural Orientations as Japanese Language Learners' Motivation in Mainland China

    ERIC Educational Resources Information Center

    Lv, Leining; Gao, Xuesong; Teo, Timothy

    2017-01-01

    This article reports on a study that investigated how 665 Japanese language learners, who had started learning Japanese at different times in the last 3 decades, had been motivated to learn Japanese in China. Analysis of the survey data revealed that the participants displayed similar intercultural orientations when learning Japanese despite the…

  11. Introduction: Sign Language, Sustainable Development, and Equal Opportunities

    ERIC Educational Resources Information Center

    De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Introduction: Sign Language, Sustainable Development, and Equal Opportunities" (De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck & P. V. Paul (Eds.) 2016). The idea of exploring various…

  12. The Sociolinguistics of Sign Languages.

    ERIC Educational Resources Information Center

    Lucas, Ceil, Ed.

    This collection of papers examines how sign languages are distributed around the world; what occurs when they come in contact with spoken and written languages, and how signers use them in a variety of situations. Each chapter introduces the key issues in a particular area of inquiry and provides a comprehensive review of the literature. The seven…

  13. Legal and Ethical Imperatives for Using Certified Sign Language Interpreters in Health Care Settings: How to "Do No Harm" When "It's (All) Greek" (Sign Language) to You.

    PubMed

    Nonaka, Angela M

    2016-09-01

    Communication obstacles in health care settings adversely impact patient-practitioner interactions by impeding service efficiency, reducing mutual trust and satisfaction, or even endangering health outcomes. When interlocutors are separated by language, interpreters are required. The efficacy of interpreting, however, is constrained not just by interpreters' competence but also by health care providers' facility working with interpreters. Deaf individuals whose preferred form of communication is a signed language often encounter communicative barriers in health care settings. In those environments, signing Deaf people are entitled to equal communicative access via sign language interpreting services according to the Americans with Disabilities Act and Executive Order 13166, the Limited English Proficiency Initiative. Yet, litigation in states across the United States suggests that individual and institutional providers remain uncertain about their legal obligations to provide equal communicative access. This article discusses the legal and ethical imperatives for using professionally certified (vs. ad hoc) sign language interpreters in health care settings. First outlining the legal terrain governing provision of sign language interpreting services, the article then describes different types of "sign language" (e.g., American Sign Language vs. manually coded English) and different forms of "sign language interpreting" (e.g., interpretation vs. transliteration vs. translation; simultaneous vs. consecutive interpreting; individual vs. team interpreting). This is followed by reviews of the formal credentialing process and of specialized forms of sign language interpreting-that is, certified deaf interpreting, trilingual interpreting, and court interpreting. After discussing practical steps for contracting professional sign language interpreters and addressing ethical issues of confidentiality, this article concludes by offering suggestions for working more effectively

  14. Discourses of prejudice in the professions: the case of sign languages

    PubMed Central

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-01-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected. PMID:28280057

  15. Mapping language to the world: the role of iconicity in the sign language input.

    PubMed

    Perniss, Pamela; Lu, Jenny C; Morgan, Gary; Vigliocco, Gabriella

    2018-03-01

    Most research on the mechanisms underlying referential mapping has assumed that learning occurs in ostensive contexts, where label and referent co-occur, and that form and meaning are linked by arbitrary convention alone. In the present study, we focus on iconicity in language, that is, resemblance relationships between form and meaning, and on non-ostensive contexts, where label and referent do not co-occur. We approach the question of language learning from the perspective of the language input. Specifically, we look at child-directed language (CDL) in British Sign Language (BSL), a language rich in iconicity due to the affordances of the visual modality. We ask whether child-directed signing exploits iconicity in the language by highlighting the similarity mapping between form and referent. We find that CDL modifications occur more often with iconic signs than with non-iconic signs. Crucially, for iconic signs, modifications are more frequent in non-ostensive contexts than in ostensive contexts. Furthermore, we find that pointing dominates in ostensive contexts, and suggest that caregivers adjust the semiotic resources recruited in CDL to context. These findings offer first evidence for a role of iconicity in the language input and suggest that iconicity may be involved in referential mapping and language learning, particularly in non-ostensive contexts. © 2017 John Wiley & Sons Ltd.

  16. Children creating language: how Nicaraguan sign language acquired a spatial grammar.

    PubMed

    Senghas, A; Coppola, M

    2001-07-01

    It has long been postulated that language is not purely learned, but arises from an interaction between environmental exposure and innate abilities. The innate component becomes more evident in rare situations in which the environment is markedly impoverished. The present study investigated the language production of a generation of deaf Nicaraguans who had not been exposed to a developed language. We examined the changing use of early linguistic structures (specifically, spatial modulations) in a sign language that has emerged since the Nicaraguan group first came together: in under two decades, sequential cohorts of learners systematized the grammar of this new sign language. We examined whether the systematicity being added to the language stems from children or adults: our results indicate that such changes originate in children aged 10 and younger. Thus, sequential cohorts of interacting young children collectively possess the capacity not only to learn, but also to create, language.

  17. Sign Language and Language Acquisition in Man and Ape. New Dimensions in Comparative Pedolinguistics.

    ERIC Educational Resources Information Center

    Peng, Fred C. C., Ed.

    A collection of research materials on sign language and primatology is presented here. The essays attempt to show that sign language is a legitimate language that can be learned not only by humans but also by nonhuman primates, and that nonhuman primates have the capability to acquire a human language using a different mode. The following…

  18. Acoustic Sources of Accent in Second Language Japanese Speech.

    PubMed

    Idemaru, Kaori; Wei, Peipei; Gubbins, Lucy

    2018-05-01

    This study reports an exploratory analysis of the acoustic characteristics of second language (L2) speech which give rise to the perception of a foreign accent. Japanese speech samples were collected from American English and Mandarin Chinese speakers (n = 16 in each group) studying Japanese. The L2 participants and native speakers (n = 10) provided speech samples modeled on six short sentences. Segmental (vowels and stops) and prosodic features (rhythm, tone, and fluency) were examined. Native Japanese listeners (n = 10) rated the samples with regard to degree of foreign accent. The analyses predicting accent ratings from the acoustic measurements indicated that one of the prosodic features in particular, tone (defined in this study as high and low patterns of pitch accent and intonation), plays an important role in robustly predicting accent ratings in L2 Japanese across the two first language (L1) backgrounds. These results were consistent with predictions based on phonological and phonetic comparisons between Japanese and English, as well as Japanese and Mandarin Chinese. The results also revealed L1-specific predictors of perceived accent in Japanese. The findings of this study contribute to the growing literature that examines sources of perceived foreign accent.
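    A hedged sketch of the kind of analysis the abstract describes: regressing listener accent ratings on segmental and prosodic acoustic measures. The feature names, the toy data, and the choice of ordinary least squares are illustrative assumptions, not the study's actual variables or model.

```python
# Illustrative only: predict accent ratings from five assumed acoustic measures.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_speakers = 32
# Hypothetical per-speaker measures: vowel accuracy, VOT, rhythm, tone, fluency
X = rng.normal(size=(n_speakers, 5))
# Toy ratings in which the "tone" measure dominates, mirroring the reported finding
ratings = 3 + 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n_speakers)

model = LinearRegression().fit(X, ratings)
print(dict(zip(["vowel", "vot", "rhythm", "tone", "fluency"],
               model.coef_.round(2))))
```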

  19. Compulsory "Foreign Language Activities" in Japanese Primary Schools

    ERIC Educational Resources Information Center

    Hashimoto, Kayoko

    2011-01-01

    From 2011, the new curriculum for introducing English to Japanese primary schools will be fully implemented in the form of "foreign language activities". This innovation forms part of the government's plan to cultivate "Japanese with English abilities", a development based on the awareness, particularly in the business sector,…

  20. The emergence of temporal language in Nicaraguan Sign Language

    PubMed Central

    Kocab, Annemarie; Senghas, Ann; Snedeker, Jesse

    2016-01-01

    Understanding what uniquely human properties account for the creation and transmission of language has been a central goal of cognitive science. Recently, the study of emerging sign languages, such as Nicaraguan Sign Language (NSL), has offered the opportunity to better understand how languages are created and the roles of the individual learner and the community of users. Here, we examined the emergence of two types of temporal language in NSL, comparing the linguistic devices for conveying temporal information among three sequential age cohorts of signers. Experiment 1 showed that while all three cohorts of signers could communicate about linearly ordered discrete events, only the second and third generations of signers successfully communicated information about events with more complex temporal structure. Experiment 2 showed that signers could discriminate between the types of temporal events in a nonverbal task. Finally, Experiment 3 investigated the ordinal use of numbers (e.g., first, second) in NSL signers, indicating that one strategy younger signers might have for accurately describing events in time might be to use ordinal numbers to mark each event. While the capacity for representing temporal concepts appears to be present in the human mind from the onset of language creation, the linguistic devices to convey temporality do not appear immediately. Evidently, temporal language emerges over generations of language transmission, as a product of individual minds interacting within a community of users. PMID:27591549

  1. Language Policy in Japanese Ethnic Churches in Canada and the Legitimization of Church Member Identities

    ERIC Educational Resources Information Center

    Barrett, Tyler

    2017-01-01

    This paper is aimed at understanding the language policy of Japanese ethnic churches and the legitimization of church member identities in the midst of dominant languages in Canada. While church members often construct Japanese ethnic Christian churches with 'grassroots' language policies that seem to legitimize their Japanese language and…

  2. Deaf-And-Mute Sign Language Generation System

    NASA Astrophysics Data System (ADS)

    Kawai, Hideo; Tamura, Shinichi

    1984-08-01

    We have developed a system which can recognize speech and generate the corresponding animation-like sign language sequence. The system is implemented on a popular personal computer. It has three video RAMs and a voice recognition board which can recognize only the registered voice of a specific speaker. Presently, forty sign language patterns and fifty finger spellings are stored on two floppy disks. Each sign pattern is composed of one to four sub-patterns. That is, if the pattern is composed of one sub-pattern, it is displayed as a still pattern; otherwise, it is displayed as a motion pattern. This system will help communication between deaf-and-mute persons and hearing persons. In order to display at high speed, most programs are written in machine language.

  3. Pointing and Reference in Sign Language and Spoken Language: Anchoring vs. Identifying

    ERIC Educational Resources Information Center

    Barberà, Gemma; Zwets, Martine

    2013-01-01

    In both signed and spoken languages, pointing serves to direct an addressee's attention to a particular entity. This entity may be either present or absent in the physical context of the conversation. In this article we focus on pointing directed to nonspeaker/nonaddressee referents in Sign Language of the Netherlands (Nederlandse Gebarentaal,…

  4. "Those Anime Students": Foreign Language Literacy Development through Japanese Popular Culture

    ERIC Educational Resources Information Center

    Fukunaga, Natsuki

    2006-01-01

    Using multiliteracies and sociocultural perspectives on language and literacy learning, this article describes three Japanese as a foreign language (JFL) students' literacy development through involvement with Japanese popular culture. As part of a larger qualitative ethnographic study, the author interviewed JFL learners who have a particular…

  5. Discriminative exemplar coding for sign language recognition with Kinect.

    PubMed

    Sun, Chao; Zhang, Tianzhu; Bao, Bing-Kun; Xu, Changsheng; Mei, Tao

    2013-10-01

    Sign language recognition is a growing research area in the field of computer vision. A challenge within it is to model various signs, which vary in time resolution, visual manual appearance, and so on. In this paper, we propose a discriminative exemplar coding (DEC) approach, utilizing the Kinect sensor, to model various signs. The proposed DEC method can be summarized in three steps. First, a set of class-specific candidate exemplars is learned from the sign language videos in each sign category by considering their discrimination. Then, every video of every sign is described as a set of similarities between its frames and the candidate exemplars. Instead of simply using a heuristic distance measure, the similarities are determined by a set of exemplar-based classifiers through multiple instance learning, in which a positive (or negative) video is treated as a positive (or negative) bag and frames similar to the given exemplar in Euclidean space are treated as instances. Finally, we formulate the selection of the most discriminative exemplars into a framework that simultaneously produces a sign video classifier to recognize signs. To evaluate our method, we collected an American Sign Language dataset, which includes approximately 2000 phrases, each captured by the Kinect sensor with color, depth, and skeleton information. Experimental results on our dataset demonstrate the feasibility and effectiveness of the proposed approach for sign language recognition.
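    A minimal sketch of the exemplar-coding idea described above: each video is represented by its frames' similarities to a bank of candidate exemplars, and a plain linear classifier stands in for the paper's multiple-instance learning and exemplar-selection machinery. The feature dimension, the distance-to-similarity mapping, and the classifier are assumptions for illustration, not the authors' exact DEC pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC

def exemplar_code(video_frames, exemplars):
    """video_frames: (n_frames, d) per-frame features (e.g. skeleton joints) of one video.
    exemplars: (n_exemplars, d) candidate exemplar frames.
    Returns an (n_exemplars,) descriptor: the best similarity of any frame to each exemplar."""
    dists = np.linalg.norm(video_frames[:, None, :] - exemplars[None, :, :], axis=2)
    sims = np.exp(-dists)          # simple distance-to-similarity mapping (assumption)
    return sims.max(axis=0)        # most similar frame per exemplar

# Toy usage: 3 videos, 2 sign classes, 5 exemplars of dimension 60
rng = np.random.default_rng(0)
exemplars = rng.normal(size=(5, 60))
videos = [rng.normal(size=(rng.integers(20, 40), 60)) for _ in range(3)]
X = np.stack([exemplar_code(v, exemplars) for v in videos])
y = np.array([0, 1, 0])
clf = LinearSVC().fit(X, y)        # stand-in for the exemplar-selection / MIL step
```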

  6. New Perspectives on the History of American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily; Delaporte, Yves

    2011-01-01

    Examinations of the etymology of American Sign Language have typically involved superficial analyses of signs as they exist over a short period of time. While it is widely known that ASL is related to French Sign Language, there has yet to be a comprehensive study of this historic relationship between their lexicons. This article presents…

  7. Contributions of the Study of Japanese as a Second language to our General Understanding of Second Language Acquisition and the Definition of Second Language Acquisition Research.

    ERIC Educational Resources Information Center

    Wakabayashi, Shigenori

    2003-01-01

    Reviews three books on the acquisition of Japanese as a second language: "Second Language Acquisition Process in the Classroom" by A.S. Ohta;"The Acquisition of Grammar by Learners of Japanese" (English translation of title), by H. Noda, K. Sakoda, K. Shibuya, and N. Kobayashi; and "The Acquisition of Japanese as a Second Language," B. K. Kanno,…

  8. Deficits in narrative abilities in child British Sign Language users with specific language impairment.

    PubMed

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal intelligence. Children were asked to generate a narrative based on events in a language free video. Narratives were analysed for global structure, information content and local level grammatical devices, especially verb morphology. The language-impaired group produced shorter, less structured and grammatically simpler narratives than controls, with verb morphology particularly impaired. Despite major differences in how sign and spoken languages are articulated, narrative is shown to be a reliable marker of language impairment across the modality boundaries. © 2014 Royal College of Speech and Language Therapists.

  9. ERP correlates of German Sign Language processing in deaf native signers.

    PubMed

    Hänel-Faulhaber, Barbara; Skotara, Nils; Kügow, Monique; Salden, Uta; Bottari, Davide; Röder, Brigitte

    2014-05-10

    The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language.

  10. ERP correlates of German Sign Language processing in deaf native signers

    PubMed Central

    2014-01-01

    Background The present study investigated the neural correlates of sign language processing of Deaf people who had learned German Sign Language (Deutsche Gebärdensprache, DGS) from their Deaf parents as their first language. Correct and incorrect signed sentences were presented sign by sign on a computer screen. At the end of each sentence the participants had to judge whether or not the sentence was an appropriate DGS sentence. Two types of violations were introduced: (1) semantically incorrect sentences containing a selectional restriction violation (implausible object); (2) morphosyntactically incorrect sentences containing a verb that was incorrectly inflected (i.e., incorrect direction of movement). Event-related brain potentials (ERPs) were recorded from 74 scalp electrodes. Results Semantic violations (implausible signs) elicited an N400 effect followed by a positivity. Sentences with a morphosyntactic violation (verb agreement violation) elicited a negativity followed by a broad centro-parietal positivity. Conclusions ERP correlates of semantic and morphosyntactic aspects of DGS clearly differed from each other and showed a number of similarities with those observed in other signed and oral languages. These data suggest a similar functional organization of signed and oral languages despite the visual-spatial modality of sign language. PMID:24884527

  11. Reading and American Sign Language: Strategies for Translation.

    ERIC Educational Resources Information Center

    Burkholder, Kim

    1999-01-01

    A hearing teacher for whom American Sign Language is a second language identifies nine strategies developed for reading and telling stories to deaf children. These include: ask obvious questions related to the story, portray written dialog as conversation, emphasize points by saying the same thing with different signs, and adapt the story to…

  12. The emergence of temporal language in Nicaraguan Sign Language.

    PubMed

    Kocab, Annemarie; Senghas, Ann; Snedeker, Jesse

    2016-11-01

    Understanding what uniquely human properties account for the creation and transmission of language has been a central goal of cognitive science. Recently, the study of emerging sign languages, such as Nicaraguan Sign Language (NSL), has offered the opportunity to better understand how languages are created and the roles of the individual learner and the community of users. Here, we examined the emergence of two types of temporal language in NSL, comparing the linguistic devices for conveying temporal information among three sequential age cohorts of signers. Experiment 1 showed that while all three cohorts of signers could communicate about linearly ordered discrete events, only the second and third generations of signers successfully communicated information about events with more complex temporal structure. Experiment 2 showed that signers could discriminate between the types of temporal events in a nonverbal task. Finally, Experiment 3 investigated the ordinal use of numbers (e.g., first, second) in NSL signers, indicating that one strategy younger signers might have for accurately describing events in time might be to use ordinal numbers to mark each event. While the capacity for representing temporal concepts appears to be present in the human mind from the onset of language creation, the linguistic devices to convey temporality do not appear immediately. Evidently, temporal language emerges over generations of language transmission, as a product of individual minds interacting within a community of users. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Deaf Life on Isolated Japanese Islands.

    ERIC Educational Resources Information Center

    Torigoe, Takashi; And Others

    1995-01-01

    Interviewed 38 adults with deafness and little schooling in Okinawa concerning their social and language environment. Many of the individuals used an indigenous gestural system shared with hearing people that enabled them to participate in the hearing community. Most had only limited contact with the deaf community and Japanese Sign Language.…

  14. Japanese Language School: Aid or Hindrance to the Americanization of Japanese Americans in Hawaii?

    ERIC Educational Resources Information Center

    Shoho, Alan R.

    A study examined the experiences of 60 Japanese immigrants to Hawaii (Niseis), aged 61-80, who attended Japanese-language schools as children. Using a case study oral history approach, the study gathered oral testimonies through semi-structured interviews. Historical documents were also used as primary sources of information about the schools.…

  15. Language Policies in Uruguay and Uruguayan Sign Language (LSU)

    ERIC Educational Resources Information Center

    Behares, Luis Ernesto; Brovetto, Claudia; Crespi, Leonardo Peluso

    2012-01-01

    In the first part of this article the authors consider the policies that apply to Uruguayan Sign Language (Lengua de Senas Uruguaya; hereafter LSU) and the Uruguayan Deaf community within the general framework of language policies in Uruguay. By analyzing them succinctly and as a whole, the authors then explain twenty-first-century innovations.…

  16. Discourses of prejudice in the professions: the case of sign languages.

    PubMed

    Humphries, Tom; Kushalnagar, Poorna; Mathur, Gaurav; Napoli, Donna Jo; Padden, Carol; Rathmann, Christian; Smith, Scott

    2017-09-01

    There is no evidence that learning a natural human language is cognitively harmful to children. To the contrary, multilingualism has been argued to be beneficial to all. Nevertheless, many professionals advise the parents of deaf children that their children should not learn a sign language during their early years, despite strong evidence across many research disciplines that sign languages are natural human languages. Their recommendations are based on a combination of misperceptions about (1) the difficulty of learning a sign language, (2) the effects of bilingualism, and particularly bimodalism, (3) the bona fide status of languages that lack a written form, (4) the effects of a sign language on acquiring literacy, (5) the ability of technologies to address the needs of deaf children and (6) the effects that use of a sign language will have on family cohesion. We expose these misperceptions as based in prejudice and urge institutions involved in educating professionals concerned with the healthcare, raising and educating of deaf children to include appropriate information about first language acquisition and the importance of a sign language for deaf children. We further urge such professionals to advise the parents of deaf children properly, which means to strongly advise the introduction of a sign language as soon as hearing loss is detected. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  17. Legal Pathways to the Recognition of Sign Languages: A Comparison of the Catalan and Spanish Sign Language Acts

    ERIC Educational Resources Information Center

    Quer, Josep

    2012-01-01

    Despite being minority languages like many others, sign languages have traditionally remained absent from the agendas of policy makers and language planning and policies. In the past two decades, though, this situation has started to change at different paces and to different degrees in several countries. In this article, the author describes the…

  18. A Kinect based sign language recognition system using spatio-temporal features

    NASA Astrophysics Data System (ADS)

    Memiş, Abbas; Albayrak, Songül

    2013-12-01

    This paper presents a sign language recognition system that uses spatio-temporal features on RGB video images and depth maps for dynamic gestures of Turkish Sign Language. The proposed system uses motion differences and an accumulation approach for temporal gesture analysis. The motion accumulation method, an effective method for temporal-domain analysis of gestures, produces an accumulated motion image by combining the differences of successive video frames. Then, the 2D Discrete Cosine Transform (DCT) is applied to the accumulated motion images, and the temporal-domain features are transformed into the spatial domain. These processes are performed on RGB images and depth maps separately. The DCT coefficients that represent sign gestures are picked up via zigzag scanning and feature vectors are generated. Sign gestures are then recognized with a K-Nearest Neighbor classifier using the Manhattan distance. Performance of the proposed sign language recognition system is evaluated on a sign database that contains 1002 isolated dynamic signs belonging to 111 words of Turkish Sign Language (TSL) in three different categories. The proposed sign language recognition system achieves promising success rates.
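    The pipeline named above (frame differencing, motion accumulation, 2D DCT, zigzag selection of coefficients, nearest-neighbour classification with Manhattan distance) can be sketched roughly as follows. Frame sizes, the number of coefficients kept, and the simplified anti-diagonal stand-in for the zigzag scan are assumptions for illustration, not the authors' settings.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.neighbors import KNeighborsClassifier

def accumulated_motion_image(frames):
    """frames: (n, H, W) grayscale frames; sums absolute successive differences."""
    return np.abs(np.diff(frames.astype(float), axis=0)).sum(axis=0)

def zigzag(block, k):
    """Take the k lowest-frequency coefficients by scanning anti-diagonals
    (a simplified stand-in for the JPEG zigzag order)."""
    h, w = block.shape
    order = sorted(((i, j) for i in range(h) for j in range(w)),
                   key=lambda ij: (ij[0] + ij[1], ij[0]))
    return np.array([block[i, j] for i, j in order[:k]])

def gesture_features(frames, k=64):
    ami = accumulated_motion_image(frames)
    coeffs = dct(dct(ami, axis=0, norm='ortho'), axis=1, norm='ortho')  # 2D DCT
    return zigzag(coeffs, k)

# Toy usage: two gesture classes, 1-NN with Manhattan (cityblock) distance
rng = np.random.default_rng(0)
videos = [rng.random((30, 32, 32)) for _ in range(4)]
X = np.stack([gesture_features(v) for v in videos])
y = np.array([0, 0, 1, 1])
knn = KNeighborsClassifier(n_neighbors=1, metric='manhattan').fit(X, y)
```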

  19. The Verbal System of Catalan Sign Language (LSC)

    ERIC Educational Resources Information Center

    Morales-Lopez, Esperanza; Boldu-Menasanch, Rosa Maria; Alonso-Rodriguez, Jesus Amador; Gras-Ferrer, Victoria; Rodriguez-Gonzalez, Maria Angeles

    2005-01-01

    This article describes the predicative verbal system of Catalan Sign Language (LSC) as it is used by Deaf people in the province of Barcelona. We also present a historical perspective of the research on this topic, which provides insight into the changes that have taken place over the last few decades in sign language linguistics. The principal…

  20. The Birth and Rebirth of "Sign Language Studies"

    ERIC Educational Resources Information Center

    Armstrong, David F.

    2012-01-01

    As most readers of this journal are aware, "Sign Language Studies" ("SLS") served for many years as effectively the only serious scholarly outlet for work in the nascent field of sign language linguistics. Now reaching its 40th anniversary, the journal was founded by William C. Stokoe and then edited by him for the first quarter century of its…

  1. Signs of Resistance: Peer Learning of Sign Languages within "Oral" Schools for the Deaf

    ERIC Educational Resources Information Center

    Anglin-Jaffe, Hannah

    2013-01-01

    This article explores the role of the Deaf child as peer educator. In schools where sign languages were banned, Deaf children became the educators of their Deaf peers in a number of contexts worldwide. This paper analyses how this peer education of sign language worked in context by drawing on two examples from boarding schools for the deaf in…

  2. How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language

    PubMed Central

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Grabowski, Thomas J.

    2014-01-01

    To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American Sign Language (ASL) and English. Directly contrasting speech and sign production revealed greater activation in bilateral parietal cortex for signing, while speaking resulted in greater activation in bilateral superior temporal cortex (STC) and right frontal cortex, likely reflecting auditory feedback control. Surprisingly, the language production contrast revealed a relative increase in activation in bilateral occipital cortex for speaking. We speculate that greater activation in visual cortex for speaking may actually reflect cortical attenuation when signing, which functions to distinguish self-produced from externally generated visual input. Directly contrasting speech and sign comprehension revealed greater activation in bilateral STC for speech and greater activation in bilateral occipital-temporal cortex for sign. Sign comprehension, like sign production, engaged bilateral parietal cortex to a greater extent than spoken language. We hypothesize that posterior parietal activation in part reflects processing related to spatial classifier constructions in ASL and that anterior parietal activation may reflect covert imitation that functions as a predictive model during sign comprehension. The conjunction analysis for comprehension revealed that both speech and sign bilaterally engaged the inferior frontal gyrus (with more extensive activation on the left) and the superior temporal sulcus, suggesting an invariant bilateral perisylvian language system. We conclude that surface level differences between sign and spoken languages should not be dismissed and are critical for understanding the neurobiology of language.

  3. Sign language in dental education-A new nexus.

    PubMed

    Jones, T; Cumberbatch, K

    2017-08-14

    The introduction of the landmark mandatory teaching of sign language to undergraduate dental students at the University of the West Indies (UWI), Mona Campus in Kingston, Jamaica, to bridge the communication gap between dentists and their patients is reviewed. A review of over 90 Doctor of Dental Surgery and Doctor of Dental Medicine curricula in North America, the United Kingdom, parts of Europe and Australia showed no inclusion of sign language as a mandatory component. In Jamaica, the government's training school for dental auxiliaries served as the forerunner to the UWI's introduction of formal training in sign language in 2012. Outside of the UWI, a few dental schools offer sign language courses, but none has a mandatory programme like the one at the UWI. Dentists the world over have had to rely on interpreters to sign with their deaf patients. Deaf people in Jamaica have resented the fact that dentists cannot sign; they have felt insulted and go to the dentist only in emergency situations. The mandatory inclusion of sign language in the Undergraduate Dental Programme curriculum at The University of the West Indies, Mona Campus, sought to establish a direct communication channel to formally bridge this gap. The programme of two sign language courses and a direct clinical competency requirement was developed during the second year of the first cohort of the newly introduced undergraduate dental programme through a collaborative partnership between two faculties on the Mona Campus. The programme was introduced in 2012 in the third year of the 5-year undergraduate dental programme. To date, two cohorts have completed the programme, and the preliminary findings from an ongoing clinical study have shown a positive impact on dental care access and dental treatment for deaf patients at the UWI Mona Dental Polyclinic. The development of a direct communication channel between dental students and the deaf that has led to increased dental

  4. Standing Strong: Maloney Interdistrict Magnet School Japanese Language and Culture Program

    ERIC Educational Resources Information Center

    Haxhi, Jessica; Yamashita-Iverson, Kazumi

    2009-01-01

    Maloney Interdistrict Magnet School (MIMS) is the only elementary school in Waterbury that has a world language program and is one of only two elementary Japanese programs in Connecticut. In the past 15 years, more than 1500 students have participated in its Japanese Language and Culture (JLC) Program in grades Prekindergarten through 5th. The JLC…

  5. The Use of Anime in Teaching Japanese as a Foreign Language

    ERIC Educational Resources Information Center

    Han, Chan Yee; Ling, Wong Ngan

    2017-01-01

    The study of popular culture is now becoming an emerging research area within education. While many studies have confirmed that students' interest in anime has driven much of enrolment in Japanese language courses, the impact of using anime as a teaching tool has not been studied thoroughly in the teaching Japanese as a Foreign Language (JFL)…

  6. Sign language processing and the mirror neuron system.

    PubMed

    Corina, David P; Knapp, Heather

    2006-05-01

    In this paper we review evidence for frontal and parietal lobe involvement in sign language comprehension and production, and evaluate the extent to which these data can be interpreted within the context of a mirror neuron system for human action observation and execution. We present data from three literatures--aphasia, cortical stimulation, and functional neuroimaging. Generally, we find support for the idea that sign language comprehension and production can be viewed in the context of a broadly-construed frontal-parietal human action observation/execution system. However, sign language data cannot be fully accounted for under a strict interpretation of the mirror neuron system. Additionally, we raise a number of issues concerning the lack of specificity in current accounts of the human action observation/execution system.

  7. Directionality Effects in Simultaneous Language Interpreting: The Case of Sign Language Interpreters in the Netherlands

    ERIC Educational Resources Information Center

    van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of the Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives…

  8. The Effectiveness of a Japanese Language Course on Cross-Cultural Competence.

    ERIC Educational Resources Information Center

    Miyamoto, Yumi; Rasmussen, Roger

    1998-01-01

    A quasi-experimental research design was employed to measure the effectiveness of a Japanese language and culture course entitled "Japanese for the Business Community." The study analyzed seven cross-cultural competence attributes in interactions with Japanese people in business settings. (Auth/JL)

  9. Sign language indexation within the MPEG-7 framework

    NASA Astrophysics Data System (ADS)

    Zaharia, Titus; Preda, Marius; Preteux, Francoise J.

    1999-06-01

    In this paper, we address the issue of sign language indexation/recognition. Existing tools, like on-line Web dictionaries or other education-oriented applications, make exclusive use of textual annotations. However, keyword indexing schemes have strong limitations due to the ambiguity of natural language and to the huge effort needed to manually annotate a large amount of data. In order to overcome these drawbacks, we tackle the sign language indexation issue within the MPEG-7 framework and propose an approach based on linguistic properties and characteristics of sign language. The method developed introduces the concept of over-time-stable hand configurations instantiated on natural or synthetic prototypes. The prototypes are indexed by means of a shape descriptor which is defined as a translation, rotation and scale invariant Hough transform. A very compact representation is obtained by considering the Fourier transform of the Hough coefficients. This approach has been applied to two data sets consisting of 'Letters' and 'Words', respectively. The accuracy and robustness of the results are discussed, and a complete sign language description schema is proposed.
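    A rough sketch of the two stages named in the abstract: a line Hough transform over a segmented hand shape, followed by a Fourier transform of the Hough coefficients to obtain a compact descriptor that is insensitive to circular shifts. The contour extraction, the normalisation, and the number of coefficients kept are assumptions; the paper's exact invariance construction is not reproduced here.

```python
import numpy as np
from skimage.transform import hough_line

def hand_shape_descriptor(mask, n_coeffs=32):
    """mask: 2D boolean array of the segmented hand region."""
    edges = mask ^ np.roll(mask, 1, axis=0)            # crude contour approximation
    accumulator, angles, dists = hough_line(edges)     # standard line Hough transform
    profile = accumulator.sum(axis=0).astype(float)    # accumulated votes per angle
    spectrum = np.abs(np.fft.rfft(profile))            # magnitude spectrum: insensitive to
                                                       # circular shifts of the angle profile
    return spectrum[:n_coeffs] / (spectrum[0] + 1e-9)  # coarse scale normalisation (assumption)

# Toy usage with a synthetic rectangular "hand" mask
mask = np.zeros((64, 64), dtype=bool)
mask[20:44, 24:40] = True
descriptor = hand_shape_descriptor(mask)
```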

  10. Order of the major constituents in sign languages: implications for all language

    PubMed Central

    Napoli, Donna Jo; Sutton-Spence, Rachel

    2014-01-01

    A survey of reports of sign order from 42 sign languages leads to a handful of generalizations. Two accounts emerge, one amodal and the other modal. We argue that universal pressures are at work with respect to some generalizations, but that pressure from the visual modality is at work with respect to others. Together, these pressures conspire to make all sign languages order their major constituents SOV or SVO. This study leads us to the conclusion that the order of S with regard to verb phrase (VP) may be driven by sensorimotor system concerns that feed universal grammar. PMID:24860523

  11. "Thinking-for-Writing": A Prolegomenon on Writing Signed Languages.

    PubMed

    Rosen, Russell S; Hartman, Maria C; Wang, Ye

    2017-01-01

    In his article in this American Annals of the Deaf special issue that also includes the present article, Grushkin argues that the writing difficulties of many deaf and hard of hearing children result primarily from the orthographic nature of the writing system; he proposes a new system based on features found in signed languages. In response, the present authors review the literature on D/HH children's writing difficulties, outline the main percepts of and assumptions about writing signed languages, discuss "thinking-for-writing" as a process in developing writing skills, offer research designs to test the effectiveness of writing signed language systems, and provide strategies for adopting "thinking-for-writing" in education. They conclude that until empirical studies show that writing signed languages effectively reflects writers' "thinking-for-writing," the alphabetic orthographic system of English should still be used, and ways should be found to teach D/HH children to use English writing to express their thoughts.

  12. Flemish Sign Language Standardisation

    ERIC Educational Resources Information Center

    Van Herreweghe, Mieke; Vermeerbergen, Myriam

    2009-01-01

    In 1997, the Flemish Deaf community officially rejected standardisation of Flemish Sign Language. It was a bold choice, which at the time was not in line with some of the decisions taken in the neighbouring countries. In this article, we shall discuss the choices the Flemish Deaf community has made in this respect and explore why the Flemish Deaf…

  13. Sign Language Web Pages

    ERIC Educational Resources Information Center

    Fels, Deborah I.; Richards, Jan; Hardman, Jim; Lee, Daniel G.

    2006-01-01

    The World Wide Web has changed the way people interact. It has also become an important equalizer of information access for many social sectors. However, for many people, including some sign language users, Web accessing can be difficult. For some, it not only presents another barrier to overcome but has left them without cultural equality. The…

  14. Iconicity as a General Property of Language: Evidence from Spoken and Signed Languages

    PubMed Central

    Perniss, Pamela; Thompson, Robin L.; Vigliocco, Gabriella

    2010-01-01

    Current views about language are dominated by the idea of arbitrary connections between linguistic form and meaning. However, if we look beyond the more familiar Indo-European languages and also include both spoken and signed language modalities, we find that motivated, iconic form-meaning mappings are, in fact, pervasive in language. In this paper, we review the different types of iconic mappings that characterize languages in both modalities, including the predominantly visually iconic mappings found in signed languages. Having shown that iconic mappings are present across languages, we then proceed to review evidence showing that language users (signers and speakers) exploit iconicity in language processing and language acquisition. While not discounting the presence and importance of arbitrariness in language, we put forward the idea that iconicity also needs to be recognized as a general property of language, which may serve the function of reducing the gap between linguistic form and conceptual representation to allow the language system to “hook up” to motor, perceptual, and affective experience. PMID:21833282

  15. Towards a Transcription System of Sign Language for 3D Virtual Agents

    NASA Astrophysics Data System (ADS)

    Do Amaral, Wanessa Machado; de Martino, José Mario

    Accessibility is a growing concern in computer science. Since virtual information is mostly presented visually, it may seem that access for deaf people is not an issue. However, for prelingually deaf individuals, those who have been deaf since before acquiring and formally learning a language, written information is often less accessible than information presented in signing. Further, for this community, signing is their language of choice, and reading text in a spoken language is akin to using a foreign language. Sign language uses gestures and facial expressions and is widely used by deaf communities. To enable efficient production of signed content in virtual environments, it is necessary to make written records of signs. Transcription systems have been developed to describe sign languages in written form, but these systems have limitations. Since they were not originally designed with computer animation in mind, in general, the recognition and reproduction of signs in these systems is an easy task only for those who know the system deeply. The aim of this work is to develop a transcription system to provide signed content in virtual environments. To animate a virtual avatar, a transcription system requires sufficiently explicit information, such as movement speed, sign concatenation, the sequence of each hold and movement, and facial expressions, so that articulation comes close to reality. Although many important studies of sign languages have been published, the transcription problem remains a challenge. Thus, a notation to describe, store and play signed content in virtual environments offers a multidisciplinary study and research tool, which may help linguistic studies to understand sign language structure and grammar.
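    As an illustration of the kind of explicit, animation-oriented information such a notation must carry, the following hypothetical record structure captures hold-and-movement sequences, relative speed, non-manual features, and concatenation timing. All field names are invented for this sketch and are not taken from the system described above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SignSegment:
    handshape: str            # handshape label (e.g. a HamNoSys-like code)
    location: str             # place of articulation on or near the body
    movement: str             # movement type: straight, arc, circular, or hold
    speed: float = 1.0        # relative playback speed for the animation
    nonmanual: str = ""       # facial expression / mouthing tier

@dataclass
class SignEntry:
    gloss: str
    segments: List[SignSegment] = field(default_factory=list)  # hold-and-movement sequence
    transition_to_next: float = 0.1  # blending time (seconds) when concatenating signs

# Toy usage: a two-segment sign with a final hold
entry = SignEntry("HOUSE", [SignSegment("flat", "chest", "arc"),
                            SignSegment("flat", "chest", "hold", speed=0.5)])
```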

  16. Event representations constrain the structure of language: Sign language as a window into universally accessible linguistic biases.

    PubMed

    Strickland, Brent; Geraci, Carlo; Chemla, Emmanuel; Schlenker, Philippe; Kelepir, Meltem; Pfau, Roland

    2015-05-12

    According to a theoretical tradition dating back to Aristotle, verbs can be classified into two broad categories. Telic verbs (e.g., "decide," "sell," "die") encode a logical endpoint, whereas atelic verbs (e.g., "think," "negotiate," "run") do not, and the denoted event could therefore logically continue indefinitely. Here we show that sign languages encode telicity in a seemingly universal way and moreover that even nonsigners lacking any prior experience with sign language understand these encodings. In experiments 1-5, nonsigning English speakers accurately distinguished between telic (e.g., "decide") and atelic (e.g., "think") signs from (the historically unrelated) Italian Sign Language, Sign Language of the Netherlands, and Turkish Sign Language. These results were not due to participants' inferring that the sign merely imitated the action in question. In experiment 6, we used pseudosigns to show that the presence of a salient visual boundary at the end of a gesture was sufficient to elicit telic interpretations, whereas repeated movement without salient boundaries elicited atelic interpretations. Experiments 7-10 confirmed that these visual cues were used by all of the sign languages studied here. Together, these results suggest that signers and nonsigners share universally accessible notions of telicity as well as universally accessible "mapping biases" between telicity and visual form.

  17. Standardizing Chinese Sign Language for Use in Post-Secondary Education

    ERIC Educational Resources Information Center

    Lin, Christina Mien-Chun; Gerner de Garcia, Barbara; Chen-Pichler, Deborah

    2009-01-01

    There are over 100 languages in China, including Chinese Sign Language. Given the large population and geographical dispersion of the country's deaf community, sign variation is to be expected. Language barriers due to lexical variation may exist for deaf college students in China, who often live outside their home regions. In presenting an…

  18. Regional Sign Language Varieties in Contact: Investigating Patterns of Accommodation

    ERIC Educational Resources Information Center

    Stamp, Rose; Schembri, Adam; Evans, Bronwen G.; Cormier, Kearsy

    2016-01-01

    Short-term linguistic accommodation has been observed in a number of spoken language studies. The first of its kind in sign language research, this study aims to investigate the effects of regional varieties in contact and lexical accommodation in British Sign Language (BSL). Twenty-five participants were recruited from Belfast, Glasgow,…

  19. Choosing Accommodations: Signed Language Interpreting and the Absence of Choice.

    PubMed

    Burke, Teresa Blankmeyer

    This paper carves out a topic space for discussion about the ethical question of whether input from signing Deaf consumers of interpreting services ought to be included in the provision of signed language interpreter accommodations. The first section provides background about disability accommodations and practices, including how signed language interpreting accommodations are similar and dissimilar to other kinds of disability accommodations. In the second section, I offer a personal narrative of my experience as a Deaf academic who has been excluded from the interpreter selection process, highlighting some of the harmful consequences of such exclusion. In the subsequent two sections, I describe and analyze the process of choosing interpreter accommodations, starting with the process of requesting signed language interpreters and the institutionalization of this process, followed by a brief overview of privacy and autonomy concerns from the standpoint of the signing Deaf consumer. The penultimate section considers some objections to the proposal of involving more consumer choice in signed language accommodations. I conclude the paper with some concrete suggestions for a more Deaf-centered, inclusive process for choosing interpreter accommodations.

  20. Segmentation of British Sign Language (BSL): mind the gap!

    PubMed

    Orfanidou, Eleni; McQueen, James M; Adam, Robert; Morgan, Gary

    2015-01-01

    This study asks how users of British Sign Language (BSL) recognize individual signs in connected sign sequences. We examined whether this is achieved through modality-specific or modality-general segmentation procedures. A modality-specific feature of signed languages is that, during continuous signing, there are salient transitions between sign locations. We used the sign-spotting task to ask if and how BSL signers use these transitions in segmentation. A total of 96 real BSL signs were preceded by nonsense signs which were produced in either the target location or another location (with a small or large transition). Half of the transitions were within the same major body area (e.g., head) and half were across body areas (e.g., chest to hand). Deaf adult BSL users (a group of natives and early learners, and a group of late learners) spotted target signs best when there was a minimal transition and worst when there was a large transition. When location changes were present, both groups performed better when transitions were to a different body area than when they were within the same area. These findings suggest that transitions do not provide explicit sign-boundary cues in a modality-specific fashion. Instead, we argue that smaller transitions help recognition in a modality-general way by limiting lexical search to signs within location neighbourhoods, and that transitions across body areas also aid segmentation in a modality-general way, by providing a phonotactic cue to a sign boundary. We propose that sign segmentation is based on modality-general procedures which are core language-processing mechanisms.

  1. Linguistic Policies, Linguistic Planning, and Brazilian Sign Language in Brazil

    ERIC Educational Resources Information Center

    de Quadros, Ronice Muller

    2012-01-01

    This article explains the consolidation of Brazilian Sign Language in Brazil through a linguistic plan that arose from the Brazilian Sign Language Federal Law 10.436 of April 2002 and the subsequent Federal Decree 5695 of December 2005. Two concrete facts that emerged from this existing language plan are discussed: the implementation of bilingual…

  2. Recognition of Indian Sign Language in Live Video

    NASA Astrophysics Data System (ADS)

    Singha, Joyeeta; Das, Karen

    2013-05-01

    Sign Language Recognition has emerged as one of the important areas of research in Computer Vision. The difficulty faced by researchers is that instances of signs vary in both motion and appearance. Thus, in this paper a novel approach for recognizing various alphabets of Indian Sign Language is proposed in which continuous video sequences of the signs are considered. The proposed system comprises three stages: preprocessing, feature extraction and classification. The preprocessing stage includes skin filtering and histogram matching. Eigenvalues and eigenvectors are used in the feature extraction stage, and finally an eigenvalue-weighted Euclidean distance is used to recognize the sign. The system deals with bare hands, thus allowing the user to interact with it in a natural way. We considered 24 different alphabets in the video sequences and attained a success rate of 96.25%.
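    A minimal sketch of the eigen-feature and eigenvalue-weighted Euclidean distance idea mentioned above. Skin filtering and histogram matching are omitted, and the image sizes, the number of components, and the weighting convention are assumptions rather than the paper's exact formulation.

```python
import numpy as np

def fit_eigenspace(train_images, n_components=3):
    """train_images: (n, H*W) flattened, preprocessed hand images."""
    mean = train_images.mean(axis=0)
    centred = train_images - mean
    cov = np.cov(centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    idx = np.argsort(eigvals)[::-1][:n_components]   # keep the largest eigenvalues
    return mean, eigvals[idx], eigvecs[:, idx]

def weighted_distance(x, y, eigvals):
    """Euclidean distance between projections, weighted by the eigenvalues
    (whether to multiply or divide by the eigenvalues is an assumption here)."""
    return np.sqrt(np.sum(eigvals * (x - y) ** 2))

def classify(image, mean, eigvals, eigvecs, gallery):
    """gallery: dict mapping sign label -> projected template vector."""
    proj = (image - mean) @ eigvecs
    return min(gallery, key=lambda lbl: weighted_distance(proj, gallery[lbl], eigvals))

# Toy usage: six tiny 8x8 "hand images" and two sign classes
rng = np.random.default_rng(0)
train = rng.random((6, 64))
mean, eigvals, eigvecs = fit_eigenspace(train)
gallery = {"A": (train[0] - mean) @ eigvecs, "B": (train[3] - mean) @ eigvecs}
print(classify(train[1], mean, eigvals, eigvecs, gallery))
```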

  3. Japanese Language and Culture: 9-Year Program Classroom Assessment Materials, Grade 4

    ERIC Educational Resources Information Center

    Alberta Education, 2008

    2008-01-01

    This document is designed to provide assessment materials for specific Grade 4 outcomes in the Japanese Language and Culture Nine-year Program, Grades 4-5-6. The assessment materials are designed for the beginner level in the context of teaching for communicative competence. Grade 4 learning outcomes from the Japanese Language and Culture…

  4. Sign Language in Astronomy and Space Sciences

    NASA Astrophysics Data System (ADS)

    Cova, J.; Movilio, V.; Gómez, Y.; Gutiérrez, F.; García, R.; Moreno, H.; González, F.; Díaz, J.; Villarroel, C.; Abreu, E.; Aparicio, D.; Cárdenas, J.; Casneiro, L.; Castillo, N.; Contreras, D.; La Verde, N.; Maita, M.; Martínez, A.; Villahermosa, J.; Quintero, A.

    2009-05-01

    Teaching science to school children with hearing deficiency and impairment can be a rewarding and valuable experience for both teacher and student, and is necessary to society as a whole in order to reduce discriminatory policies in the formal educational system. The single most important obstacle to teaching science to students with hearing deficiencies and impairments is the lack of vocabulary in sign language to express the precise concepts encountered in scientific endeavor. In a collaborative project between Centro de Investigaciones de Astronomía ``Francisco J. Duarte'' (CIDA), Universidad Pedagógica Experimental Libertador-Instituto Pedagógico de Maturín (UPEL-IPM) and Unidad Educativa Especial Bolivariana de Maturín (UEEBM) initiated in 2006, we have attempted to fill this gap by developing signs for astronomy and space sciences terminology. During two three-day workshops carried out at CIDA in Mérida in July 2006 and at UPEL-IPM in Maturín in March 2007, a total of 112 concepts from astronomy and the space sciences were coined in sign language using an interactive method which we describe in the text. The immediate goal of the project is to incorporate these terms into Venezuelan Sign Language (LSV).

  5. Technology to Support Sign Language for Students with Disabilities

    ERIC Educational Resources Information Center

    Donne, Vicki

    2013-01-01

    This systematic review of the literature provides a synthesis of research on the use of technology to support sign language. Background research on the use of sign language with students who are deaf/hard of hearing and students with low incidence disabilities, such as autism, intellectual disability, or communication disorders is provided. The…

  6. Silence in the Second Language Classrooms of Japanese Universities

    ERIC Educational Resources Information Center

    King, Jim

    2013-01-01

    Japanese language learners' proclivity for silence has been alluded to by various writers (e.g. Anderson 1993; Korst 1997; Greer 2000) and is supported by plenty of anecdotal evidence, but large-scale, empirical studies aimed at measuring the extent of macro-level silence within Japanese university L2 classrooms are notably lacking. This article…

  7. The Ideology of Interculturality in Japanese Language-in-Education Policy

    ERIC Educational Resources Information Center

    Liddicoat, Anthony J.

    2007-01-01

    Language learning is frequently justified as a vehicle for promoting intercultural communication and understanding, and language-in-education policies have increasingly come to reflect this preoccupation in their rhetoric. This paper will examine the ways in which concepts relating to interculturality are constructed in Japanese language policy…

  8. Sign Language Studies with Chimpanzees and Children.

    ERIC Educational Resources Information Center

    Van Cantfort, Thomas E.; Rimpau, James B.

    1982-01-01

    Reviews methodologies of sign language studies with chimpanzees and compares major findings of those studies with studies of human children. Considers relevance of input conditions for language acquisition, evidence used to demonstrate linguistic achievements, and application of rigorous testing procedures in developmental psycholinguistics.…

  9. The Bimodal Bilingual Brain: Effects of Sign Language Experience

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen

    2009-01-01

    Bimodal bilinguals are hearing individuals who know both a signed and a spoken language. Effects of bimodal bilingualism on behavior and brain organization are reviewed, and an fMRI investigation of the recognition of facial expressions by ASL-English bilinguals is reported. The fMRI results reveal separate effects of sign language and spoken…

  10. Meemul Tziij: An Indigenous Sign Language Complex of Mesoamerica

    ERIC Educational Resources Information Center

    Tree, Erich Fox

    2009-01-01

    This article examines sign languages that belong to a complex of indigenous sign languages in Mesoamerica that K'iche'an Maya people of Guatemala refer to collectively as Meemul Tziij. It explains the relationship between the Meemul Tziij variety of the Yukatek Maya village of Chican (state of Yucatan, Mexico) and the hitherto undescribed Meemul…

  11. The Use of Sign Language Pronouns by Native-Signing Children with Autism

    ERIC Educational Resources Information Center

    Shield, Aaron; Meier, Richard P.; Tager-Flusberg, Helen

    2015-01-01

    We report the first study on pronoun use by an under-studied research population, children with autism spectrum disorder (ASD) exposed to American Sign Language from birth by their deaf parents. Personal pronouns cause difficulties for hearing children with ASD, who sometimes reverse or avoid them. Unlike speech pronouns, sign pronouns are…

  12. Methodological and Theoretical Issues in the Adaptation of Sign Language Tests: An Example from the Adaptation of a Test to German Sign Language

    ERIC Educational Resources Information Center

    Haug, Tobias

    2012-01-01

    Despite the current need for reliable and valid test instruments in different countries in order to monitor the sign language acquisition of deaf children, very few tests are commercially available that offer strong evidence for their psychometric properties. This mirrors the current state of affairs for many sign languages, where very little…

  13. Operationalization of Sign Language Phonological Similarity and Its Effects on Lexical Access

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Stone, Adam; Newman, Sharlene D.

    2017-01-01

    Cognitive mechanisms for sign language lexical access are fairly unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data in order to…

  14. Signs as Pictures and Signs as Words: Effect of Language Knowledge on Memory for New Vocabulary.

    ERIC Educational Resources Information Center

    Siple, Patricia; And Others

    1982-01-01

    The role of sensory attributes in a vocabulary learning task was investigated for a non-oral language using deaf and hearing individuals, more or less skilled in the use of sign language. Skilled signers encoded invented signs in terms of linguistic structure rather than as visual-pictorial events. (Author/RD)

  15. The Effect of New Technologies on Sign Language Research

    ERIC Educational Resources Information Center

    Lucas, Ceil; Mirus, Gene; Palmer, Jeffrey Levi; Roessler, Nicholas James; Frost, Adam

    2013-01-01

    This paper first reviews the fairly established ways of collecting sign language data. It then discusses the new technologies available and their impact on sign language research, both in terms of how data is collected and what new kinds of data are emerging as a result of technology. New data collection methods and new kinds of data are…

  16. The Multimedia Dictionary of American Sign Language: Learning Lessons About Language, Technology, and Business.

    ERIC Educational Resources Information Center

    Wilcox, Sherman

    2003-01-01

    Reports on the Multimedia Dictionary of American Sign Language, which was conceived in the late 1980s as a melding of the pioneering work in American Sign Language lexicography that had been carried out decades earlier and the newly emerging computer technologies that were integrating use of graphical user-interface designs, rapidly…

  17. Deficits in Narrative Abilities in Child British Sign Language Users with Specific Language Impairment

    ERIC Educational Resources Information Center

    Herman, Ros; Rowley, Katherine; Mason, Kathryn; Morgan, Gary

    2014-01-01

    This study details the first ever investigation of narrative skills in a group of 17 deaf signing children who have been diagnosed with disorders in their British Sign Language development compared with a control group of 17 deaf child signers matched for age, gender, education, quantity, and quality of language exposure and non-verbal…

  18. The history of sign language and deaf education in Turkey.

    PubMed

    Kemaloğlu, Yusuf Kemal; Kemaloğlu, Pınar Yaprak

    2012-01-01

    Sign language is the natural language of prelingually deaf people, particularly those without hearing-speech rehabilitation. Otorhinolaryngologists, regarding health as complete physical, mental and psychosocial well-being, aim at hearing by diagnosing deafness as a deviation from normality. However, a practice that conflicts with the mental and social well-being of the individual clearly contradicts the definition mentioned above. This article aims to investigate the effects, in the Turkish population, of a hearing-speech target that ignores sign language, and its consistency with history, through statistical data, scientific publications and historical documents, and to support a critical perspective on this issue. The results showed that at most 50% of deaf people benefited from hearing-speech programs in the 60 years before hearing screening programs were introduced; however, systems that include sign language in education were not developed. In the light of these data, it is clear that an approach ignoring sign language, particularly before the development of screening programs, is not reasonable. In addition, considering that sign language has been part of Anatolian history from the Hittites to the Ottomans, it remains an open question why evaluation, habilitation and education systems that exclude sign language are still the only choice for deaf individuals in Turkey. Despite legislative amendments in the last 6-7 years, the primary reason they have not come into force is probably an inadequate understanding of the content and importance of the issue, as well as limited effort by academics and authorized politicians to offer solutions. Within this context, this paper aims to make a positive contribution by offering a review for medical staff, particularly otorhinolaryngologists and audiologists.

  19. Sign language aphasia due to left occipital lesion in a deaf signer.

    PubMed

    Saito, Kozue; Otsuki, Mika; Ueno, Satoshi

    2007-10-02

    The localization of sign language production and comprehension in deaf people has been described as similar to that reported for spoken language aphasia. However, sign language is conveyed in a visuospatial modality and relies on visual information. We present the first report of a deaf signer who showed substantial sign language aphasia, with severe impairment in word production, due to a left occipital lesion. This case may point to alternative localizations arising from plasticity.

  20. Information status and word order in Croatian Sign Language.

    PubMed

    Milkovic, Marina; Bradaric-Joncic, Sandra; Wilbur, Ronnie B

    2007-01-01

    This paper presents the results of research on information structure and word order in narrative sentences taken from signed short stories in Croatian Sign Language (HZJ). The basic word order in HZJ is SVO. Factors that result in other word orders include: reversible arguments, verb categories, locative constructions, contrastive focus, and prior context. Word order in context depends on communication rules, based on the relationship between old (theme) and new (rheme) information, which is predicated of the theme. In accordance with Grice's Maxim of Quantity, HZJ has a tendency to omit old information, or to reduce it to pronominal status. If old information is overtly signed in non-pronominal form, it precedes the rheme. We have observed a variety of sign language mechanisms that are used to show items of reduced contextual significance: use of assigned spatial location for previously introduced referents; eyegaze to indicate spatial location of previously introduced referents; use of the non-dominant hand for backgrounded information; use of a special category of signs known as classifiers as pronominal indicators of previously introduced referents; and complex noun phrases that allow a single occurrence of a noun to simultaneously serve multiple functions. These devices permit information to be conveyed without the need for separate signs for every referent, which would create longer constructions that could be taxing to both production and perception. The results of this research are compatible with well-known word order generalizations - HZJ has its own grammar, independent of spoken language, like any other sign language.

  1. Australian Aboriginal Deaf People and Aboriginal Sign Language

    ERIC Educational Resources Information Center

    Power, Des

    2013-01-01

    Many Australian Aboriginal people use a sign language ("hand talk") that mirrors their local spoken language and is used both in culturally appropriate settings when speech is taboo or counterindicated and for community communication. The characteristics of these languages are described, and early European settlers' reports of deaf…

  2. Identifying Overlapping Language Communities: The Case of Chiriquí and Panamanian Signed Languages

    ERIC Educational Resources Information Center

    Parks, Elizabeth S.

    2016-01-01

    In this paper, I use a holographic metaphor to explain the identification of overlapping sign language communities in Panama. By visualizing Panama's complex signing communities as emitting community "hotspots" through social drama on multiple stages, I employ ethnographic methods to explore overlapping contours of Panama's sign language…

  3. What You Don't Know Can Hurt You: The Risk of Language Deprivation by Impairing Sign Language Development in Deaf Children.

    PubMed

    Hall, Wyatte C

    2017-05-01

    A long-standing belief is that sign language interferes with spoken language development in deaf children, despite a chronic lack of evidence supporting this belief. This deserves discussion as poor life outcomes continue to be seen in the deaf population. This commentary synthesizes research outcomes with signing and non-signing children and highlights fully accessible language as a protective factor for healthy development. Brain changes associated with language deprivation may be misrepresented as sign language interfering with spoken language outcomes of cochlear implants. This may lead to professionals and organizations advocating for preventing sign language exposure before implantation and spreading misinformation. The existence of a time-sensitive language acquisition window means there is a strong possibility of permanent brain changes when spoken language is not fully accessible to the deaf child and sign language exposure is delayed, as is often standard practice. There is no empirical evidence for the harm of sign language exposure but there is some evidence for its benefits, and there is growing evidence that lack of language access has negative implications. This includes cognitive delays, mental health difficulties, lower quality of life, higher trauma, and limited health literacy. Claims of cochlear implant- and spoken language-only approaches being more effective than sign language-inclusive approaches are not empirically supported. Cochlear implants are an unreliable standalone first-language intervention for deaf children. Priorities of deaf child development should focus on healthy growth of all developmental domains through a fully accessible first-language foundation such as sign language, rather than auditory deprivation and speech skills.

  4. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    ERIC Educational Resources Information Center

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further,…

  5. A FACETS Analysis of Rater Bias in Measuring Japanese Second Language Writing Performance.

    ERIC Educational Resources Information Center

    Kondo-Brown, Kimi

    2002-01-01

    Using FACETS, investigates how judgments of trained teacher raters are biased toward certain types of candidates and certain criteria in assessing Japanese second language writing. Explores the potential for using a modified version of a rating scale for norm-referenced decisions about Japanese second language writing ability. (Author/VWL)

  6. On the linguistic status of ‘agreement’ in sign languages

    PubMed Central

    LILLO-MARTIN, DIANE; MEIER, RICHARD P.

    2013-01-01

    In signed languages, the arguments of verbs can be marked by a system of verbal modification that has been termed “agreement” (more neutrally, “directionality”). Fundamental issues regarding directionality remain unresolved and the phenomenon has characteristics that call into question its analysis as agreement. We conclude that directionality marks person in American Sign Language, and the ways person marking interacts with syntactic phenomena are largely analogous to morpho-syntactic properties of familiar agreement systems. Overall, signed languages provide a crucial test for how gestural and linguistic mechanisms can jointly contribute to the satisfaction of fundamental aspects of linguistic structure. PMID:23495262

  7. Can Experience with Co-Speech Gesture Influence the Prosody of a Sign Language? Sign Language Prosodic Cues in Bimodal Bilinguals

    ERIC Educational Resources Information Center

    Brentari, Diane; Nadolske, Marie A.; Wolford, George

    2012-01-01

    In this paper the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience…

  8. Hierarchically Structured Non-Intrusive Sign Language Recognition. Chapter 2

    NASA Technical Reports Server (NTRS)

    Zieren, Jorg; Zieren, Jorg; Kraiss, Karl-Friedrich

    2007-01-01

    This work presents a hierarchically structured approach to the nonintrusive recognition of sign language from a monocular frontal view. Robustness is achieved through sophisticated localization and tracking methods, including a combined EM/CAMSHIFT overlap resolution procedure and the parallel pursuit of multiple hypotheses about hand position and movement. This allows handling of ambiguities and automatically corrects tracking errors. A biomechanical skeleton model and dynamic motion prediction using Kalman filters represent high-level knowledge. Classification is performed by Hidden Markov Models. 152 signs from German Sign Language were recognized with an accuracy of 97.6%.
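
    The entry above describes a classifier built from Hidden Markov Models over tracked hand features. As a hedged illustration of that general pattern (not the authors' implementation), the sketch below trains one Gaussian HMM per sign with the hmmlearn library and classifies a new feature sequence by maximum log-likelihood; the localization and tracking stages (EM/CAMSHIFT, Kalman prediction, skeleton model) are assumed to have already produced per-frame feature vectors, and all data here are synthetic.

```python
# Minimal sketch of HMM-based isolated sign classification in the spirit of the
# approach above (one model per sign, maximum-likelihood decision). Feature
# extraction is assumed done; the data below are synthetic stand-ins.
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed dependency (pip install hmmlearn)

def train_sign_models(training_data, n_states=5):
    """training_data: dict mapping sign label -> list of (T_i, D) feature arrays."""
    models = {}
    for sign, sequences in training_data.items():
        X = np.vstack(sequences)                   # all frames stacked
        lengths = [len(seq) for seq in sequences]  # per-sequence lengths
        model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[sign] = model
    return models

def classify(models, sequence):
    """Return the sign whose HMM assigns the highest log-likelihood to the sequence."""
    return max(models, key=lambda sign: models[sign].score(sequence))

rng = np.random.default_rng(0)
data = {
    "HELLO": [rng.normal(0.0, 1.0, size=(30, 6)) for _ in range(5)],
    "THANKS": [rng.normal(2.0, 1.0, size=(30, 6)) for _ in range(5)],
}
models = train_sign_models(data)
print(classify(models, rng.normal(2.0, 1.0, size=(25, 6))))  # expected: THANKS
```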

  9. Language and Literacy Acquisition through Parental Mediation in American Sign Language

    ERIC Educational Resources Information Center

    Bailes, Cynthia Neese; Erting, Lynne C.; Thumann-Prezioso, Carlene; Erting, Carol J.

    2009-01-01

    This longitudinal case study examined the language and literacy acquisition of a Deaf child as mediated by her signing Deaf parents during her first three years of life. Results indicate that the parents' interactions with their child were guided by linguistic and cultural knowledge that produced an intuitive use of child-directed signing (CDSi)…

  10. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children

    PubMed Central

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-01-01

    Abstract Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development. PMID:28961874

  11. Computerized Sign Language-Based Literacy Training for Deaf and Hard-of-Hearing Children.

    PubMed

    Holmer, Emil; Heimann, Mikael; Rudner, Mary

    2017-10-01

    Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed at five occasions and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development. © The Author 2017. Published by Oxford University Press.

  12. Phonological reduplication in sign language: Rules rule

    PubMed Central

    Berent, Iris; Dupuis, Amanda; Brentari, Diane

    2014-01-01

    Productivity—the hallmark of linguistic competence—is typically attributed to algebraic rules that support broad generalizations. Past research on spoken language has documented such generalizations in both adults and infants. But whether algebraic rules form part of the linguistic competence of signers remains unknown. To address this question, here we gauge the generalization afforded by American Sign Language (ASL). As a case study, we examine reduplication (X→XX)—a rule that, inter alia, generates ASL nouns from verbs. If signers encode this rule, then they should freely extend it to novel syllables, including ones with features that are unattested in ASL. And since reduplicated disyllables are preferred in ASL, such a rule should favor novel reduplicated signs. Novel reduplicated signs should thus be preferred to nonreduplicative controls (in rating), and consequently, such stimuli should also be harder to classify as nonsigns (in the lexical decision task). The results of four experiments support this prediction. These findings suggest that the phonological knowledge of signers includes powerful algebraic rules. The convergence between these conclusions and previous evidence for phonological rules in spoken language suggests that the architecture of the phonological mind is partly amodal. PMID:24959158

  13. Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network.

    PubMed

    Kanazawa, Yuji; Nakamura, Kimihiro; Ishii, Toru; Aso, Toshihiko; Yamazaki, Hiroshi; Omori, Koichi

    2017-01-01

    Sign language is an essential medium for everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in these people should heavily rely on verbal short-term memory (STM) via sign language. Most previous studies compared neural activations during signed language processing in deaf signers with those during spoken language processing in hearing speakers. For sign language users, it thus remains unclear how visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study investigated neural activation while bilinguals of spoken and signed language were engaged in a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence presented either as written letters or as fingerspelling (4-7 syllables in length) and then held the syllable sequence for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information, regardless of the type of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus and inferior parietal lobule, for both the letter and fingerspelling conditions. Interestingly, while most participants reported that they relied on phonological memory during maintenance, direct comparisons between letters and fingerspelling revealed strikingly different patterns of neural activation during the same period. Namely, the effortful maintenance of fingerspelling inputs relative to letter inputs activated the left superior parietal lobule and dorsal premotor area, i.e., brain regions known to play a role in visuomotor analysis of hand/arm movements. These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the left-hemisphere language network.

  14. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder.

    PubMed

    Shield, Aaron; Cooley, Frances; Meier, Richard P

    2017-06-10

    We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD.

  15. Lexical prediction via forward models: N400 evidence from German Sign Language.

    PubMed

    Hosemann, Jana; Herrmann, Annika; Steinbach, Markus; Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-09-01

    Models of language processing in the human brain often emphasize the prediction of upcoming input-for example in order to explain the rapidity of language understanding. However, the precise mechanisms of prediction are still poorly understood. Forward models, which draw upon the language production system to set up expectations during comprehension, provide a promising approach in this regard. Here, we present an event-related potential (ERP) study on German Sign Language (DGS) which tested the hypotheses of a forward model perspective on prediction. Sign languages involve relatively long transition phases between one sign and the next, which should be anticipated as part of a forward model-based prediction even though they are semantically empty. Native speakers of DGS watched videos of naturally signed DGS sentences which either ended with an expected or a (semantically) unexpected sign. Unexpected signs engendered a biphasic N400-late positivity pattern. Crucially, N400 onset preceded critical sign onset and was thus clearly elicited by properties of the transition phase. The comprehension system thereby clearly anticipated modality-specific information about the realization of the predicted semantic item. These results provide strong converging support for the application of forward models in language comprehension. © 2013 Elsevier Ltd. All rights reserved.

  16. Sign Language Subtitling by Highly Comprehensible "Semantroids"

    ERIC Educational Resources Information Center

    Adamo-Villani, Nicolleta; Beni, Gerardo

    2007-01-01

    We introduce a new method of sign language subtitling aimed at young deaf children who have not acquired reading skills yet, and can communicate only via signs. The method is based on: 1) the recently developed concept of "semantroid[TM]" (an animated 3D avatar limited to head and hands); 2) the design, development, and psychophysical evaluation…

  17. Why American Sign Language Gloss Must Matter

    ERIC Educational Resources Information Center

    Supalla, Samuel J.; Cripps, Jody H.; Byrne, Andrew P. J.

    2017-01-01

    Responding to an article by Grushkin (EJ1174123) on how deaf children best learn to read, published, along with the present article, in an "American Annals of the Deaf" special issue, the authors review American Sign Language gloss. Topics include how ASL gloss enables deaf children to learn to read in their own language and…

  18. Italian Sign Language (LIS) Poetry: Iconic Properties and Structural Regularities.

    ERIC Educational Resources Information Center

    Russo, Tommaso; Giuranna, Rosaria; Pizzuto, Elena

    2001-01-01

    Explores and describes, from a crosslinguistic perspective, some of the major structural regularities that characterize poetry in Italian Sign Language and distinguish poetic from nonpoetic texts. Reviews findings of previous studies of signed language poetry, and points out issues that need to be clarified to provide a more accurate description…

  19. Lexical Properties of Slovene Sign Language: A Corpus-Based Study

    ERIC Educational Resources Information Center

    Vintar, Špela

    2015-01-01

    Slovene Sign Language (SZJ) has as yet received little attention from linguists. This article presents some basic facts about SZJ, its history, current status, and a description of the Slovene Sign Language Corpus and Pilot Grammar (SIGNOR) project, which compiled and annotated a representative corpus of SZJ. Finally, selected quantitative data…

  20. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    ERIC Educational Resources Information Center

    Williams, Joshua T.; Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately…

  1. Integrating Language and Content: Challenges in a Japanese Supplementary School in Victoria

    ERIC Educational Resources Information Center

    Okumura, Shinji; Obara, Yumi

    2017-01-01

    The Melbourne International School of Japanese (MISJ) is a supplementary Saturday school which offers Japanese language and mathematics taught in Japanese from kindergarten to senior secondary level. Classes are scheduled on Saturdays from 9am to 3pm and approximately half of the program is dedicated to mathematics. While mathematics education…

  2. Cross-Linguistic Differences in the Neural Representation of Human Language: Evidence from Users of Signed Languages

    PubMed Central

    Corina, David P.; Lawyer, Laurel A.; Cates, Deborah

    2013-01-01

    Studies of deaf individuals who are users of signed languages have provided profound insight into the neural representation of human language. Case studies of deaf signers who have incurred left- and right-hemisphere damage have shown that left-hemisphere resources are a necessary component of sign language processing. These data suggest that, despite frank differences in the input and output modality of language, core left perisylvian regions universally serve linguistic function. Neuroimaging studies of deaf signers have generally provided support for this claim. However, more fine-tuned studies of linguistic processing in deaf signers are beginning to show evidence of important differences in the representation of signed and spoken languages. In this paper, we provide a critical review of this literature and present compelling evidence for language-specific cortical representations in deaf signers. These data lend support to the claim that the neural representation of language may show substantive cross-linguistic differences. We discuss the theoretical implications of these findings with respect to an emerging understanding of the neurobiology of language. PMID:23293624

  3. Content Questions In American Sign Language: An RRG Analysis

    DTIC Science & Technology

    2004-12-08

    [Fragmentary record excerpt. It contains a glossed ASL example (numbered (29) in the source), DURING FIVE YEAR YONDER GALLAUDET ('During my five years at Gallaudet ...'), illustrating how a signed temporal framework is established and holds until a new topic is introduced, followed by fragments of the reference list (including Battison 1978 and works on ASL grammar and syntactic typology published by Gallaudet University Press and Cambridge University Press).]

  4. [The alteration of Japanese anatomical terminology in the early Showa period and the Japanese language reform campaign].

    PubMed

    Sawai, Tadashi; Sakai, Tatsuo

    2010-03-01

    In the second decade of the Showa period, great changes were made in the Japanese anatomical terms. It has been proposed that the presentation of JNA (Jenaer nomina anatomica) was one of the factors leading to the change. The Japanese language reform campaign, however, also played an important role. The reform groups kokugoaigo doumei and its successor kokugo kyokai called for concise and unified technical terms. The anatomical nomenclature committee of the Japanese Association of Anatomists worked to satisfy this requirement. The committee consulted with the nomenclature committees of other medical associations and took account of their opinions. The anatomical nomenclature committee abandoned the literal translation from Latin to Japanese and shaped a succinct Japanese terminology. Modern Japanese anatomical terms are based on this terminology.

  5. A dictionary of Astronomy for the French Sign Language (LSF)

    NASA Astrophysics Data System (ADS)

    Proust, Dominique; Abbou, Daniel; Chab, Nasro

    2011-06-01

    For several years, the French deaf community has had access to astronomy at the Paris-Meudon observatory through specific teaching adapted to French Sign Language (Langue des Signes Française, LSF), including direct observations with the observatory's telescopes. From this experience, an encyclopedic dictionary of astronomy, The Hands in the Stars, is now available, containing more than 200 astronomical concepts. Many of them did not previously exist in Sign Language and can now be fully expressed and explained.

  6. The Psychotherapist and the Sign Language Interpreter

    ERIC Educational Resources Information Center

    de Bruin, Ed; Brugmans, Petra

    2006-01-01

    Specialized psychotherapy for deaf people in the Dutch and Western European mental health systems is still a rather young specialism. A key policy principle in Dutch mental health care for the deaf is that they should receive treatment in the language most accessible to them, which is usually Dutch Sign Language (Nederlandse Gebarentaal or NGT).…

  7. Mobile Sign Language Learning Outside the Classroom

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2012-01-01

    The majority of deaf children in the United States are born to hearing parents with limited prior exposure to American Sign Language (ASL). Our research involves creating and validating a mobile language tool called SMARTSign. The goal is to help hearing parents learn ASL in a way that fits seamlessly into their daily routine. (Contains 3 figures.)

  8. The Road to Language Learning Is Not Entirely Iconic: Iconicity, Neighborhood Density, and Frequency Facilitate Acquisition of Sign Language.

    PubMed

    Caselli, Naomi K; Pyers, Jennie E

    2017-07-01

    Iconic mappings between words and their meanings are far more prevalent than once estimated and seem to support children's acquisition of new words, spoken or signed. We asked whether iconicity's prevalence in sign language overshadows two other factors known to support the acquisition of spoken vocabulary: neighborhood density (the number of lexical items phonologically similar to the target) and lexical frequency. Using mixed-effects logistic regressions, we reanalyzed 58 parental reports of native-signing deaf children's productive acquisition of 332 signs in American Sign Language (ASL; Anderson & Reilly, 2002) and found that iconicity, neighborhood density, and lexical frequency independently facilitated vocabulary acquisition. Despite differences in iconicity and phonological structure between signed and spoken language, signing children, like children learning a spoken language, track statistical information about lexical items and their phonological properties and leverage this information to expand their vocabulary.
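
    The abstract above reports mixed-effects logistic regressions predicting whether a child is reported to produce a sign from iconicity, neighborhood density, and frequency. The original analysis was presumably fitted with a dedicated mixed-models package (for example, lme4 in R); the sketch below is only a hedged Python stand-in using statsmodels' variational Bayes mixed GLM, with synthetic data, illustrative coefficient values, and illustrative column names.

```python
# Hedged sketch of a mixed-effects logistic regression along the lines described
# above: the binary outcome is whether a child produces a sign, with iconicity,
# neighborhood density, and frequency as fixed effects and crossed random
# intercepts for child and sign. All data and names here are synthetic.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
rows = []
for c in range(20):                 # 20 hypothetical children
    for s in range(50):             # 50 hypothetical signs
        icon = rng.uniform(1, 7)
        dens = rng.poisson(5)
        freq = rng.normal(0, 1)
        p = 1 / (1 + np.exp(-(-1.0 + 0.3 * icon + 0.1 * dens + 0.5 * freq)))
        rows.append({"child_id": c, "sign_id": s, "iconicity": icon,
                     "neighborhood_density": dens, "log_frequency": freq,
                     "produced": rng.binomial(1, p)})
df = pd.DataFrame(rows)

model = BinomialBayesMixedGLM.from_formula(
    "produced ~ iconicity + neighborhood_density + log_frequency",
    vc_formulas={"child": "0 + C(child_id)", "sign": "0 + C(sign_id)"},
    data=df,
)
result = model.fit_vb()             # variational Bayes approximation
print(result.summary())
```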

  9. Sign Language Echolalia in Deaf Children With Autism Spectrum Disorder

    PubMed Central

    Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method Seventeen deaf children with ASD and 18 typically developing (TD) deaf children were video-recorded in a series of tasks. Data were coded for type of signs produced (spontaneous, elicited, echo, or nonecho repetition). Echoes were coded as pure or partial, and timing and reduplication of echoes were coded. Results Seven of the 17 deaf children with ASD produced signed echoes, but none of the TD deaf children did. The echoic children had significantly lower receptive language scores than did both the nonechoic children with ASD and the TD children. Modality differences also were found in terms of the directionality, timing, and reduplication of echoes. Conclusions Deaf children with ASD sometimes echo signs, just as hearing children with ASD sometimes echo words, and TD deaf children and those with ASD do so at similar stages of linguistic development, when comprehension is relatively low. The sign language modality might provide a powerful new framework for analyzing the purpose and function of echolalia in deaf children with ASD. PMID:28586822

  10. Towards a Sign Language Synthesizer: a Bridge to Communication Gap of the Hearing/Speech Impaired Community

    NASA Astrophysics Data System (ADS)

    Maarif, H. A.; Akmeliawati, R.; Gunawan, T. S.; Shafie, A. A.

    2013-12-01

    A sign language synthesizer visualizes sign language movements from spoken language input. Sign language (SL) is one of the means used by hearing/speech-impaired (HSI) people to communicate with hearing people, but unfortunately the number of people, including HSI people, who are familiar with sign language is very limited, which causes difficulties in communication between hearing and HSI people. Sign language involves not only hand movement but also facial expression, and the two elements complement each other: the hand movement conveys the meaning of each sign, while the facial expression conveys the signer's emotion. In general, a sign language synthesizer recognizes the spoken language using speech recognition, handles the grammatical processing with a context-free grammar, and renders the result with a 3D synthesizer driven by a recorded avatar. This paper analyzes and compares the existing techniques for developing a sign language synthesizer, leading to the IIUM Sign Language Synthesizer.

  11. Sign Language Echolalia in Deaf Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Shield, Aaron; Cooley, Frances; Meier, Richard P.

    2017-01-01

    Purpose: We present the first study of echolalia in deaf, signing children with autism spectrum disorder (ASD). We investigate the nature and prevalence of sign echolalia in native-signing children with ASD, the relationship between sign echolalia and receptive language, and potential modality differences between sign and speech. Method: Seventeen…

  12. Facilitating Exposure to Sign Languages of the World: The Case for Mobile Assisted Language Learning

    ERIC Educational Resources Information Center

    Parton, Becky Sue

    2014-01-01

    Foreign sign language instruction is an important, but overlooked area of study. Thus the purpose of this paper was two-fold. First, the researcher sought to determine the level of knowledge and interest in foreign sign language among Deaf teenagers along with their learning preferences. Results from a survey indicated that over a third of the…

  13. Information Transfer Capacity of Articulators in American Sign Language.

    PubMed

    Malaia, Evie; Borneman, Joshua D; Wilbur, Ronnie B

    2018-03-01

    The ability to convey information is a fundamental property of communicative signals. For sign languages, which are overtly produced with multiple, completely visible articulators, the question arises as to how the various channels co-ordinate and interact with each other. We analyze motion capture data of American Sign Language (ASL) narratives, and show that the capacity of information throughput, mathematically defined, is highest on the dominant hand (DH). We further demonstrate that information transfer capacity is also significant for the non-dominant hand (NDH), and the head channel too, as compared to control channels (ankles). We discuss both redundancy and independence in articulator motion in sign language, and argue that the NDH and the head articulators contribute to the overall information transfer capacity, indicating that they are neither completely redundant to, nor completely independent of, the DH.
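
    The study above defines information transfer capacity mathematically over motion-capture channels. The exact metric is not reproduced here; as a loosely related illustration of how articulator channels might be compared, the sketch below computes a spectral-entropy proxy from per-frame speed signals for hypothetical dominant-hand and ankle trajectories. All trajectories are synthetic and the proxy is a stand-in, not the paper's measure.

```python
# Rough illustration of comparing articulator channels (dominant hand, non-dominant
# hand, head, ankles) from motion-capture velocity signals. This is NOT the paper's
# exact metric; it uses spectral entropy of the speed signal as a simple stand-in
# for "information throughput".
import numpy as np
from scipy.signal import welch

def spectral_entropy(speed, fs=120.0):
    """Shannon entropy (bits) of the normalized power spectrum of a 1-D speed signal."""
    freqs, psd = welch(speed, fs=fs, nperseg=256)
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def channel_speed(positions, fs=120.0):
    """positions: (T, 3) marker trajectory -> per-frame speed, shape (T-1,)."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) * fs

# Synthetic trajectories standing in for two motion-capture channels.
rng = np.random.default_rng(1)
channels = {
    "dominant_hand": rng.normal(0, 1.0, size=(2000, 3)).cumsum(axis=0),
    "ankle": rng.normal(0, 0.05, size=(2000, 3)).cumsum(axis=0),
}
for name, traj in channels.items():
    print(name, round(spectral_entropy(channel_speed(traj)), 2))
```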

  14. Telesign: a videophone system for sign language distant communication

    NASA Astrophysics Data System (ADS)

    Mozelle, Gerard; Preteux, Francoise J.; Viallet, Jean-Emmanuel

    1998-09-01

    This paper presents a low bit rate videophone system for deaf people communicating by means of sign language. Classic video conferencing systems have focused on head-and-shoulders sequences, which are not well suited for sign language video transmission since hearing-impaired people also use their hands and arms to communicate. To address the above-mentioned functionality, we have developed a two-step content-based video coding system based on: (1) a segmentation step, in which four or five video objects (VO) are extracted using a cooperative approach between color-based and morphological segmentation; and (2) VO coding, which is achieved by using a standardized MPEG-4 video toolbox. Results of encoded sign language video sequences, presented for three target bit rates (32 kbits/s, 48 kbits/s and 64 kbits/s), demonstrate the efficiency of the approach presented in this paper.

  15. Evaluating Effects of Language Recognition on Language Rights and the Vitality of New Zealand Sign Language

    ERIC Educational Resources Information Center

    McKee, Rachel Locker; Manning, Victoria

    2015-01-01

    Status planning through legislation made New Zealand Sign Language (NZSL) an official language in 2006. But this strong symbolic action did not create resources or mechanisms to further the aims of the act. In this article we discuss the extent to which legal recognition and ensuing language-planning activities by state and community have affected…

  16. A Sign Language Screen Reader for Deaf

    NASA Astrophysics Data System (ADS)

    El Ghoul, Oussama; Jemni, Mohamed

    Screen reader technology first appeared to allow blind people and people with reading difficulties to use computers and to access digital information. Until now, this technology has been exploited mainly to help the blind community. During our work with deaf people, we noticed that a screen reader can also facilitate their use of computers and their reading of textual information. In this paper, we propose a novel screen reader dedicated to deaf users. The output of the reader is a visual translation of the text into sign language. The screen reader is composed of two essential modules: the first is designed to capture user activities (mouse and keyboard events), for which we adopted the Microsoft MSAA application programming interfaces. The second module, which in classical screen readers is a text-to-speech (TTS) engine, is replaced by a novel text-to-sign (TTSign) engine. This module converts text into sign language animation based on avatar technology.

  17. Proactive Interference & Language Change in Hearing Adult Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Kreske, Catherine M.

    1995-01-01

    Describes a study that found, contrary to previous reports, that a strong, symmetrical release from proactive interference (PI) is the normal outcome for switches between American Sign Language (ASL) signs and English words and with switches between Manual and English alphabet characters. Subjects were college students enrolled in their first ASL…

  18. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    ERIC Educational Resources Information Center

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, nonarbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than…

  19. The influence of the visual modality on language structure and conventionalization: insights from sign language and gesture.

    PubMed

    Perniss, Pamela; Özyürek, Asli; Morgan, Gary

    2015-01-01

    For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and (co-speech) gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles in this issue explore and document how gestures and sign languages are similar or different and how communicative expression in the visual modality can change from being gestural to grammatical in nature through processes of conventionalization. As such, this issue contributes to our understanding of how the visual modality shapes language and the emergence of linguistic structure in newly developing systems. Studying the relationship between signs and gestures provides a new window onto the human ability to recruit multiple levels of representation (e.g., categorical, gradient, iconic, abstract) in the service of using or creating conventionalized communicative systems. Copyright © 2015 Cognitive Science Society, Inc.

  20. Input Processing at First Exposure to a Sign Language

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    There is growing interest in learners' cognitive capacities to process a second language (L2) at first exposure to the target language. Evidence suggests that L2 learners are capable of processing novel words by exploiting phonological information from their first language (L1). Hearing adult learners of a sign language, however, cannot fall back…

  1. Directionality effects in simultaneous language interpreting: the case of sign language interpreters in The Netherlands.

    PubMed

    Van Dijk, Rick; Boers, Eveline; Christoffels, Ingrid; Hermans, Daan

    2011-01-01

    The quality of interpretations produced by sign language interpreters was investigated. Twenty-five experienced interpreters were instructed to interpret narratives from (a) spoken Dutch to Sign Language of The Netherlands (SLN), (b) spoken Dutch to Sign Supported Dutch (SSD), and (c) SLN to spoken Dutch. The quality of the interpreted narratives was assessed by 5 certified sign language interpreters who did not participate in the study. Two measures were used to assess interpreting quality: the propositional accuracy of the interpreters' interpretations and a subjective quality measure. The results showed that the interpreted narratives in the SLN-to-Dutch interpreting direction were of lower quality (on both measures) than the interpreted narratives in the Dutch-to-SLN and Dutch-to-SSD directions. Furthermore, interpreters who had begun acquiring SLN when they entered the interpreter training program performed as well in all 3 interpreting directions as interpreters who had acquired SLN from birth.

  2. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    PubMed

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.

  3. Tools for language: patterned iconicity in sign language nouns and verbs.

    PubMed

    Padden, Carol; Hwang, So-One; Lepic, Ryan; Seegers, Sharon

    2015-01-01

    When naming certain hand-held, man-made tools, American Sign Language (ASL) signers exhibit either of two iconic strategies: a handling strategy, where the hands show holding or grasping an imagined object in action, or an instrument strategy, where the hands represent the shape or a dimension of the object in a typical action. The same strategies are also observed in the gestures of hearing nonsigners identifying pictures of the same set of tools. In this paper, we compare spontaneously created gestures from hearing nonsigning participants to commonly used lexical signs in ASL. Signers and gesturers were asked to respond to pictures of tools and to video vignettes of actions involving the same tools. Nonsigning gesturers overwhelmingly prefer the handling strategy for both the Picture and Video conditions. Nevertheless, they use more instrument forms when identifying tools in pictures, and more handling forms when identifying actions with tools. We found that ASL signers generally favor the instrument strategy when naming tools, but when describing tools being used by an actor, they are significantly more likely to use more handling forms. The finding that both gesturers and signers are more likely to alternate strategies when the stimuli are pictures or video suggests a common cognitive basis for differentiating objects from actions. Furthermore, the presence of a systematic handling/instrument iconic pattern in a sign language demonstrates that a conventionalized sign language exploits the distinction for grammatical purpose, to distinguish nouns and verbs related to tool use. Copyright © 2014 Cognitive Science Society, Inc.

  4. Development of Web-based Distributed Cooperative Development Environmentof Sign-Language Animation System and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yuizono, Takaya; Hara, Kousuke; Nakayama, Shigeru

    A web-based distributed cooperative development environment for a sign-language animation system has been developed. The system extends a previous animation system built as a three-tiered architecture consisting of a sign-language animation interface layer, a sign-language data processing layer, and a sign-language animation database. Two components, a web client using a VRML plug-in and a web servlet, have been added to the previous system. The system supports a humanoid-model avatar for interoperability and can use stored sign-language animation data shared through the database. The evaluation of the system shows that the inverse kinematics function of the web client improves the making of sign-language animations.

  5. Independent transmission of sign language interpreter in DVB: assessment of image compression

    NASA Astrophysics Data System (ADS)

    Zatloukal, Petr; Bernas, Martin; Dvořák, Lukáš

    2015-02-01

    Sign language on television provides deaf viewers with information that they cannot get from the audio content. If the sign language interpreter is transmitted over an independent data stream, the aim is to ensure sufficient intelligibility and subjective image quality of the interpreter at a minimum bit rate. The work deals with ROI-based video compression of a Czech Sign Language interpreter implemented with the x264 open-source library. The results of this approach are verified in subjective tests with deaf viewers. The tests examine the intelligibility of sign language expressions containing minimal pairs at different levels of compression and various resolutions of the image with the interpreter, and evaluate the subjective quality of the final image for a good viewing experience.

  6. Air Writing as a Technique for the Acquisition of Sino-japanese Characters by Second Language Learners

    ERIC Educational Resources Information Center

    Thomas, Margaret

    2015-01-01

    This article calls attention to a facet of the expertise of second language (L2) learners of Japanese at the intersection of language, memory, gesture, and the psycholinguistics of a logographic writing system. Previous research has shown that adult L2 learners of Japanese living in Japan (similarly to native speakers of Japanese) often…

  7. The link between form and meaning in American Sign Language: lexical processing effects.

    PubMed

    Thompson, Robin L; Vinson, David P; Vigliocco, Gabriella

    2009-03-01

    Signed languages exploit iconicity (the transparent relationship between meaning and form) to a greater extent than spoken languages, where it is largely limited to onomatopoeia. In a picture-sign matching experiment measuring reaction times, the authors examined the potential advantage of iconicity both for 1st- and 2nd-language learners of American Sign Language (ASL). The results show that native ASL signers are faster to respond when a specific property iconically represented in a sign is made salient in the corresponding picture, thus providing evidence that a closer mapping between meaning and form can aid in lexical retrieval. While late 2nd-language learners appear to use iconicity as an aid to learning sign (R. Campbell, P. Martin, & T. White, 1992), they did not show the same facilitation effect as native ASL signers, suggesting that the task tapped into more automatic language processes. Overall, the findings suggest that completely arbitrary mappings between meaning and form may not be more advantageous in language and that, rather, arbitrariness may simply be an accident of modality. (c) 2009 APA, all rights reserved

  8. Static sign language recognition using 1D descriptors and neural networks

    NASA Astrophysics Data System (ADS)

    Solís, José F.; Toxqui, Carina; Padilla, Alfonso; Santiago, César

    2012-10-01

    A framework for static sign language recognition using descriptors that represent 2D images as 1D data, together with artificial neural networks, is presented in this work. The 1D descriptors were computed by two methods: the first consists of a rotational correlation operator, and the second is based on contour analysis of the hand shape. One of the main problems in sign language recognition is segmentation; most papers report using specially colored gloves or backgrounds for hand shape analysis. In order to avoid gloves or special clothing, a thermal imaging camera was used to capture the images. Static signs for the digits 1 to 9 of American Sign Language were used, and a multilayer perceptron reached 100% recognition with cross-validation.
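
    As a hedged illustration of the second descriptor mentioned above (contour analysis of the hand shape), the sketch below computes a centroid-distance signature from a segmented hand mask and feeds it to a multilayer perceptron with scikit-learn. Hand segmentation from the thermal images is assumed to have already produced binary masks; the toy disk and square shapes, helper names, and parameters are illustrative rather than taken from the paper.

```python
# Hedged sketch of static sign classification from a 1-D contour descriptor
# (a centroid-distance signature) fed to a multilayer perceptron. Segmentation is
# assumed done; toy shapes below merely stand in for two different static signs.
import numpy as np
from skimage.measure import find_contours
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def centroid_distance_signature(mask, n_points=64):
    """Binary hand mask -> fixed-length, scale-normalized 1-D contour descriptor."""
    contour = max(find_contours(mask.astype(float), 0.5), key=len)  # largest contour
    dists = np.linalg.norm(contour - contour.mean(axis=0), axis=1)  # distance to centroid
    idx = np.linspace(0, len(dists) - 1, n_points).astype(int)      # fixed-length resample
    sig = dists[idx]
    return sig / (sig.max() + 1e-9)                                 # scale invariance

def disk_mask(radius, size=96):
    yy, xx = np.mgrid[:size, :size]
    return ((yy - size // 2) ** 2 + (xx - size // 2) ** 2) <= radius ** 2

def square_mask(half, size=96):
    m = np.zeros((size, size), dtype=bool)
    m[size // 2 - half:size // 2 + half, size // 2 - half:size // 2 + half] = True
    return m

# Toy data: disks vs. squares stand in for two static signs.
masks = [disk_mask(16 + i % 5) for i in range(10)] + [square_mask(16 + i % 5) for i in range(10)]
labels = np.array([0] * 10 + [1] * 10)
X = np.stack([centroid_distance_signature(m) for m in masks])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```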

  9. Modelling the Perceived Value of Compulsory English Language Education in Undergraduate Non-Language Majors of Japanese Nationality

    ERIC Educational Resources Information Center

    Rivers, Damian J.

    2012-01-01

    Adopting mixed methods of data collection and analysis, the current study models the "perceived value of compulsory English language education" in a sample of 138 undergraduate non-language majors of Japanese nationality at a national university in Japan. During the orientation period of a compulsory 15-week English language programme,…

  10. Languages Are More than Words: Spanish and American Sign Language in Early Childhood Settings

    ERIC Educational Resources Information Center

    Sherman, Judy; Torres-Crespo, Marisel N.

    2015-01-01

    Capitalizing on preschoolers' inherent enthusiasm and capacity for learning, the authors developed and implemented a dual-language program to enable young children to experience diversity and multiculturalism by learning two new languages: Spanish and American Sign Language. Details of the curriculum, findings, and strategies are shared.

  11. The effects of sign language on spoken language acquisition in children with hearing loss: a systematic review protocol.

    PubMed

    Fitzpatrick, Elizabeth M; Stevens, Adrienne; Garritty, Chantelle; Moher, David

    2013-12-06

    Permanent childhood hearing loss affects 1 to 3 per 1000 children and frequently disrupts typical spoken language acquisition. Early identification of hearing loss through universal newborn hearing screening and the use of new hearing technologies including cochlear implants make spoken language an option for most children. However, there is no consensus on what constitutes optimal interventions for children when spoken language is the desired outcome. Intervention and educational approaches ranging from oral language only to oral language combined with various forms of sign language have evolved. Parents are therefore faced with important decisions in the first months of their child's life. This article presents the protocol for a systematic review of the effects of using sign language in combination with oral language intervention on spoken language acquisition. Studies addressing early intervention will be selected in which therapy involving oral language intervention and any form of sign language or sign support is used. Comparison groups will include children in early oral language intervention programs without sign support. The primary outcomes of interest to be examined include all measures of auditory, vocabulary, language, speech production, and speech intelligibility skills. We will include randomized controlled trials, controlled clinical trials, and other quasi-experimental designs that include comparator groups as well as prospective and retrospective cohort studies. Case-control, cross-sectional, case series, and case studies will be excluded. Several electronic databases will be searched (for example, MEDLINE, EMBASE, CINAHL, PsycINFO) as well as grey literature and key websites. We anticipate that a narrative synthesis of the evidence will be required. We will carry out meta-analysis for outcomes if clinical similarity, quantity and quality permit quantitative pooling of data. We will conduct subgroup analyses if possible according to severity

  12. Comic Books: A Learning Tool for Meaningful Acquisition of Written Sign Language

    ERIC Educational Resources Information Center

    Guimarães, Cayley; Oliveira Machado, Milton César; Fernandes, Sueli F.

    2018-01-01

    Deaf people use Sign Language (SL) for intellectual development, communication and other human activities that are mediated by language--such as the expression of complex and abstract thoughts and feelings; and for literature, culture and knowledge. Brazilian Sign Language (Libras) is a complete linguistic system in the visual-spatial modality,…

  13. Child Modifiability as a Predictor of Language Abilities in Deaf Children Who Use American Sign Language.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2015-08-01

    This research explored the use of dynamic assessment (DA) for language-learning abilities in signing deaf children from deaf and hearing families. Thirty-seven deaf children, aged 6 to 11 years, were identified as either stronger (n = 26) or weaker (n = 11) language learners according to teacher or speech-language pathologist report. All children received 2 scripted, mediated learning experience sessions targeting vocabulary knowledge—specifically, the use of semantic categories that were carried out in American Sign Language. Participant responses to learning were measured in terms of an index of child modifiability. This index was determined separately at the end of the 2 individual sessions. It combined ratings reflecting each child's learning abilities and responses to mediation, including social-emotional behavior, cognitive arousal, and cognitive elaboration. Group results showed that modifiability ratings were significantly better for stronger language learners than for weaker language learners. The strongest predictors of language ability were cognitive arousal and cognitive elaboration. Mediator ratings of child modifiability (i.e., combined score of social-emotional factors and cognitive factors) are highly sensitive to language-learning abilities in deaf children who use sign language as their primary mode of communication. This method can be used to design targeted interventions.

  14. Language, Culture and Ethnicity: Interplay of Ideologies within a Japanese Community in Brazil

    ERIC Educational Resources Information Center

    Sakuma, Tomoko

    2011-01-01

    This dissertation is a sociolinguistic study of the ideologies about language, culture and ethnicity among Japanese immigrants and descendants in Brazil (hereafter, Nikkeis) who gather at a local Japanese cultural association, searching for what it means to be "Japanese" in Brazil. This study focuses on how linguistic behaviors are…

  15. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.
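
    The key idea above is an adaptive threshold model that separates in-vocabulary signs from transitional movements and other non-sign patterns. The sketch below is a deliberately simplified, frame-score-level analogue of that idea, not the paper's CRF formulation: it assumes some trained sequence model already provides per-frame scores for each vocabulary sign and accepts a candidate segment only when the best sign clearly outscores an adaptive filler score built from the whole vocabulary.

```python
# Simplified illustration of adaptive-threshold sign spotting. The paper builds the
# threshold model inside a conditional random field; here the "threshold" is just
# the average per-frame score over the vocabulary, and a segment is labeled with a
# sign only if that sign's score exceeds the threshold by a tuned margin.
import numpy as np

def spot_signs(frame_scores, segments, margin=1.0):
    """
    frame_scores: dict sign -> (T,) array of per-frame scores from a sequence model.
    segments: list of (start, end) candidate segments, e.g. from hand-motion cues.
    margin: how far the best sign must exceed the adaptive threshold (tuned on dev data).
    Returns a list of (start, end, sign) detections; other segments are treated as
    non-sign patterns such as transitional movements.
    """
    signs = list(frame_scores)
    score_matrix = np.stack([frame_scores[s] for s in signs])  # (num_signs, T)
    threshold_track = score_matrix.mean(axis=0)                # adaptive threshold per frame
    detections = []
    for start, end in segments:
        seg_scores = score_matrix[:, start:end].mean(axis=1)   # mean score of each sign
        best = int(seg_scores.argmax())
        if seg_scores[best] >= threshold_track[start:end].mean() + margin:
            detections.append((start, end, signs[best]))       # in-vocabulary sign spotted
    return detections
```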

  16. Selected Lexical Patterns in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Young, Lesa; Palmer, Jeffrey Levi; Reynolds, Wanette

    2012-01-01

    This combined paper will focus on the description of two selected lexical patterns in Saudi Arabian Sign Language (SASL): metaphor and metonymy in emotion-related signs (Young) and lexicalization patterns of objects and their derivational roots (Palmer and Reynolds). The overarching methodology used by both studies is detailed in Stephen and…

  17. Operationalization of Sign Language Phonological Similarity and its Effects on Lexical Access.

    PubMed

    Williams, Joshua T; Stone, Adam; Newman, Sharlene D

    2017-07-01

    Cognitive mechanisms underlying lexical access in sign language remain largely unknown. This study investigated whether phonological similarity facilitates lexical retrieval in sign languages, using measures from a new lexical database for American Sign Language. Additionally, it aimed to determine which similarity metric best fits the present data, in order to inform theories of how phonological similarity is constructed within the lexicon and to aid in the operationalization of phonological similarity in sign language. Sign repetition latencies and accuracy were obtained when native signers were asked to reproduce a sign displayed on a computer screen. Results indicated that, as predicted, phonological similarity facilitated repetition latencies and accuracy as long as there were no strict constraints on the type of sublexical features that overlapped. The data converged to suggest that one similarity measure, MaxD, defined as the overlap of any 4 sublexical features, likely best represents mechanisms of phonological similarity in the mental lexicon. Together, these data suggest that lexical access in sign language is facilitated by phonologically similar lexical representations in memory and that the optimal operationalization places liberal constraints on overlap (4 out of 5 sublexical features), similar to the majority of extant definitions in the literature.
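
    As an illustration of how a MaxD-style neighborhood criterion can be operationalized, the sketch below counts two signs as phonological neighbors when they overlap on at least 4 of 5 sublexical features. The feature inventory, the example signs, and their feature values are hypothetical placeholders rather than entries from the database used in the study.

      from dataclasses import dataclass

      # Illustrative sublexical feature inventory; the actual features in the
      # database referenced by the study may differ.
      FEATURES = ("handshape", "location", "movement", "selected_fingers", "flexion")

      @dataclass(frozen=True)
      class Sign:
          gloss: str
          handshape: str
          location: str
          movement: str
          selected_fingers: str
          flexion: str

      def overlap(a: Sign, b: Sign) -> int:
          """Number of sublexical features two signs share."""
          return sum(getattr(a, f) == getattr(b, f) for f in FEATURES)

      def maxd_neighbors(target: Sign, lexicon: list, k: int = 4) -> list:
          """Phonological neighbors under a MaxD-style criterion:
          any k of the 5 features overlap (a liberal constraint)."""
          return [s.gloss for s in lexicon
                  if s.gloss != target.gloss and overlap(target, s) >= k]

      # Hypothetical mini-lexicon with made-up feature values.
      lexicon = [
          Sign("MOTHER", "5", "chin", "contact", "all", "extended"),
          Sign("FATHER", "5", "forehead", "contact", "all", "extended"),
          Sign("FINE", "5", "chest", "contact", "all", "extended"),
      ]
      print(maxd_neighbors(lexicon[0], lexicon))  # signs overlapping on >= 4 features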

  18. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions.

    PubMed

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers' comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media.

  19. A Comparison of Comprehension Processes in Sign Language Interpreter Videos with or without Captions

    PubMed Central

    Debevc, Matjaž; Milošević, Danijela; Kožuh, Ines

    2015-01-01

    One important theme in captioning is whether the implementation of captions in individual sign language interpreter videos can positively affect viewers’ comprehension when compared with sign language interpreter videos without captions. In our study, an experiment was conducted using four video clips with information about everyday events. Fifty-one deaf and hard of hearing sign language users alternately watched the sign language interpreter videos with, and without, captions. Afterwards, they answered ten questions. The results showed that the presence of captions positively affected their rates of comprehension, which increased by 24% among deaf viewers and 42% among hard of hearing viewers. The most obvious differences in comprehension between watching sign language interpreter videos with and without captions were found for the subjects of hiking and culture, where comprehension was higher when captions were used. The results led to suggestions for the consistent use of captions in sign language interpreter videos in various media. PMID:26010899

  20. Comprehending Sentences with the Body: Action Compatibility in British Sign Language?

    ERIC Educational Resources Information Center

    Vinson, David; Perniss, Pamela; Fox, Neil; Vigliocco, Gabriella

    2017-01-01

    Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language (BSL) sentences, some of which explicitly encode direction of motion, versus written English, where motion…

  1. Ideologies and Attitudes toward Sign Languages: An Approximation

    ERIC Educational Resources Information Center

    Krausneker, Verena

    2015-01-01

    Attitudes are complex and little research in the field of linguistics has focused on language attitudes. This article deals with attitudes toward sign languages and those who use them--attitudes that are influenced by ideological constructions. The article reviews five categories of such constructions and discusses examples in each one.

  2. Graph theoretical analysis of functional network for comprehension of sign language.

    PubMed

    Liu, Lanfang; Yan, Xin; Liu, Jin; Xia, Mingrui; Lu, Chunming; Emmorey, Karen; Chu, Mingyuan; Ding, Guosheng

    2017-09-15

    Signed languages are natural human languages that use the visual-motor modality. Previous neuroimaging studies based on univariate activation analysis show that a widely overlapping cortical network is recruited regardless of whether the sign language is comprehended (for signers) or not (for non-signers). Here we move beyond previous studies by examining whether the functional connectivity profiles and the underlying organizational structure of this overlapping neural network differ between signers and non-signers when watching sign language. Using graph theoretical analysis (GTA) and fMRI, we compared the large-scale functional network organization in hearing signers with that in non-signers during the observation of sentences in Chinese Sign Language. We found that signed sentences elicited highly similar cortical activations in the two groups of participants, with slightly larger responses within the left frontal and left temporal gyrus in signers than in non-signers. Crucially, further GTA revealed substantial group differences in the topologies of this activation network. Globally, the network engaged by signers showed higher local efficiency (t(24) = 2.379, p = 0.026), small-worldness (t(24) = 2.604, p = 0.016), and modularity (t(24) = 3.513, p = 0.002), and exhibited different modular structures, compared to the network engaged by non-signers. Locally, the left ventral pars opercularis served as a network hub in the signer group but not in the non-signer group. These findings suggest that, despite overlap in cortical activation, the neural substrates underlying sign language comprehension are distinguishable at the network level from those underlying the processing of gestural action.
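
    The global metrics reported here (local efficiency, small-worldness, and modularity) are standard graph measures that can be computed from a thresholded functional connectivity matrix. The sketch below is a minimal illustration using networkx on a synthetic matrix; the connectivity values, the 20% density threshold, and the community-detection method are assumptions, not the study's analysis pipeline.

      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities, modularity

      # Hypothetical functional connectivity matrix (e.g., correlations between
      # regional fMRI time series); symmetric, with zero diagonal.
      rng = np.random.default_rng(0)
      n_regions = 20
      fc = np.abs(rng.normal(size=(n_regions, n_regions)))
      fc = (fc + fc.T) / 2
      np.fill_diagonal(fc, 0)

      # Binarize by keeping the strongest connections (an assumed 20% density threshold),
      # then restrict to the largest connected component so path-based measures are defined.
      threshold = np.quantile(fc[np.triu_indices(n_regions, k=1)], 0.8)
      G = nx.from_numpy_array((fc >= threshold).astype(int))
      G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

      local_eff = nx.local_efficiency(G)              # mean efficiency of node neighborhoods
      sigma = nx.sigma(G, niter=5, nrand=5)           # small-worldness vs. random references
      communities = greedy_modularity_communities(G)
      Q = modularity(G, communities)                  # modularity of the detected module structure

      print(f"local efficiency={local_eff:.3f}, small-worldness={sigma:.3f}, modularity={Q:.3f}")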

  3. Pedagogical Perspectives on Gendered Speech Styles in the Teaching and Learning of Japanese as a Foreign Language

    ERIC Educational Resources Information Center

    Bohn, Mariko Tajima

    2015-01-01

    This study examines student perspectives on gender differences in Japanese speech. Expanding on a small-scale survey by Siegal & Okamoto (2003) that investigated the views of eleven Japanese-language college teachers, this study analyzes 238 questionnaire responses from 220 Japanese-language students at four universities and a US government…

  4. Learning an Embodied Visual Language: Four Imitation Strategies Available to Sign Learners

    PubMed Central

    Shield, Aaron; Meier, Richard P.

    2018-01-01

    The parts of the body that are used to produce and perceive signed languages (the hands, face, and visual system) differ from those used to produce and perceive spoken languages (the vocal tract and auditory system). In this paper we address two factors that have important consequences for sign language acquisition. First, there are three types of lexical signs: one-handed, two-handed symmetrical, and two-handed asymmetrical. Natural variation in hand dominance in the population leads to varied input to children learning sign. Children must learn that signs are not specified for the right or left hand but for dominant and non-dominant. Second, we posit that children have at least four imitation strategies available for imitating signs: anatomical (Activate the same muscles as the sign model), which could lead learners to inappropriately use their non-dominant hand; mirroring (Produce a mirror image of the modeled sign), which could lead learners to produce lateral movement reversal errors or to use the non-dominant hand; visual matching (Reproduce what you see from your perspective), which could lead learners to produce inward–outward movement and palm orientation reversals; and reversing (Reproduce what the sign model would see from his/her perspective). This last strategy is the only one that always yields correct phonological forms in signed languages. To test our hypotheses, we turn to evidence from typical and atypical hearing and deaf children as well as from typical adults; the data come from studies of both sign acquisition and gesture imitation. Specifically, we posit that all children initially use a visual matching strategy but typical children switch to a mirroring strategy sometime in the second year of life; typical adults tend to use a mirroring strategy in learning signs and imitating gestures. By contrast, children and adults with autism spectrum disorder (ASD) appear to use the visual matching strategy well into childhood or even adulthood

  5. Visual Sonority Modulates Infants' Attraction to Sign Language

    ERIC Educational Resources Information Center

    Stone, Adam; Petitto, Laura-Ann; Bosworth, Rain

    2018-01-01

    The infant brain may be predisposed to identify perceptually salient cues that are common to both signed and spoken languages. Recent theory based on spoken languages has advanced sonority as one of these potential language acquisition cues. Using a preferential looking paradigm with an infrared eye tracker, we explored visual attention of hearing…

  6. Learning To See: American Sign Language as a Second Language. Language in Education: Theory and Practice 76.

    ERIC Educational Resources Information Center

    Wilcox, Sherman; Wilcox, Phyllis

    During the last decade, the study of American Sign Language (ASL) as a second language has become enormously popular. More and more schools and universities recognize the important role that ASL can play in foreign language education. This monograph provides a comprehensive introduction to the history and structure of ASL, to the Deaf community…

  7. A Case of Specific Language Impairment in a Deaf Signer of American Sign Language

    ERIC Educational Resources Information Center

    Quinto-Pozos, David; Singleton, Jenny L.; Hauser, Peter C.

    2017-01-01

    This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a…

  8. Endorsements on Teaching Certificates: K-12 Japanese in Washington Language Endorsements.

    ERIC Educational Resources Information Center

    Toma, Yumi

    1994-01-01

    This article from a quarterly newsletter discusses the requirements necessary to obtain an endorsement on a Washington Teaching Certificate (WTC) to teach Japanese as a Second Language in K-12 public schools in the state. Candidates are required to have 24 quarter hours (16 semester hours) in Japanese writing/composition, conversation, reading, or…

  9. Continuous Chinese sign language recognition with CNN-LSTM

    NASA Astrophysics Data System (ADS)

    Yang, Su; Zhu, Qing

    2017-07-01

    The goal of sign language recognition (SLR) is to translate sign language into text and to provide a convenient communication tool between deaf and hearing people. In this paper, we formulate a model based on a convolutional neural network (CNN) combined with a Long Short-Term Memory (LSTM) network in order to accomplish continuous recognition. With the strong representational ability of the CNN, the information in frames captured from Chinese Sign Language (CSL) videos can be learned and transformed into feature vectors. Since a video can be regarded as an ordered sequence of frames, an LSTM model is employed and connected to the fully-connected layer of the CNN. As a recurrent neural network (RNN), it is suitable for sequence learning tasks, with the capability of recognizing patterns defined by temporal distance. Compared with a traditional RNN, an LSTM performs better at storing and accessing information. We evaluate this method on a self-built dataset of 40 daily vocabulary items. The experimental results show that the CNN-LSTM recognition method can achieve a high recognition rate with small training sets, which will meet the needs of a real-time SLR system.
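
    The architecture described (a per-frame CNN feature extractor feeding an LSTM over the frame sequence, topped by a classifier) can be sketched in PyTorch as below. The layer sizes, the 64x64 input resolution, and the 40-class output are illustrative assumptions and do not reproduce the paper's exact network.

      import torch
      import torch.nn as nn

      class CNNLSTMRecognizer(nn.Module):
          """Per-frame CNN encoder followed by an LSTM over the frame sequence.

          Input: video tensor of shape (batch, time, 3, 64, 64).
          Output: logits over an assumed vocabulary of 40 sign classes.
          """
          def __init__(self, num_classes: int = 40, hidden_size: int = 128):
              super().__init__()
              self.cnn = nn.Sequential(
                  nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.AdaptiveAvgPool2d(1),          # -> (batch*time, 32, 1, 1)
              )
              self.lstm = nn.LSTM(input_size=32, hidden_size=hidden_size, batch_first=True)
              self.classifier = nn.Linear(hidden_size, num_classes)

          def forward(self, video: torch.Tensor) -> torch.Tensor:
              b, t, c, h, w = video.shape
              feats = self.cnn(video.view(b * t, c, h, w)).flatten(1).view(b, t, -1)  # per-frame vectors
              _, (hidden, _) = self.lstm(feats)     # final hidden state summarizes the sequence
              return self.classifier(hidden[-1])

      # Toy usage: one 16-frame clip of 64x64 RGB frames.
      model = CNNLSTMRecognizer()
      logits = model(torch.randn(1, 16, 3, 64, 64))
      print(logits.shape)  # torch.Size([1, 40])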

  10. Promotion in Times of Endangerment: The Sign Language Act in Finland

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2017-01-01

    The development of sign language recognition legislation is a relatively recent phenomenon in the field of language policy. So far only few authors have documented signing communities' aspirations for recognition legislation, how they work with their governments to achieve legislation which most reflects these goals, and whether and why outcomes…

  11. Generation of Signs within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.; Figueroa, Daileen M.

    2017-01-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks,…

  12. Japanese Elementary School Teachers and English Language Anxiety

    ERIC Educational Resources Information Center

    Machida, Tomohisa

    2016-01-01

    "Foreign language activities" (English) officially began in Japanese elementary schools in April 2011. Since that starting date, and despite insufficient knowledge and preparation, classroom teachers have been required to instruct in English. They also have been required to team-teach with native-English-speaking assistant language…

  13. Cardiovascular risk factors and retinal microvascular signs in an adult Japanese population: the Funagata Study.

    PubMed

    Kawasaki, Ryo; Wang, Jie Jin; Rochtchina, Elena; Taylor, Bronwen; Wong, Tien Yin; Tominaga, Makoto; Kato, Takeo; Daimon, Makoto; Oizumi, Toshihide; Kawata, Sumio; Kayama, Takamasa; Yamashita, Hidetoshi; Mitchell, Paul

    2006-08-01

    To describe the prevalence of retinal vascular signs and their association with cardiovascular risk factors in a Japanese population. Population-based cross-sectional study. Adult persons aged 35 years or older from Funagata, Yamagata Prefecture, Japan (n = 1481). The Funagata Study is a Japanese population-based study of persons aged 35 years or older, and included 1961 nondiabetic participants (53.3% of 3676 eligible subjects). A nonmydriatic retinal photograph was taken of 1 eye to assess retinal microvascular signs. Retinal arteriolar wall signs (focal arteriolar narrowing, arteriovenous nicking, enhanced arteriolar wall reflex) and retinopathy were assessed in 1481 participants without diabetes (40.3% of eligible persons) using a standardized protocol. Using a computer-assisted method, retinal vessel diameters were measured in 921 participants with gradable retinal image (25.1% of eligible persons). Prevalence of retinal microvascular signs and their association with cardiovascular risk factors. Moderate or severe focal arteriolar narrowing, arteriovenous nicking, enhanced arteriolar wall reflex, and retinopathy were found in 8.3%, 15.2%, 18.7%, and 9.0%, respectively, of the study population. Mean (± standard error) values for retinal arteriolar diameter were 178.6 ± 21.0 μm, and mean values (± standard error) for venular diameter were 214.9 ± 20.6 μm. Older persons were more likely to have retinal arteriolar wall signs, retinopathy, and narrower retinal vessel diameters. After adjusting for multiple factors, each 10-mmHg increase in mean arterial blood pressure was associated with a 20% to 40% increased likelihood of retinal arteriolar signs and a 2.8-μm reduction in arteriolar diameter. Retinopathy was associated with higher body mass index and both impaired glucose tolerance and impaired fasting glucose. In nondiabetic Japanese adults, retinal arteriolar wall signs were associated with older age and increased blood pressure, whereas retinopathy was

  14. Extricating Manual and Non-Manual Features for Subunit Level Medical Sign Modelling in Automatic Sign Language Classification and Recognition.

    PubMed

    R, Elakkiya; K, Selvamani

    2017-09-22

    Subunit segmentation and modelling in medical sign language is one of the important problems in linguistics-oriented and vision-based Sign Language Recognition (SLR). Many previous efforts derived functional subunits from the viewpoint of linguistic syllables, but such syllable-based subunit extraction is not feasible with real-world computer vision techniques. In addition, present recognition systems are designed to detect signer-dependent actions only under restricted, laboratory conditions. This paper aims at solving these two important issues: (1) subunit extraction and (2) signer-independent action in visual sign language recognition. Subunit extraction involves the sequential and parallel breakdown of sign gestures without any prior knowledge of syllables or of the number of subunits. A novel Bayesian Parallel Hidden Markov Model (BPaHMM) is introduced for subunit extraction, combining the features of manual and non-manual parameters to yield better results in the classification and recognition of signs. Signer-independent operation aims at using a single web camera for different signer behaviour patterns and for cross-signer validation. Experimental results show that the proposed signer-independent, subunit-level modelling for sign language classification and recognition improves on other existing work.
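
    The BPaHMM itself is not specified in this record, but the parallel-channel idea (separate models over manual and non-manual feature streams whose evidence is combined per sign class) can be illustrated with ordinary Gaussian HMMs. The sketch below uses hmmlearn; the feature dimensions, sign classes, and the simple sum of log-likelihoods are assumptions for illustration, not the Bayesian model proposed in the paper.

      import numpy as np
      from hmmlearn import hmm

      rng = np.random.default_rng(1)

      def train_sign_model(manual_feats, nonmanual_feats, n_states=3):
          """Train one HMM per feature channel for a single sign class.

          manual_feats / nonmanual_feats: arrays of shape (n_frames, dim) drawn
          from training clips of that sign (hypothetical hand-trajectory and
          facial-action features).
          """
          manual = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=20)
          nonmanual = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=20)
          manual.fit(manual_feats)
          nonmanual.fit(nonmanual_feats)
          return manual, nonmanual

      def classify_segment(models, manual_obs, nonmanual_obs):
          """Score a candidate segment by summing the per-channel log-likelihoods."""
          scores = {sign: m.score(manual_obs) + n.score(nonmanual_obs)
                    for sign, (m, n) in models.items()}
          return max(scores, key=scores.get), scores

      # Toy data: two sign classes, 60 training frames each, 6-D manual and 4-D non-manual features.
      models = {
          sign: train_sign_model(rng.normal(loc=mu, size=(60, 6)),
                                 rng.normal(loc=mu, size=(60, 4)))
          for sign, mu in {"DOCTOR": 0.0, "PAIN": 2.0}.items()
      }
      best, _ = classify_segment(models,
                                 rng.normal(loc=2.0, size=(20, 6)),
                                 rng.normal(loc=2.0, size=(20, 4)))
      print(best)  # expected to favour "PAIN", whose training distribution matches the test segment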

  15. "Foreign Language Activities" in Japanese Elementary Schools: Negotiating Teacher Roles and Identities within a New Language Education Policy

    ERIC Educational Resources Information Center

    Horii, Sachiko Yokoi

    2012-01-01

    In 2008, a new language education policy called "Gaikokugo Katsudou" [Foreign Language Activities] was issued by the Ministry of Education, Culture, Sport, Science, and Technology (MEXT) in Japan. Effective 2011, foreign language education became mandatory in all Japanese public elementary schools for the first time. With this dramatic…

  16. Hand and mouth: Cortical correlates of lexical processing in British Sign Language and speechreading English

    PubMed Central

    Capek, Cheryl M.; Waters, Dafydd; Woll, Bencie; MacSweeney, Mairéad; Brammer, Michael J.; McGuire, Philip K.; David, Anthony S.; Campbell, Ruth

    2012-01-01

    Spoken languages use one set of articulators – the vocal tract, whereas signed languages use multiple articulators, including both manual and facial actions. How sensitive are the cortical circuits for language processing to the particular articulators that are observed? This question can only be addressed with participants who use both speech and a signed language. In this study, we used fMRI to compare the processing of speechreading and sign processing in deaf native signers of British Sign Language (BSL) who were also proficient speechreaders. The following questions were addressed: To what extent do these different language types rely on a common brain network? To what extent do the patterns of activation differ? How are these networks affected by the articulators that languages use? Common perisylvian regions were activated both for speechreading English words and for BSL signs. Distinctive activation was also observed reflecting the language form. Speechreading elicited greater activation in the left mid-superior temporal cortex than BSL, whereas BSL processing generated greater activation at the parieto-occipito-temporal junction in both hemispheres. We probed this distinction further within BSL, where manual signs can be accompanied by different sorts of mouth action. BSL signs with speech-like mouth actions showed greater superior temporal activation, while signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions. Distinct regions within the temporal cortex are not only differentially sensitive to perception of the distinctive articulators for speech and for sign, but also show sensitivity to the different articulators within the (signed) language. PMID:18284353

  17. Numeral-Incorporating Roots in Numeral Systems: A Comparative Analysis of Two Sign Languages

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Massone, Maria Ignacia; Fernandez-Viader, Maria del Pilar; Makotrinsky, Alejandro; Pulgarin, Francisca

    2010-01-01

    Numeral-incorporating roots in the numeral systems of Argentine Sign Language (LSA) and Catalan Sign Language (LSC), as well as the main features of the number systems of both languages, are described and compared. Informants discussed the use of numerals and roots in both languages (in most cases in natural contexts). Ten informants took part in…

  18. Observations on Word Order in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Sprenger, Kristen; Mathur, Gaurav

    2012-01-01

    This article focuses on the syntactic level of the grammar of Saudi Arabian Sign Language by exploring some word orders that occur in personal narratives in the language. Word order is one of the main ways in which languages indicate the main syntactic roles of subjects, verbs, and objects; others are verbal agreement and nominal case morphology.…

  19. Medical Signbank as a Model for Sign Language Planning? A Review of Community Engagement

    ERIC Educational Resources Information Center

    Napier, Jemina; Major, George; Ferrara, Lindsay; Johnston, Trevor

    2015-01-01

    This paper reviews a sign language planning project conducted in Australia with deaf Auslan users. The Medical Signbank project utilised a cooperative language planning process to engage with the Deaf community and sign language interpreters to develop an online interactive resource of health-related signs, in order to address a gap in the health…

  20. Sign Language Use and the Appreciation of Diversity in Hearing Classrooms

    ERIC Educational Resources Information Center

    Brereton, Amy

    2008-01-01

    This article is the result of a year-long study into the effects of sign language use on participation in one mainstream preschool setting. Observations and interviews were the primary data-collection tools used during this investigation. This article focuses on how the use of sign language in the classroom affected the learning community's…

  1. American Sign Language Syntax and Analogical Reasoning Skills Are Influenced by Early Acquisition and Age of Entry to Signing Schools for the Deaf

    PubMed Central

    Henner, Jon; Caldwell-Harris, Catherine L.; Novogrodsky, Rama; Hoffmeister, Robert

    2016-01-01

    Failing to acquire language in early childhood because of language deprivation is a rare and exceptional event, except in one population. Deaf children who grow up without access to indirect language through listening, speech-reading, or sign language experience language deprivation. Studies of Deaf adults have revealed that late acquisition of sign language is associated with lasting deficits. However, much remains unknown about language deprivation in Deaf children, allowing myths and misunderstandings regarding sign language to flourish. To fill this gap, we examined signing ability in a large naturalistic sample of Deaf children attending schools for the Deaf where American Sign Language (ASL) is used by peers and teachers. Ability in ASL was measured using a syntactic judgment test and language-based analogical reasoning test, which are two sub-tests of the ASL Assessment Inventory. The influence of two age-related variables was examined: whether or not ASL was acquired from birth in the home from one or more Deaf parents, and the age of entry to the school for the Deaf. Note that for non-native signers, this latter variable is often the age of first systematic exposure to ASL. Both of these types of age-dependent language experiences influenced subsequent signing ability. Scores on the two tasks declined with increasing age of school entry. The influence of age of starting school was not linear. Test scores were generally lower for Deaf children who entered the school of assessment after the age of 12. The positive influence of signing from birth was found for students at all ages tested (7;6–18;5 years old) and for children of all age-of-entry groupings. Our results reflect a continuum of outcomes which show that experience with language is a continuous variable that is sensitive to maturational age. PMID:28082932

  2. American Sign Language Syntax and Analogical Reasoning Skills Are Influenced by Early Acquisition and Age of Entry to Signing Schools for the Deaf.

    PubMed

    Henner, Jon; Caldwell-Harris, Catherine L; Novogrodsky, Rama; Hoffmeister, Robert

    2016-01-01

    Failing to acquire language in early childhood because of language deprivation is a rare and exceptional event, except in one population. Deaf children who grow up without access to indirect language through listening, speech-reading, or sign language experience language deprivation. Studies of Deaf adults have revealed that late acquisition of sign language is associated with lasting deficits. However, much remains unknown about language deprivation in Deaf children, allowing myths and misunderstandings regarding sign language to flourish. To fill this gap, we examined signing ability in a large naturalistic sample of Deaf children attending schools for the Deaf where American Sign Language (ASL) is used by peers and teachers. Ability in ASL was measured using a syntactic judgment test and language-based analogical reasoning test, which are two sub-tests of the ASL Assessment Inventory. The influence of two age-related variables was examined: whether or not ASL was acquired from birth in the home from one or more Deaf parents, and the age of entry to the school for the Deaf. Note that for non-native signers, this latter variable is often the age of first systematic exposure to ASL. Both of these types of age-dependent language experiences influenced subsequent signing ability. Scores on the two tasks declined with increasing age of school entry. The influence of age of starting school was not linear. Test scores were generally lower for Deaf children who entered the school of assessment after the age of 12. The positive influence of signing from birth was found for students at all ages tested (7;6-18;5 years old) and for children of all age-of-entry groupings. Our results reflect a continuum of outcomes which show that experience with language is a continuous variable that is sensitive to maturational age.

  3. On Selected Morphemes in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Morris, Carla; Schneider, Erin

    2012-01-01

    Following a year of study of Saudi Arabian Sign Language (SASL), we are documenting our findings to provide a grammatical sketch of the language. This paper represents one part of that endeavor and focuses on a description of selected morphemes, both manual and non-manual, that have appeared in the course of data collection. While some of the…

  4. Neural Basis of Action Understanding: Evidence from Sign Language Aphasia.

    PubMed

    Rogalsky, Corianne; Raphel, Kristin; Tomkovicz, Vivian; O'Grady, Lucinda; Damasio, Hanna; Bellugi, Ursula; Hickok, Gregory

    2013-01-01

    The neural basis of action understanding is a hotly debated issue. The mirror neuron account holds that motor simulation in fronto-parietal circuits is critical to action understanding, including speech comprehension, while others emphasize the ventral stream in the temporal lobe. Evidence from speech strongly supports the ventral stream account, but on the other hand, evidence from manual gesture comprehension (e.g., in limb apraxia) has led to contradictory findings. Here we present a lesion analysis of sign language comprehension. Sign language is an excellent model for studying mirror system function in that it bridges the gap between the visual-manual system in which mirror neurons are best characterized and language systems, which have represented a theoretical target of mirror neuron research. Twenty-one lifelong deaf signers with focal cortical lesions performed two tasks: one involving the comprehension of individual signs and the other involving comprehension of signed sentences (commands). Participants' lesions, as indicated on MRI or CT scans, were mapped onto a template brain to explore the relationship between lesion location and sign comprehension measures. Single sign comprehension was not significantly affected by left hemisphere damage. Sentence sign comprehension impairments were associated with left temporal-parietal damage. We found that damage to mirror-system-related regions in the left frontal lobe was not associated with deficits on either of these comprehension tasks. We conclude that the mirror system is not critically involved in action understanding.

  5. Why Doesn't Everyone Here Speak Sign Language? Questions of Language Policy, Ideology and Economics

    ERIC Educational Resources Information Center

    Rayman, Jennifer

    2009-01-01

    This paper is a thought experiment exploring the possibility of establishing universal bilingualism in Sign Languages. Focusing in the first part on historical examples of inclusive signing societies such as Martha's Vineyard, the author suggests that it is not possible to create such naturally occurring practices of Sign Bilingualism in societies…

  6. Psychological Testing of Sign Language Interpreters

    ERIC Educational Resources Information Center

    Seal, Brenda C.

    2004-01-01

    Twenty-eight sign language interpreters participated in a battery of tests to determine if a profile of cognitive, motor, attention, and personality attributes might distinguish them as a group and at different credential levels. Eight interpreters held Level II and nine held Level III Virginia Quality Assurance Screenings (VQAS); the other 11…

  7. Syntactic Complexity Measures and Their Relation to Oral Proficiency in Japanese as a Foreign Language

    ERIC Educational Resources Information Center

    Iwashita, Noriko

    2006-01-01

    The study reported in this article is a part of a large-scale study investigating syntactic complexity in second language (L2) oral data in commonly taught foreign languages (English, German, Japanese, and Spanish; Ortega, Iwashita, Rabie, & Norris, in preparation). In this article, preliminary findings of the analysis of the Japanese data are…

  8. A Lexical Comparison of Signs from Icelandic and Danish Sign Languages

    ERIC Educational Resources Information Center

    Aldersson, Russell R.; McEntee-Atalianis, Lisa J.

    2008-01-01

    This article reports on a comparison of lexical items in the vocabulary of Icelandic and Danish sign languages prompted by anecdotal reports of similarity and historical records detailing close contact between the two communities. Drawing on previous studies, including Bickford (2005), McKee and Kennedy (1998, 2000a, 2000b) and Parkhurst and…

  9. Effects of Iconicity and Semantic Relatedness on Lexical Access in American Sign Language

    PubMed Central

    Bosworth, Rain G.; Emmorey, Karen

    2010-01-01

    Iconicity is a property that pervades the lexicon of many sign languages, including American Sign Language (ASL). Iconic signs exhibit a motivated, non-arbitrary mapping between the form of the sign and its meaning. We investigated whether iconicity enhances semantic priming effects for ASL and whether iconic signs are recognized more quickly than non-iconic signs (controlling for strength of iconicity, semantic relatedness, familiarity, and imageability). Twenty deaf signers made lexical decisions to the second item of a prime-target pair. Iconic target signs were preceded by prime signs that were a) iconic and semantically related, b) non-iconic and semantically related, or c) semantically unrelated. In addition, a set of non-iconic target signs was preceded by semantically unrelated primes. Significant facilitation was observed for target signs when preceded by semantically related primes. However, iconicity did not increase the priming effect (e.g., the target sign PIANO was primed equally by the iconic sign GUITAR and the non-iconic sign MUSIC). In addition, iconic signs were not recognized faster or more accurately than non-iconic signs. These results confirm the existence of semantic priming for sign language and suggest that iconicity does not play a robust role in on-line lexical processing. PMID:20919784

  10. Phonological Development in Hearing Learners of a Sign Language: The Influence of Phonological Parameters, Sign Complexity, and Iconicity

    ERIC Educational Resources Information Center

    Ortega, Gerardo; Morgan, Gary

    2015-01-01

    The present study implemented a sign-repetition task at two points in time to hearing adult learners of British Sign Language and explored how each phonological parameter, sign complexity, and iconicity affected sign production over an 11-week (22-hour) instructional period. The results show that training improves articulation accuracy and that…

  11. Moving beyond Communicative Language Teaching: A Situated Pedagogy for Japanese EFL Classrooms

    ERIC Educational Resources Information Center

    Lochland, Paul W.

    2013-01-01

    This article questions the appropriateness of communicative language teaching (CLT) in classrooms teaching English as a foreign language (EFL) to Japanese students. The four main criticisms of CLT are the ambiguity of its description, the benefits of CLT for language learning, the amalgamation of CLT methods with local classroom practices, and the…

  12. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation

    PubMed Central

    Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients. PMID:28947881

  13. Post-glossectomy in lingual carcinomas: a scope for sign language in rehabilitation.

    PubMed

    Rajendra Santosh, Arvind Babu; Cumberbatch, Keren; Jones, Thaon

    2017-01-01

    The treatment option for cancers of the tongue is glossectomy, which may be partial, sub-total, or total, depending on the size of the tumour. Glossectomies result in speech deficits for these patients, and rehabilitative therapy involving communication modalities is highly recommended. Sign language is a possible therapeutic solution for post-glossectomy oral cancer patients. Patients with tongue cancers who have undergone total glossectomy as a surgical treatment can utilise sign language to replace their loss of speech production and maintain their engagement in life. This manuscript emphasises the importance of sign language in rehabilitation strategies in post-glossectomy patients.

  14. Language Use in the Context of Double Minority: The Case of Japanese-Catalan/Spanish Families in Catalonia

    ERIC Educational Resources Information Center

    Fukuda, Makiko

    2017-01-01

    This study explores language use in Japanese-Catalan/Spanish families in Catalonia, with special attention to Japanese. In a community such as Catalonia, wherein two languages of different status are in conflict within its own territory, the ability of families to maintain a socially "weaker" language and transmit yet another language…

  15. Japanese Language Proficiency, Social Networking, and Language Use during Study Abroad: Learners' Perspectives

    ERIC Educational Resources Information Center

    Dewey, Dan P.; Bown, Jennifer; Eggett, Dennis

    2012-01-01

    This study examines the self-perceived speaking proficiency development of 204 learners of Japanese who studied abroad in Japan and analyzes connections between self-reported social network development, language use, and speaking development. Learners perceived that they gained the most in areas associated with the intermediate and advanced levels…

  16. Bimodal bilingualism as multisensory training?: Evidence for improved audiovisual speech perception after sign language exposure.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-15

    The aim of the present study was to characterize the effects of learning a sign language on the processing of a spoken language. Specifically, audiovisual phoneme comprehension was assessed before and after 13 weeks of sign language exposure. L2 ASL learners performed this task in the fMRI scanner. Results indicated that L2 American Sign Language (ASL) learners' behavioral classification of the speech sounds improved with time compared to hearing nonsigners. Results also indicated increased activation in the supramarginal gyrus (SMG) after sign language exposure, which suggests concomitant increased phonological processing of speech. A multiple regression analysis indicated that learners' ratings of co-sign speech use and lipreading ability were correlated with SMG activation. This pattern of results indicates that the increased use of mouthing, and possibly lipreading, during sign language acquisition may concurrently improve audiovisual speech processing in budding hearing bimodal bilinguals.

  17. Social construction of American sign language--English interpreters.

    PubMed

    McDermid, Campbell

    2009-01-01

    Instructors in 5 American Sign Language--English Interpreter Programs and 4 Deaf Studies Programs in Canada were interviewed and asked to discuss their experiences as educators. Within a qualitative research paradigm, their comments were grouped into a number of categories tied to the social construction of American Sign Language--English interpreters, such as learners' age and education and the characteristics of good citizens within the Deaf community. According to the participants, younger students were adept at language acquisition, whereas older learners more readily understood the purpose of lessons. Children of deaf adults were seen as more culturally aware. The participants' beliefs echoed the theories of P. Freire (1970/1970) that educators consider the reality of each student and their praxis and were responsible for facilitating student self-awareness. Important characteristics in the social construction of students included independence, an appropriate attitude, an understanding of Deaf culture, ethical behavior, community involvement, and a willingness to pursue lifelong learning.

  18. Cross-language perception of Japanese vowel length contrasts: comparison of listeners from different first language backgrounds.

    PubMed

    Tsukada, Kimiko; Hirata, Yukari; Roengpitya, Rungpat

    2014-06-01

    The purpose of this research was to compare the perception of Japanese vowel length contrasts by 4 groups of listeners who differed in their familiarity with length contrasts in their first language (L1; i.e., American English, Italian, Japanese, and Thai). Of the 3 nonnative groups, native Thai listeners were expected to outperform American English and Italian listeners, because vowel length is contrastive in their L1. Native Italian listeners were expected to demonstrate a higher level of accuracy for length contrasts than American English listeners, because the former are familiar with consonant (but not vowel) length contrasts (i.e., singleton vs. geminate) in their L1. A 2-alternative forced-choice AXB discrimination test that included 125 trials was administered to all the participants, and the listeners' discrimination accuracy (d') was reported. As expected, Japanese listeners were more accurate than all 3 nonnative groups in their discrimination of Japanese vowel length contrasts. The 3 nonnative groups did not differ from one another in their discrimination accuracy despite varying experience with length contrasts in their L1. Only Thai listeners were more accurate in their length discrimination when the target vowel was long than when it was short. Being familiar with vowel length contrasts in L1 may affect the listeners' cross-language perception, but it does not guarantee that their L1 experience automatically results in efficient processing of length contrasts in unfamiliar languages. The extent of success may be related to how length contrasts are phonetically implemented in listeners' L1.
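
    The discrimination accuracy reported above is the signal-detection measure d'. As a reminder of how such a score is computed, the sketch below applies the standard equal-variance formula d' = z(hit rate) - z(false-alarm rate) to toy counts; the correction for extreme rates and the counts themselves are illustrative assumptions, and the study's AXB-specific analysis may differ.

      from scipy.stats import norm

      def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
          """Equal-variance d' = z(hit rate) - z(false-alarm rate).

          A log-linear correction keeps rates away from 0 and 1 so the z-transform stays finite.
          """
          def rate(numer, denom):
              return (numer + 0.5) / (denom + 1.0)

          hit_rate = rate(hits, hits + misses)
          fa_rate = rate(false_alarms, false_alarms + correct_rejections)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      # Toy example: responses to "different" vowel-length trials.
      print(round(d_prime(hits=52, misses=8, false_alarms=12, correct_rejections=48), 2))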

  19. Word Order in Russian Sign Language

    ERIC Educational Resources Information Center

    Kimmelman, Vadim

    2012-01-01

    In this paper the results of an investigation of word order in Russian Sign Language (RSL) are presented. A small corpus of narratives based on comic strips by nine native signers was analyzed and a picture-description experiment (based on Volterra et al. 1984) was conducted with six native signers. The results are the following: the most frequent…

  20. The Sign Language Situation in Mali

    ERIC Educational Resources Information Center

    Nyst, Victoria

    2015-01-01

    This article gives a first overview of the sign language situation in Mali and its capital, Bamako, located in the West African Sahel. Mali is a highly multilingual country with a significant incidence of deafness, for which meningitis appears to be the main cause, coupled with limited access to adequate health care. In comparison to neighboring…

  1. Processing of Formational, Semantic, and Iconic Information in American Sign Language.

    ERIC Educational Resources Information Center

    Poizner, Howard; And Others

    1981-01-01

    Three experiments examined short-term encoding processes of deaf signers for different aspects of signs from American Sign Language. Results indicated that deaf signers code signs at one level in terms of linguistically significant formational parameters. The semantic and iconic information of signs, however, has little effect on short-term…

  2. Sign Language and Spoken Language for Children With Hearing Loss: A Systematic Review.

    PubMed

    Fitzpatrick, Elizabeth M; Hamel, Candyce; Stevens, Adrienne; Pratt, Misty; Moher, David; Doucet, Suzanne P; Neuss, Deirdre; Bernstein, Anita; Na, Eunjung

    2016-01-01

    Permanent hearing loss affects 1 to 3 per 1000 children and interferes with typical communication development. Early detection through newborn hearing screening and hearing technology provide most children with the option of spoken language acquisition. However, no consensus exists on optimal interventions for spoken language development. The objective was to conduct a systematic review of the effectiveness of early sign and oral language intervention compared with oral language intervention alone for children with permanent hearing loss. An a priori protocol was developed. Electronic databases (eg, Medline, Embase, CINAHL) from 1995 to June 2013 and gray literature sources were searched. Studies in English and French were included. Two reviewers screened potentially relevant articles. Outcomes of interest were measures of auditory, vocabulary, language, and speech production skills. All data collection and risk of bias assessments were completed and then verified by a second person. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach was used to judge the strength of evidence. Eleven cohort studies met inclusion criteria, of which 8 included only children with severe to profound hearing loss with cochlear implants. Language development was the most frequently reported outcome. Other reported outcomes included speech and speech perception. Several measures and metrics were reported across studies, and descriptions of interventions were sometimes unclear. Very limited, and hence insufficient, high-quality evidence exists to determine whether sign language in combination with oral language is more effective than oral language therapy alone. More research is needed to supplement the evidence base.

  3. Language specificity in the perception of voiceless sibilant fricatives in Japanese and English: Implications for cross-language differences in speech-sound development

    PubMed Central

    Li, Fangfang; Munson, Benjamin; Edwards, Jan; Yoneyama, Kiyoko; Hall, Kathleen

    2011-01-01

    Both English and Japanese have two voiceless sibilant fricatives, an anterior fricative /s/ contrasting with a more posterior fricative /ʃ/. When children acquire sibilant fricatives, English-speaking children typically substitute [s] for /ʃ/, whereas Japanese-speaking children typically substitute [ʃ] for /s/. This study examined English- and Japanese-speaking adults' perception of children's productions of voiceless sibilant fricatives to investigate whether the apparent asymmetry in the acquisition of voiceless sibilant fricatives reported previously in the two languages was due in part to how adults perceive children's speech. The results of this study show that adult speakers of English and Japanese weighed acoustic parameters differently when identifying fricatives produced by children and that these differences explain, in part, the apparent cross-language asymmetry in fricative acquisition. This study shows that generalizations about universal and language-specific patterns in speech-sound development cannot be determined without considering all sources of variation, including speech perception. PMID:21361456

  4. Software Junctus: Joining Sign Language and Alphabetical Writing

    NASA Astrophysics Data System (ADS)

    Valentini, Carla Beatris; Bisol, Cláudia A.; Dalla Santa, Cristiane

    The authors’ aim is to describe the workshops developed to test the use of an authorship program that allows the simultaneous use of sign language and alphabetical writing. The workshops were prepared and conducted by a Computer Science undergraduate, with the support of the Program of Students’ Integration and Mediation (Programa de Integração e Mediação do Acadêmico - PIMA) at the University of Caxias do Sul. Two sign language interpreters, two deaf students and one hearing student, who also teach at a special school for the deaf, participated in the workshops. The main characteristics of the software and the development of the workshops are presented with examples of educational projects created during their development. Possible improvements are also outlined.

  5. Generation of Signs Within Semantic and Phonological Categories: Data from Deaf Adults and Children Who Use American Sign Language.

    PubMed

    Beal-Alvarez, Jennifer S; Figueroa, Daileen M

    2017-04-01

    Two key areas of language development include semantic and phonological knowledge. Semantic knowledge relates to word and concept knowledge. Phonological knowledge relates to how language parameters combine to create meaning. We investigated signing deaf adults' and children's semantic and phonological sign generation via one-minute tasks, including animals, foods, and specific handshapes. We investigated the effects of chronological age, age of sign language acquisition/years at school site, gender, presence of a disability, and geographical location (i.e., USA and Puerto Rico) on participants' performance and the relations among tasks. In general, the phonological task appeared more difficult than the semantic tasks; students generated more animals than foods; age and semantic performance were correlated for the larger sample of U.S. students; and geographical variation included the use of fingerspelling and specific signs. Compared to their peers, deaf students with disabilities generated fewer semantic items. These results provide an initial snapshot of students' semantic and phonological sign generation.

  6. Neural systems supporting linguistic structure, linguistic experience, and symbolic communication in sign language and gesture.

    PubMed

    Newman, Aaron J; Supalla, Ted; Fernandez, Nina; Newport, Elissa L; Bavelier, Daphne

    2015-09-15

    Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: In particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system, gesture, further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages, supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network, demonstrating an influence of experience on the perception of nonlinguistic stimuli.

  7. Information Status and Word Order in Croatian Sign Language

    ERIC Educational Resources Information Center

    Milkovic, Marina; Bradaric-Joncic, Sandra; Wilbur, Ronnie B.

    2007-01-01

    This paper presents the results of research on information structure and word order in narrative sentences taken from signed short stories in Croatian Sign Language (HZJ). The basic word order in HZJ is SVO. Factors that result in other word orders include: reversible arguments, verb categories, locative constructions, contrastive focus, and prior…

  8. Writing Signed Languages: What for? What Form? A Response

    ERIC Educational Resources Information Center

    Moores, Donald F.

    2017-01-01

    In his article in an "American Annals of the Deaf" special issue that also includes the present article, Grushkin (EJ1174123) divides his discussion of a written sign system into three basic parts. The first presents arguments against the development of a written form of American Sign Language; the second provides a rationale…

  9. Three-dimensional grammar in the brain: Dissociating the neural correlates of natural sign language and manually coded spoken language.

    PubMed

    Jednoróg, Katarzyna; Bola, Łukasz; Mostowski, Piotr; Szwed, Marcin; Boguszewski, Paweł M; Marchewka, Artur; Rutkowski, Paweł

    2015-05-01

    In several countries, natural sign languages were considered inadequate for education. Instead, new sign-supported systems were created, based on the belief that spoken/written language is grammatically superior. One such system, called SJM (system językowo-migowy), preserves the grammatical and lexical structure of spoken Polish and since the 1960s has been extensively employed in schools and on TV. Nevertheless, the Deaf community avoids using SJM for everyday communication, its preferred language being PJM (polski język migowy), a natural sign language, structurally and grammatically independent of spoken Polish and featuring classifier constructions (CCs). Here, for the first time, we use fMRI to compare the neural bases of natural vs. devised communication systems. Deaf signers were presented with three types of signed sentences (SJM and PJM with/without CCs). Consistent with previous findings, PJM with CCs compared to either SJM or PJM without CCs recruited the parietal lobes. The reverse comparison revealed activation in the anterior temporal lobes, suggesting increased semantic combinatory processes in lexical sign comprehension. Finally, PJM compared with SJM engaged the left posterior superior temporal gyrus and anterior temporal lobe, areas crucial for sentence-level speech comprehension. We suggest that activity in these two areas reflects greater processing efficiency for naturally evolved sign language.

  10. Where "Sign Language Studies" Has Led Us in Forty Years: Opening High School and University Education for Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation

    ERIC Educational Resources Information Center

    Woodward, James; Hoa, Nguyen Thi

    2012-01-01

    This paper discusses how the Nippon Foundation-funded project "Opening University Education to Deaf People in Viet Nam through Sign Language Analysis, Teaching, and Interpretation," also known as the Dong Nai Deaf Education Project, has been implemented through sign language studies from 2000 through 2012. This project has provided deaf…

  11. Identifying Movement Patterns and Severity of Associated Pain in Sign Language Interpreters

    ERIC Educational Resources Information Center

    Freeman, Julie K.; Rogers, Janet L.

    2010-01-01

    Our research sought to identify the most common movement patterns and postures performed by sign language interpreters and the frequency and severity of any pain that may be associated with the movements. A survey was developed and mailed to registered sign language interpreters throughout the state of Illinois. For each specific upper extremity…

  12. The Language Continuum: Narrative Discourse Skills in English-Japanese Bilingual Children.

    ERIC Educational Resources Information Center

    Minami, Masahiko

    In this research, using the "Frog, Where Are You?" picture book, 40 bilingual children age 6-12 years were asked to narrate the story in two languages, English and Japanese. Both quantitative and qualitative analyses were performed in order to study the relationship between the use of the two languages. The results generally suggest that…

  13. The Nature of Object Marking in American Sign Language

    ERIC Educational Resources Information Center

    Gokgoz, Kadir

    2013-01-01

    In this dissertation, I examine the nature of object marking in American Sign Language (ASL). I investigate object marking by means of directionality (the movement of the verb towards a certain location in signing space) and by means of handling classifiers (certain handshapes accompanying the verb). I propose that object marking in ASL is…

  14. Development of Japanese Children's Narrative Skills: Linguistic Devices and Strategies To Encode Their Perspective and Emotion.

    ERIC Educational Resources Information Center

    Minami, Masahiko

    Studies on child language acquisition suggest that Japanese children begin to use a variety of linguistic signs very early. However, even if young Japanese children learned the social pragmatic functions and interactional dimensions of such linguistic means and communicative devices, they might not have acquired the subtleties of those devices…

  15. Signing Earth Science: Accommodations for Students Who Are Deaf or Hard of Hearing and Whose First Language Is Sign

    NASA Astrophysics Data System (ADS)

    Vesel, J.; Hurdich, J.

    2014-12-01

    TERC and Vcom3D used the SigningAvatar® accessibility software to research and develop a Signing Earth Science Dictionary (SESD) of approximately 750 standards-based Earth science terms for high school students who are deaf and hard of hearing and whose first language is sign. The partners also evaluated the extent to which use of the SESD furthers understanding of Earth science content, command of the language of Earth science, and the ability to study Earth science independently. Disseminated as a Web-based version and App, the SESD is intended to serve the ~36,000 grade 9-12 students who are deaf or hard of hearing and whose first language is sign, the majority of whom leave high school reading at the fifth-grade level or below. It is also intended for teachers and interpreters who interact with members of this population and professionals working with Earth science education programs during field trips, internships, etc. The signed SESD terms have been incorporated into a Mobile Communication App (MCA). This Android app is intended to facilitate communication between English speakers and persons who communicate in American Sign Language (ASL) or Signed English. It can translate words, phrases, or whole sentences from written or spoken English to animated signing. It can also fingerspell proper names and other words for which there are no signs. For our presentation, we will demonstrate the interactive features of the SigningAvatar® accessibility software that support the three principles of Universal Design for Learning (UDL) and have been incorporated into the SESD and MCA. Results from national field tests will provide insight into the SESD's and MCA's potential applicability beyond grade 12 as accommodations that can be used for accessing the vocabulary deaf and hard of hearing students need for study of the geosciences and for facilitating communication about content. This work was funded in part by grants from NSF and the U.S. Department of Education.

  16. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices

    PubMed Central

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2015-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions. PMID:25713541

  17. Referential shift in Nicaraguan Sign Language: a transition from lexical to spatial devices.

    PubMed

    Kocab, Annemarie; Pyers, Jennie; Senghas, Ann

    2014-01-01

    Even the simplest narratives combine multiple strands of information, integrating different characters and their actions by expressing multiple perspectives of events. We examined the emergence of referential shift devices, which indicate changes among these perspectives, in Nicaraguan Sign Language (NSL). Sign languages, like spoken languages, mark referential shift grammatically with a shift in deictic perspective. In addition, sign languages can mark the shift with a point or a movement of the body to a specified spatial location in the three-dimensional space in front of the signer, capitalizing on the spatial affordances of the manual modality. We asked whether the use of space to mark referential shift emerges early in a new sign language by comparing the first two age cohorts of deaf signers of NSL. Eight first-cohort signers and 10 second-cohort signers watched video vignettes and described them in NSL. Narratives were coded for lexical (use of words) and spatial (use of signing space) devices. Although the cohorts did not differ significantly in the number of perspectives represented, second-cohort signers used referential shift devices to explicitly mark a shift in perspective in more of their narratives. Furthermore, while there was no significant difference between cohorts in the use of non-spatial, lexical devices, there was a difference in spatial devices, with second-cohort signers using them in significantly more of their narratives. This suggests that spatial devices have only recently increased as systematic markers of referential shift. Spatial referential shift devices may have emerged more slowly because they depend on the establishment of fundamental spatial conventions in the language. While the modality of sign languages can ultimately engender the syntactic use of three-dimensional space, we propose that a language must first develop systematic spatial distinctions before harnessing space for grammatical functions.

  18. Automatic Mexican sign language and digits recognition using normalized central moments

    NASA Astrophysics Data System (ADS)

    Solís, Francisco; Martínez, David; Espinosa, Oscar; Toxqui, Carina

    2016-09-01

    This work presents a framework for automatic Mexican sign language and digits recognition based on a computer vision system using normalized central moments and artificial neural networks. Images are captured with a digital IP camera; four LED reflectors and a green background are used to reduce computational costs and avoid the need for special gloves. 42 normalized central moments are computed per frame and fed to a multi-layer perceptron to recognize each database. Four versions per sign and digit were used in the training phase. Recognition rates of 93% and 95% were achieved for Mexican sign language and digits, respectively.
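
    As an illustration of the general technique (and not the authors' implementation), the sketch below computes normalized central moments of a segmented hand silhouette with NumPy; the choice of moment orders, image size, and toy silhouette are assumptions, and the resulting feature vector could then be passed to a multilayer perceptron classifier.

      import numpy as np

      def normalized_central_moments(image, max_order=6):
          """Normalized central moments eta_pq for 2 <= p+q <= max_order."""
          ys, xs = np.nonzero(image)
          weights = image[ys, xs].astype(float)
          m00 = weights.sum()                    # zeroth-order moment (silhouette area)
          x_bar = (xs * weights).sum() / m00     # centroid coordinates
          y_bar = (ys * weights).sum() / m00
          feats = []
          for p in range(max_order + 1):
              for q in range(max_order + 1):
                  order = p + q
                  if order < 2 or order > max_order:
                      continue
                  mu_pq = ((xs - x_bar) ** p * (ys - y_bar) ** q * weights).sum()
                  feats.append(mu_pq / m00 ** (1 + order / 2.0))  # scale-invariant normalization
          return np.array(feats)

      # Toy binary frame standing in for a segmented hand region.
      frame = np.zeros((64, 64))
      frame[20:40, 25:45] = 1.0
      print(normalized_central_moments(frame).shape)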

  19. Students' Choice of Language and Initial Motivation for Studying Japanese at the University of Jyväskylä Language Centre

    ERIC Educational Resources Information Center

    Takala, Pauliina

    2015-01-01

    Elective language courses, particularly those starting from the beginner level, constitute their own special group within the communication and language course offerings of universities. The elementary courses of less commonly taught languages (LCTL), such as Japanese, provide students with the opportunity to acquire, among other benefits, a…

  20. The effect of sign language structure on complex word reading in Chinese deaf adolescents.

    PubMed

    Lu, Aitao; Yu, Yanping; Niu, Jiaxin; Zhang, John X

    2015-01-01

    The present study was carried out to investigate whether sign language structure plays a role in the processing of complex words (i.e., derivational and compound words), in particular, in the delayed reading of complex words by deaf adolescents. Chinese deaf adolescents were found to respond faster to derivational words than to compound words for one-sign-structure words, but showed comparable performance for two-sign-structure words. For both derivational and compound words, response latencies to one-sign-structure words were shorter than to two-sign-structure words. These results provide strong evidence that the structure of sign language affects written word processing in Chinese. Additionally, differences between derivational and compound words in the one-sign-structure condition indicate that Chinese deaf adolescents acquire print morphological awareness. The results also showed that delayed word reading was found in derivational words with two signs (DW-2), compound words with one sign (CW-1), and compound words with two signs (CW-2), but not in derivational words with one sign (DW-1), with the delay being maximum in DW-2, medium in CW-2, and minimum in CW-1, suggesting that the structure of sign language has an impact on the delayed processing of Chinese written words in deaf adolescents. These results provide insight into the mechanisms by which sign language structure affects written word processing and why such processing is delayed in deaf adolescents relative to hearing peers of the same age.

  1. Teachers' perceptions of promoting sign language phonological awareness in an ASL/English bilingual program.

    PubMed

    Crume, Peter K

    2013-10-01

    The National Reading Panel emphasizes that spoken language phonological awareness (PA) developed at home and school can lead to improvements in reading performance in young children. However, research indicates that many deaf children are good readers even though they have limited spoken language PA. Is it possible that some deaf students benefit from teachers who promote sign language PA instead? The purpose of this qualitative study is to examine teachers' beliefs and instructional practices related to sign language PA. A thematic analysis is conducted on 10 participant interviews at an ASL/English bilingual school for the deaf to understand their views and instructional practices. The findings reveal that the participants had strong beliefs in developing students' structural knowledge of signs and used a variety of instructional strategies to build students' knowledge of sign structures in order to promote their language and literacy skills.

  2. Symbiotic symbolization by hand and mouth in sign language*

    PubMed Central

    Sandler, Wendy

    2010-01-01

    Current conceptions of human language include a gestural component in the communicative event. However, determining how the linguistic and gestural signals are distinguished, how each is structured, and how they interact still poses a challenge for the construction of a comprehensive model of language. This study attempts to advance our understanding of these issues with evidence from sign language. The study adopts McNeill’s criteria for distinguishing gestures from the linguistically organized signal, and provides a brief description of the linguistic organization of sign languages. Focusing on the subcategory of iconic gestures, the paper shows that signers create iconic gestures with the mouth, an articulator that acts symbiotically with the hands to complement the linguistic description of objects and events. A new distinction between the mimetic replica and the iconic symbol accounts for the nature and distribution of iconic mouth gestures and distinguishes them from mimetic uses of the mouth. Symbiotic symbolization by hand and mouth is a salient feature of human language, regardless of whether the primary linguistic modality is oral or manual. Speakers gesture with their hands, and signers gesture with their mouths. PMID:20445832

  3. Modality-specific processing precedes amodal linguistic processing during L2 sign language acquisition: A longitudinal study.

    PubMed

    Williams, Joshua T; Darcy, Isabelle; Newman, Sharlene D

    2016-02-01

    The present study tracked activation pattern differences in response to sign language processing by late hearing second language learners of American Sign Language. Learners were scanned before the start of their language courses. They were scanned again after their first semester of instruction and their second, for a total of 10 months of instruction. The study aimed to characterize modality-specific to modality-general processing throughout the acquisition of sign language. Results indicated that before the acquisition of sign language, neural substrates related to modality-specific processing were present. After approximately 45 h of instruction, the learners transitioned into processing signs on a phonological basis (e.g., supramarginal gyrus, putamen). After one more semester of input, learners transitioned once more to a lexico-semantic processing stage (e.g., left inferior frontal gyrus) at which language control mechanisms (e.g., left caudate, cingulate gyrus) were activated. During these transitional steps right hemispheric recruitment was observed, with increasing left-lateralization, which is similar to other native signers and L2 learners of spoken language; however, specialization for sign language processing with activation in the inferior parietal lobule (i.e., angular gyrus), even for late learners, was observed. As such, the present study is the first to track L2 acquisition of sign language learners in order to characterize modality-independent and modality-specific mechanisms for bilingual language processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Text generation from Taiwanese Sign Language using a PST-based language model for augmentative communication.

    PubMed

    Wu, Chung-Hsien; Chiu, Yu-Hsien; Guo, Chi-Shiang

    2004-12-01

    This paper proposes a novel approach to the generation of Chinese sentences from ill-formed Taiwanese Sign Language (TSL) for people with hearing impairments. First, a sign icon-based virtual keyboard is constructed to provide a visualized interface for retrieving sign icons from a sign database. The proposed language model (LM), based on a predictive sentence template (PST) tree, integrates a statistical variable n-gram LM and linguistic constraints to deal with the translation problem from ill-formed sign sequences to grammatical written sentences. The PST tree, trained on a corpus collected from deaf schools, was used to model the correspondence between signed and written Chinese. In addition, a set of phrase formation rules, based on trigger-pair categories, was derived for sentence pattern expansion. These approaches improved the efficiency of text generation and the accuracy of word prediction and, therefore, improved the input rate. To assess the system as a practical communication aid, a reading-comprehension training program with ten profoundly deaf students was undertaken in a deaf school in Tainan, Taiwan. Evaluation results show that literacy aptitude test scores and subjective satisfaction levels improved significantly.
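
    As a rough illustration of the statistical n-gram component described above (the PST tree and linguistic constraints are not reproduced), the sketch below trains an add-alpha-smoothed bigram model on a toy corpus and uses it to rank candidate orderings of an ill-formed gloss sequence; the corpus, glosses, and smoothing scheme are assumptions invented for the example.

      import math
      from collections import Counter
      from itertools import permutations

      corpus = [
          ["i", "want", "to", "drink", "water"],
          ["i", "want", "to", "eat", "rice"],
          ["you", "want", "to", "drink", "tea"],
      ]

      unigrams, bigrams = Counter(), Counter()
      for sent in corpus:
          unigrams.update(sent)
          bigrams.update(zip(sent, sent[1:]))

      def bigram_logprob(sentence, alpha=1.0):
          """Add-alpha smoothed bigram log-probability of a word sequence."""
          vocab = len(unigrams)
          return sum(
              math.log((bigrams[(prev, curr)] + alpha) / (unigrams[prev] + alpha * vocab))
              for prev, curr in zip(sentence, sentence[1:])
          )

      # An "ill-formed" sign-gloss sequence; choose the ordering the model prefers.
      glosses = ["water", "drink", "i", "want", "to"]
      print(max(permutations(glosses), key=bigram_logprob))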

  5. An Intelligent Computer-Based System for Sign Language Tutoring

    ERIC Educational Resources Information Center

    Ritchings, Tim; Khadragi, Ahmed; Saeb, Magdy

    2012-01-01

    A computer-based system for sign language tutoring has been developed using a low-cost data glove and a software application that processes the movement signals for signs in real-time and uses Pattern Matching techniques to decide if a trainee has closely replicated a teacher's recorded movements. The data glove provides 17 movement signals from…

  6. Argumentation Text Construction by Japanese as a Foreign Language Writers: A Dynamic View of Transfer

    ERIC Educational Resources Information Center

    Rinnert, Carol; Kobayashi, Hiroe; Katayama, Akemi

    2015-01-01

    This study takes a dynamic view of transfer as reusing and reshaping previous knowledge in new writing contexts to investigate how novice Japanese as a foreign language (JFL) writers draw on knowledge across languages to construct L1 and L2 texts. We analyzed L1 English and L2 Japanese argumentation essays by the same JFL writers (N = 19) and L1…

  7. Learning Styles and the Japanese University Second Language Student.

    ERIC Educational Resources Information Center

    Rausch, Anthony S.

    This study investigated learning styles and learning strategies among Japanese university students whose majors are directly related to English. Data were gathered in a survey of 365 students in English literature, language, or linguistics courses at two universities. The survey included questions about study outside class time, study using…

  8. Atypical Speech and Language Development: A Consensus Study on Clinical Signs in the Netherlands

    ERIC Educational Resources Information Center

    Visser-Bochane, Margot I.; Gerrits, Ellen; van der Schans, Cees P.; Reijneveld, Sijmen A.; Luinge, Margreet R.

    2017-01-01

    Background: Atypical speech and language development is one of the most common developmental difficulties in young children. However, which clinical signs characterize atypical speech-language development at what age is not clear. Aim: To achieve a national and valid consensus on clinical signs and red flags (i.e. most urgent clinical signs) for…

  9. Hand/Wrist Disorders among Sign Language Communicators.

    ERIC Educational Resources Information Center

    Smith, Susan M.; Kress, Tyler A.; Hart, William M.

    2000-01-01

    A study assessed the frequency of self-reported hand/wrist problems among 184 sign-language communicators. Fifty-nine percent reported experiencing hand/wrist problems, 26 percent reported experiencing hand/wrist problems severe enough to limit their ability to work, and 18 percent reported a medical diagnosis of wrist tendinitis, carpal tunnel…

  10. Sign Language and Pantomime Production Differentially Engage Frontal and Parietal Cortices

    ERIC Educational Resources Information Center

    Emmorey, Karen; McCullough, Stephen; Mehta, Sonya; Ponto, Laura L. B.; Grabowski, Thomas J.

    2011-01-01

    We investigated the functional organisation of neural systems supporting language production when the primary language articulators are also used for meaningful, but nonlinguistic, expression such as pantomime. Fourteen hearing nonsigners and 10 deaf native users of American Sign Language (ASL) participated in an H[subscript 2][superscript…

  11. Visual sign phonology: insights into human reading and language from a natural soundless phonology.

    PubMed

    Petitto, L A; Langdon, C; Stone, A; Andriola, D; Kartheiser, G; Cochran, C

    2016-11-01

    Among the most prevailing assumptions in science and society about the human reading process is that sound and sound-based phonology are critical to young readers. The child's sound-to-letter decoding is viewed as universal and vital to deriving meaning from print. We offer a different view. The crucial link for early reading success is not between segmental sounds and print. Instead, the human brain's capacity to segment, categorize, and discern linguistic patterning makes possible the capacity to segment all languages. This biological process includes the segmentation of languages on the hands in signed languages. Exposure to natural sign language in early life equally affords the child's discovery of silent segmental units in visual sign phonology (VSP) that can also facilitate segmental decoding of print. We consider powerful biological evidence about the brain, how it builds sound and sign phonology, and why sound and sign phonology are equally important in language learning and reading. We offer a testable theoretical account, reading model, and predictions about how VSP can facilitate segmentation and mapping between print and meaning. We explain how VSP can be a powerful facilitator of reading success for all children, deaf and hearing, an account with profound transformative impact on learning to read in deaf children with different language backgrounds. The existence of VSP has important implications for understanding core properties of all human language and reading, challenges assumptions about language and reading as being tied to sound, and provides novel insight into a remarkable biological equivalence in signed and spoken languages. WIREs Cogn Sci 2016, 7:366-381. doi: 10.1002/wcs.1404. © 2016 Wiley Periodicals, Inc.

  12. Use of Information and Communication Technologies in Sign Language Test Development: Results of an International Survey

    ERIC Educational Resources Information Center

    Haug, Tobias

    2015-01-01

    Sign language test development is a relatively new field within sign linguistics, motivated by the practical need for assessment instruments to evaluate language development in different groups of learners (L1, L2). Due to the lack of research on the structure and acquisition of many sign languages, developing an assessment instrument poses…

  13. American Sign Language Teachers: Practices and Perceptions.

    ERIC Educational Resources Information Center

    Newell, William J.

    1995-01-01

    Reports on a survey of 359 teachers of American Sign Language (ASL) conducted in 1993-94. Results found that the ability to apply appropriate methods, professional knowledge of ASL teaching practice, and bilingual skills in ASL and English were considered very important. Knowledge of theoretical issues and classroom management skills were viewed…

  14. Is Teaching Sign Language in Early Childhood Classrooms Feasible for Busy Teachers and Beneficial for Children?

    ERIC Educational Resources Information Center

    Brereton, Amy Elizabeth

    2010-01-01

    Infants' hands are ready to construct words using sign language before their mouths are ready to speak. These research findings may explain the popularity of parents and caregivers teaching and using sign language with infants and toddlers, along with speech. The advantages of using sign language with young children go beyond the infant and…

  15. Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability

    ERIC Educational Resources Information Center

    Lutalo-Kiingi, Sam; De Clerck, Goedele A. M.

    2017-01-01

    This article has been excerpted from "Perspectives on the Sign Language Factor in Sub-Saharan Africa: Challenges of Sustainability" (Lutalo-Kiingi and De Clerck) in "Sign Language, Sustainable Development, and Equal Opportunities: Envisioning the Future for Deaf Students" (G. A. M. De Clerck and P. V. Paul (Eds.) 2016). In this…

  16. Dissociating linguistic and non-linguistic gesture processing: electrophysiological evidence from American Sign Language.

    PubMed

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-04-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a "frame" (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a "last item" belonging to one of four categories: a high-cloze-probability sign (a "semantically reasonable" completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a "semantically odd" completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Dissociating linguistic and non-linguistic gesture processing: Electrophysiological evidence from American Sign Language

    PubMed Central

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-01-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a “frame” (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a “last item” belonging to one of four categories: a high-cloze-probability sign (a “semantically reasonable” completion to the sentence; e.g. BED), a low-cloze-probability sign (a real sign that is nonetheless a “semantically odd” completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity. PMID:22341555

  18. Contesting Visions at a Japanese School for the Deaf

    ERIC Educational Resources Information Center

    Hayashi, Akiko; Tobin, Joseph

    2015-01-01

    This paper tells the story of the struggle to introduce a Japanese sign language program in a school for the deaf in Japan that until recently had followed the government's approach that emphasizes oral communication. Our method and conceptual framework is ethnographic, as we emphasize the cultural beliefs that underlie the three competing…

  19. Identifying Specific Language Impairment in Deaf Children Acquiring British Sign Language: Implications for Theory and Practice

    ERIC Educational Resources Information Center

    Mason, Kathryn; Rowley, Katherine; Marshall, Chloe R.; Atkinson, Joanna R.; Herman, Rosalind; Woll, Bencie; Morgan, Gary

    2010-01-01

    This paper presents the first ever group study of specific language impairment (SLI) in users of sign language. A group of 50 children were referred to the study by teachers and speech and language therapists. Individuals who fitted pre-determined criteria for SLI were then systematically assessed. Here, we describe in detail the performance of 13…

  20. Cross-Modal Recruitment of Auditory and Orofacial Areas During Sign Language in a Deaf Subject.

    PubMed

    Martino, Juan; Velasquez, Carlos; Vázquez-Bourgon, Javier; de Lucas, Enrique Marco; Gomez, Elsa

    2017-09-01

    Modern sign languages used by deaf people are fully expressive, natural human languages that are perceived visually and produced manually. The literature contains little data concerning human brain organization in conditions of deficient sensory information such as deafness. A deaf-mute patient underwent surgery of a left temporoinsular low-grade glioma. The patient underwent awake surgery with intraoperative electrical stimulation mapping, allowing direct study of the cortical and subcortical organization of sign language. We found a similar distribution of language sites to what has been reported in mapping studies of patients with oral language, including 1) speech perception areas inducing anomias and alexias close to the auditory cortex (at the posterior portion of the superior temporal gyrus and supramarginal gyrus); 2) speech production areas inducing speech arrest (anarthria) at the ventral premotor cortex, close to the lip motor area and away from the hand motor area; and 3) subcortical stimulation-induced semantic paraphasias at the inferior fronto-occipital fasciculus at the temporal isthmus. The intraoperative setup for sign language mapping with intraoperative electrical stimulation in deaf-mute patients is similar to the setup described in patients with oral language. To elucidate the type of language errors, a sign language interpreter in close interaction with the neuropsychologist is necessary. Sign language is perceived visually and produced manually; however, this case revealed a cross-modal recruitment of auditory and orofacial motor areas. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Semantic Fluency in Deaf Children Who Use Spoken and Signed Language in Comparison with Hearing Peers

    ERIC Educational Resources Information Center

    Marshall, C. R.; Jones, A.; Fastelli, A.; Atkinson, J.; Botting, N.; Morgan, G.

    2018-01-01

    Background: Deafness has an adverse impact on children's ability to acquire spoken languages. Signed languages offer a more accessible input for deaf children, but because the vast majority are born to hearing parents who do not sign, their early exposure to sign language is limited. Deaf children as a whole are therefore at high risk of language…

  2. ASL-LEX: A lexical database of American Sign Language.

    PubMed

    Caselli, Naomi K; Sehyr, Zed Sevcikova; Cohen-Goldberg, Ariel M; Emmorey, Karen

    2017-04-01

    ASL-LEX is a lexical database that catalogues information about nearly 1,000 signs in American Sign Language (ASL). It includes the following information: subjective frequency ratings from 25-31 deaf signers, iconicity ratings from 21-37 hearing non-signers, videoclip duration, sign length (onset and offset), grammatical class, and whether the sign is initialized, a fingerspelled loan sign, or a compound. Information about English translations is available for a subset of signs (e.g., alternate translations, translation consistency). In addition, phonological properties (sign type, selected fingers, flexion, major and minor location, and movement) were coded and used to generate sub-lexical frequency and neighborhood density estimates. ASL-LEX is intended for use by researchers, educators, and students who are interested in the properties of the ASL lexicon. An interactive website where the database can be browsed and downloaded is available at http://asl-lex.org .

  3. ASL-LEX: A lexical database of American Sign Language

    PubMed Central

    Caselli, Naomi K.; Sehyr, Zed Sevcikova; Cohen-Goldberg, Ariel M.; Emmorey, Karen

    2016-01-01

    ASL-LEX is a lexical database that catalogues information about nearly 1,000 signs in American Sign Language (ASL). It includes the following information: subjective frequency ratings from 25–31 deaf signers, iconicity ratings from 21–37 hearing non-signers, videoclip duration, sign length (onset and offset), grammatical class, and whether the sign is initialized, a fingerspelled loan sign or a compound. Information about English translations is available for a subset of signs (e.g., alternate translations, translation consistency). In addition, phonological properties (sign type, selected fingers, flexion, major and minor location, and movement) were coded and used to generate sub-lexical frequency and neighborhood density estimates. ASL-LEX is intended for use by researchers, educators, and students who are interested in the properties of the ASL lexicon. An interactive website where the database can be browsed and downloaded is available at http://asl-lex.org. PMID:27193158
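
    As an illustration of how coded phonological parameters can yield a neighborhood density estimate (the exact neighborhood definition used in ASL-LEX is not reproduced here), the sketch below counts, for each sign, how many other signs differ in at most one coded parameter; the parameter names mirror those listed above, while the toy lexicon entries are hypothetical.

      PARAMS = ("sign_type", "selected_fingers", "flexion", "major_location", "movement")

      def neighborhood_density(lexicon):
          """lexicon: dict mapping sign gloss -> dict of phonological parameter values."""
          density = {}
          for sign, feats in lexicon.items():
              neighbors = 0
              for other, other_feats in lexicon.items():
                  if other == sign:
                      continue
                  mismatches = sum(feats[p] != other_feats[p] for p in PARAMS)
                  if mismatches <= 1:          # neighbors differ in at most one parameter
                      neighbors += 1
              density[sign] = neighbors
          return density

      # Hypothetical entries, not actual ASL-LEX codings.
      toy_lexicon = {
          "SIGN-A": dict(sign_type="one-handed", selected_fingers="all", flexion="open",
                         major_location="head", movement="contact"),
          "SIGN-B": dict(sign_type="one-handed", selected_fingers="all", flexion="open",
                         major_location="torso", movement="contact"),
          "SIGN-C": dict(sign_type="two-handed", selected_fingers="index", flexion="bent",
                         major_location="neutral", movement="path"),
      }
      print(neighborhood_density(toy_lexicon))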

  4. Kinect-based sign language recognition of static and dynamic hand movements

    NASA Astrophysics Data System (ADS)

    Dalawis, Rando C.; Olayao, Kenneth Deniel R.; Ramos, Evan Geoffrey I.; Samonte, Mary Jane C.

    2017-02-01

    A different approach to sign language recognition of static and dynamic hand movements was developed in this study using a normalized correlation algorithm. The goal of this research was to translate fingerspelling sign language into text using MATLAB and Microsoft Kinect. Digital input images captured by the Kinect device are matched against template samples stored in a database. This Human Computer Interaction (HCI) prototype was developed to help people with communication disabilities express their thoughts with ease. Frame segmentation and feature extraction were used to give meaning to the captured images. Sequential and random testing was used to test both static and dynamic fingerspelling gestures. The researchers also discuss factors they encountered that caused some misclassification of signs.
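
    A minimal sketch of normalized correlation template matching, the general technique the study reports, is given below; the preprocessing, template size, labels, and decision rule are assumptions rather than the authors' MATLAB implementation.

      import numpy as np

      def normalized_correlation(image, template):
          """Zero-mean normalized correlation between two equal-size arrays."""
          a = image.astype(float) - image.mean()
          b = template.astype(float) - template.mean()
          denom = np.sqrt((a * a).sum() * (b * b).sum())
          return (a * b).sum() / denom if denom > 0 else 0.0

      def classify(frame, templates):
          """Return the label of the stored template most correlated with the input frame."""
          return max(templates, key=lambda label: normalized_correlation(frame, templates[label]))

      rng = np.random.default_rng(0)
      templates = {"A": rng.random((32, 32)), "B": rng.random((32, 32))}
      query = templates["A"] + 0.05 * rng.random((32, 32))   # noisy copy of template "A"
      print(classify(query, templates))                      # expected: "A"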

  5. Real-time lexical comprehension in young children learning American Sign Language.

    PubMed

    MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A; Fernald, Anne

    2018-04-16

    When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality. © 2018 John Wiley & Sons Ltd.

  6. Sign Language Planning in the Netherlands between 1980 and 2010

    ERIC Educational Resources Information Center

    Schermer, Trude

    2012-01-01

    This article discusses several aspects of language planning with respect to Sign Language of the Netherlands, or Nederlandse Gebarentaal (NGT). For nearly thirty years members of the Deaf community, the Dutch Deaf Council (Dovenschap) have been working together with researchers, several organizations in deaf education, and the organization of…

  7. MobileASL: intelligibility of sign language video over mobile phones.

    PubMed

    Cavender, Anna; Vanam, Rahul; Barney, Dane K; Ladner, Richard E; Riskin, Eve A

    2008-01-01

    For Deaf people, access to the mobile telephone network in the United States is currently limited to text messaging, forcing communication in English as opposed to American Sign Language (ASL), the preferred language. Because ASL is a visual language, mobile video phones have the potential to give Deaf people access to real-time mobile communication in their preferred language. However, even today's best video compression techniques cannot yield intelligible ASL at limited cell phone network bandwidths. Motivated by this constraint, we conducted one focus group and two user studies with members of the Deaf Community to determine the intelligibility effects of video compression techniques that exploit the visual nature of sign language. Inspired by eye-tracking results showing that high-resolution foveal vision is maintained around the face, we studied region-of-interest encodings (where the face is encoded at higher quality) as well as reduced frame rates (where fewer but better-quality frames are displayed every second). At all bit rates studied here, participants preferred moderate quality increases in the face region, sacrificing quality in other regions. They also preferred slightly lower frame rates because they yield better-quality frames for a fixed bit rate. The limited processing power of cell phones is a serious concern because a real-time video encoder and decoder will be needed. Choosing less complex settings for the encoder can reduce encoding time, but will affect video quality. We studied the intelligibility effects of this tradeoff and found that encoding time can be significantly reduced without severely affecting intelligibility. These results show promise for real-time access to the current low-bandwidth cell phone network through sign-language-specific encoding techniques.
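
    The frame-rate versus per-frame-quality tradeoff discussed in this study can be made concrete with a back-of-the-envelope calculation: at a fixed channel bitrate, fewer frames per second leaves more bits available for each frame. The bitrate and frame-rate values below are hypothetical, not the study's test conditions.

      def bits_per_frame(bitrate_kbps, frame_rate_fps):
          """Bits available for each encoded frame at a fixed channel bitrate."""
          return bitrate_kbps * 1000 / frame_rate_fps

      for fps in (15.0, 12.5, 10.0):
          print(f"{fps:4.1f} fps -> {bits_per_frame(30, fps):6.0f} bits/frame at 30 kbps")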

  8. Electrophysiological evidence for phonological priming in Spanish Sign Language lexical access.

    PubMed

    Gutiérrez, Eva; Müller, Oliver; Baus, Cristina; Carreiras, Manuel

    2012-06-01

    Interactive activation models of lexical access assume that the presentation of a given word activates not only its lexical representation but also those corresponding to words similar in form. Current theories are based on data from oral and written languages, and therefore signed languages represent a special challenge for existing theories of word recognition and lexical access, since they allow us to question what the genuine fundamentals of human language are and what might be modality-specific adaptations. The aim of the present study is to determine the electrophysiological correlates and time course of phonological processing of Spanish Sign Language (LSE). Ten deaf native LSE signers and ten deaf non-native but highly proficient LSE signers participated in the experiment. We used the ERP methodology and form-based priming in the context of a delayed lexical decision task, manipulating phonological overlap (i.e. related prime-target pairs shared either handshape or location parameters). Results showed that both parameters under study modulated brain responses to the stimuli in different time windows. Phonological priming of location resulted in a higher amplitude of the N400 component (300-500 ms window) for signs but not for non-signs. This effect may be explained in terms of initial competition among candidates. Moreover, the fact that a higher amplitude N400 for related pairs was found for signs but not for non-signs points to an effect at the lexical level. Handshape overlap produced a later effect (600-800 ms window). In this window, a more negative-going wave for the related condition than for the unrelated condition was found for non-signs in the native signers group. The findings are discussed in relation to current models of lexical access and word recognition. Finally, differences between native and non-native signers point to a less efficient use of phonological information among the non-native signers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Exploring the use of dynamic language assessment with deaf children, who use American Sign Language: Two case studies.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2014-01-01

    We describe a model for assessment of lexical-semantic organization skills in American Sign Language (ASL) within the framework of dynamic vocabulary assessment and discuss the applicability and validity of the use of mediated learning experiences (MLE) with deaf signing children. Two elementary students (ages 7;6 and 8;4) completed a set of four vocabulary tasks and received two 30-minute mediations in ASL. Each session consisted of several scripted activities focusing on the use of categorization. Both had experienced difficulties in providing categorically related responses in one of the vocabulary tasks used previously. Results showed that the two students exhibited notable differences with regard to their learning pace, information uptake, and effort required by the mediator. Furthermore, we observed signs of a shift in strategic behavior by the lower performing student during the second mediation. Results suggest that the use of dynamic assessment procedures in a vocabulary context was helpful in understanding children's strategies as related to learning potential. These results are discussed in terms of deaf children's cognitive modifiability, with implications for planning instruction and how MLE can be used with a population that uses ASL. The reader will (1) recognize the challenges in appropriate language assessment of deaf signing children; (2) recall the three areas explored to investigate whether a dynamic assessment approach is sensitive to differences in deaf signing children's language learning profiles; and (3) discuss how dynamic assessment procedures can make deaf signing children's individual language learning differences visible. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning.

    PubMed

    Almeida, Diogo; Poeppel, David; Corina, David

    The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data demonstrate that the perceptual tuning that underlies the discrimination of language and non-language information is not limited to spoken languages but extends to languages expressed in the visual modality.

  11. Becoming a Japanese Language Learner, User, and Teacher: Revelations From Life History Research

    ERIC Educational Resources Information Center

    Armour, William S.

    2004-01-01

    This article discusses how Sarah Lamond, a Japanese language teacher in Sydney, Australia has juggled three of her identities: second language (L2) learner, L2 user, and L2 teacher. Data come from four interviews used to create an edited life history. These data are used to draw attention to the relationship between L2 learner and language user.…

  12. Conceptual Representation of Actions in Sign Language

    ERIC Educational Resources Information Center

    Dobel, Christian; Enriquez-Geppert, Stefanie; Hummert, Marja; Zwitserlood, Pienie; Bolte, Jens

    2011-01-01

    The idea that knowledge of events entails a universal spatial component, that is conceiving agents left of patients, was put to test by investigating native users of German sign language and native users of spoken German. Participants heard or saw event descriptions and had to illustrate the meaning of these events by means of drawing or arranging…

  13. Assessing language skills in adult key word signers with intellectual disabilities: Insights from sign linguistics.

    PubMed

    Grove, Nicola; Woll, Bencie

    2017-03-01

    Manual signing is one of the most widely used approaches to support the communication and language skills of children and adults who have intellectual or developmental disabilities, and problems with communication in spoken language. A recent series of papers reporting findings from this population raises critical issues for professionals in the assessment of multimodal language skills of key word signers. Approaches to assessment will differ depending on whether key word signing (KWS) is viewed as discrete from, or related to, natural sign languages. Two available assessments from these different perspectives are compared. Procedures appropriate to the assessment of sign language production are recommended as a valuable addition to the clinician's toolkit. Sign and speech need to be viewed as multimodal, complementary communicative endeavours, rather than as polarities. Whilst narrative has been shown to be a fruitful context for eliciting language samples, assessments for adult users should be designed to suit the strengths, needs and values of adult signers with intellectual disabilities, using materials that are compatible with their life course stage rather than those designed for young children. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Deaf Students' Receptive and Expressive American Sign Language Skills: Comparisons and Relations

    ERIC Educational Resources Information Center

    Beal-Alvarez, Jennifer S.

    2014-01-01

    This article presents receptive and expressive American Sign Language skills of 85 students, 6 through 22 years of age at a residential school for the deaf using the American Sign Language Receptive Skills Test and the Ozcaliskan Motion Stimuli. Results are presented by ages and indicate that students' receptive skills increased with age and…

  15. On the temporal dynamics of sign production: An ERP study in Catalan Sign Language (LSC).

    PubMed

    Baus, Cristina; Costa, Albert

    2015-06-03

    This study investigates the temporal dynamics of sign production and how particular aspects of the signed modality influence the early stages of lexical access. To that end, we explored the electrophysiological correlates associated with sign frequency and iconicity in a picture-signing task in a group of bimodal bilinguals. Moreover, a subset of the same participants was tested on the same task but naming the pictures instead. Our results revealed that both frequency and iconicity influenced lexical access in sign production. At the ERP level, iconicity effects originated very early in the course of signing (while absent in the spoken modality), suggesting a stronger activation of the semantic properties for iconic signs. Moreover, frequency effects were modulated by iconicity, suggesting that lexical access in signed language is determined by the iconic properties of the signs. These results support the idea that lexical access is sensitive to the same phenomena in word and sign production, but its time-course is modulated by particular aspects of the modality in which a lexical item will be finally articulated. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Less Frequently Taught Languages: Basic Information and Instruction.

    ERIC Educational Resources Information Center

    Conwell, Marilyn; And Others

    The following articles are presented in the section of the Northeast Conference Report on less frequently taught languages: (1) "American Sign Language," by M. Conwell and A. Nelson; (2) "Chinese," by D. Gidman; (3) "Japanese," by J. P. Berwald and T. Phipps; (4) "Latin," by M. Cleary; (5) "Portuguese," by R. Pedro Carvalho; and (6) "Russian," by…

  17. Motives and Outcomes of New Zealand Sign Language Legislation: A Comparative Study between New Zealand and Finland

    ERIC Educational Resources Information Center

    Reffell, Hayley; McKee, Rachel Locker

    2009-01-01

    The medicalized interpretation of deafness has until recently seen the rights and protections of sign language users embedded in disability law. Yet the rights and protections crucial to sign language users centre predominantly on matters of language access, maintenance and identity. Legislators, motivated by pressure from sign language…

  18. Persistence in Japanese Language Study and Learners' Cultural/Linguistic Backgrounds

    ERIC Educational Resources Information Center

    Matsumoto, Masanori

    2009-01-01

    Motivational characteristics of students learning Japanese as a foreign language at universities in Australia were investigated to find out what affecting factors are closely related to their intentions for continuing/discontinuing their study. The results showed that students' cultural/linguistic backgrounds have a significant impact on their…

  19. Deaf children attending different school environments: sign language abilities and theory of mind.

    PubMed

    Tomasuolo, Elena; Valeri, Giovanni; Di Renzo, Alessio; Pasqualetti, Patrizio; Volterra, Virginia

    2013-01-01

    The present study examined whether full access to sign language as a medium for instruction could influence performance in Theory of Mind (ToM) tasks. Three groups of Italian participants (age range: 6-14 years) participated in the study: Two groups of deaf signing children and one group of hearing-speaking children. The two groups of deaf children differed only in their school environment: One group attended a school with a teaching assistant (TA; Sign Language is offered only by the TA to a single deaf child), and the other group attended a bilingual program (Italian Sign Language and Italian). Linguistic abilities and understanding of false belief were assessed using similar materials and procedures in spoken Italian with hearing children and in Italian Sign Language with deaf children. Deaf children attending the bilingual school performed significantly better than deaf children attending school with the TA in tasks assessing lexical comprehension and ToM, whereas the performance of hearing children was in between that of the two deaf groups. As for lexical production, deaf children attending the bilingual school performed significantly better than the two other groups. No significant differences were found between early and late signers or between children with deaf and hearing parents.

  20. Prediction in a visual language: real-time sentence processing in American Sign Language across development.

    PubMed

    Lieberman, Amy M; Borovsky, Arielle; Mayberry, Rachel I

    2018-01-01

    Prediction during sign language comprehension may enable signers to integrate linguistic and non-linguistic information within the visual modality. In two eyetracking experiments, we investigated American Sign Language (ASL) semantic prediction in deaf adults and children (aged 4-8 years). Participants viewed ASL sentences in a visual world paradigm in which the sentence-initial verb was either neutral or constrained relative to the sentence-final target noun. Adults and children made anticipatory looks to the target picture before the onset of the target noun in the constrained condition only, showing evidence for semantic prediction. Crucially, signers alternated gaze between the stimulus sign and the target picture only when the sentential object could be predicted from the verb. Signers therefore engage in prediction by optimizing visual attention between divided linguistic and referential signals. These patterns suggest that prediction is a modality-independent process, and theoretical implications are discussed.

  1. Spatial and Facial Processing in the Signed Discourse of Two Groups of Deaf Signers with Clinical Language Impairment

    ERIC Educational Resources Information Center

    Penn, Claire; Commerford, Ann; Ogilvy, Dale

    2007-01-01

    The linguistic and cognitive profiles of five deaf adults with a sign language disorder were compared with those of matched deaf controls. The test involved a battery of sign language tests, a signed narrative discourse task and a neuropsychological test protocol administered in sign language. Spatial syntax and facial processing were examined in…

  2. The Study on Reading Strategy of Students Learning Japanese as a Second Language.

    ERIC Educational Resources Information Center

    Toriyama, Kyoko

    A study investigated whether a classification scheme for learning strategies used in ESL (English-as-a-Second-Language) instruction is applicable to strategies used in learning Japanese as a second language. Four metacognitive strategies were examined (directed attention, selective attention, self-monitoring, self-management). Subjects were 30…

  3. Language Justice for Sign Language Peoples: The UN Convention on the Rights of Persons with Disabilities

    ERIC Educational Resources Information Center

    Batterbury, Sarah C. E.

    2012-01-01

    Sign Language Peoples (SLPs) across the world have developed their own languages and visuo-gestural-tactile cultures embodying their collective sense of Deafhood (Ladd 2003). Despite this, most nation-states treat their respective SLPs as disabled individuals, favoring disability benefits, cochlear implants, and mainstream education over language…

  4. Sign Language Legislation as a Tool for Sustainability

    ERIC Educational Resources Information Center

    Pabsch, Annika

    2017-01-01

    This article explores three models of sustainability (environmental, economic, and social) and identifies characteristics of a sustainable community necessary to sustain the Deaf community as a whole. It is argued that sign language legislation is a valuable tool for achieving sustainability for the generations to come.

  5. Student Preconceptions of Japanese Language Learning in 1989 and 2004

    ERIC Educational Resources Information Center

    Hayashi, Atsuko

    2009-01-01

    This study compares student preconceptions and expectations of Japanese language learning from studies conducted in 1989 and 2004. Over the years, student interests and pedagogical approaches have changed. However, these changes are not reflected in student preconceptions and expectations. They still believe in traditional approaches to language…

  6. Japanese Language and Culture 10-20-30: Guide to Implementation.

    ERIC Educational Resources Information Center

    Alberta Learning, Edmonton (Canada). Curriculum Standards Branch.

    This teacher's guide provides an innovative program of studies for teaching Japanese at the secondary level, featuring a content-based curriculum, an integrated approach, results (outcomes)-based orientation, and the use of language for effective interaction. This guide provides teachers with suggestions for designing and planning a Japanese…

  7. How Grammar Can Cope with Limited Short-Term Memory: Simultaneity and Seriality in Sign Languages

    ERIC Educational Resources Information Center

    Geraci, Carlo; Gozzi, Marta; Papagno, Costanza; Cecchetto, Carlo

    2008-01-01

    It is known that in American Sign Language (ASL) span is shorter than in English, but this discrepancy has never been systematically investigated using other pairs of signed and spoken languages. This finding is at odds with results showing that short-term memory (STM) for signs has an internal organization similar to STM for words. Moreover, some…

  8. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language

    PubMed Central

    Newman, Sharlene D.

    2016-01-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately matched, especially when the sign contained a marked handshape. In Experiment 2, learners produced these familiar signs in addition to novel signs, which differed based on sonority and markedness. Results from a key-release reaction time reproduction task showed that learners tended to produce high sonority signs much more quickly than low sonority signs, especially when the sign contained an unmarked handshape. This effect was only present in familiar signs. Sign production accuracy rates revealed that high sonority signs were more accurate than low sonority signs. Similarly, signs with unmarked handshapes were produced more accurately than those with marked handshapes. Together, results from Experiments 1 and 2 suggested that signs that contain high sonority movements are more easily processed, both perceptually and productively, and handshape markedness plays a differential role in perception and production. PMID:26644551

  9. V2S: Voice to Sign Language Translation System for Malaysian Deaf People

    NASA Astrophysics Data System (ADS)

    Mean Foong, Oi; Low, Tang Jung; La, Wai Wan

    The process of learning and understanding sign language may be cumbersome for some, so this paper proposes a voice (English language) to sign language translation system based on speech and image processing techniques. Speech processing, which includes speech recognition, is the study of recognizing the words being spoken regardless of who the speaker is. This project uses template-based recognition as its main approach: the V2S system is first trained with speech patterns based on a generic spectral parameter set, and these parameter sets are stored as templates in a database. The system performs recognition by matching the parameter set of the input speech against the stored templates, and finally displays the corresponding sign language in video format. Empirical results show that the system has an 80.3% recognition rate.
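
    As a rough illustration of the template-matching step described above, the sketch below scores an input utterance's spectral feature sequence against stored word templates with dynamic time warping (DTW) and picks the closest word. It is a minimal sketch in Python under stated assumptions: the 13-dimensional features, the use of DTW, and the toy vocabulary are all assumptions, since the record does not specify the exact spectral parameters or matching algorithm.

    import numpy as np

    def dtw_distance(a, b):
        # Dynamic time warping distance between two feature sequences
        # (frames x coefficients); a common choice for template matching.
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    def recognize(features, templates):
        # Return the vocabulary word whose stored template is closest to the input.
        return min(templates, key=lambda word: dtw_distance(features, templates[word]))

    # toy usage: two stored templates of 13-dimensional spectral vectors per frame
    rng = np.random.default_rng(0)
    templates = {"hello": rng.normal(size=(40, 13)), "thanks": rng.normal(size=(35, 13))}
    spoken = templates["hello"] + 0.1 * rng.normal(size=(40, 13))
    print(recognize(spoken, templates))  # expected "hello"; its sign video would then be shown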

  10. The impact of input quality on early sign development in native and non-native language learners.

    PubMed

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-05-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the impact of quality of input on early sign acquisition. The current study explores the outcomes of differential input in two groups of children aged two to five years: deaf children of hearing parents (DCHP) and deaf children of deaf parents (DCDP). Analysis of child sign language revealed DCDP had a more developed vocabulary and more phonological handshape types compared with DCHP. In naturalistic conversations deaf parents used more sign tokens and more phonological types than hearing parents. Results are discussed in terms of the effects of early input on subsequent language abilities.

  11. Foreign Language Curricula in Japanese High Schools: A Case Study in Miyagi Prefecture

    ERIC Educational Resources Information Center

    Ball, Daniel

    2007-01-01

    The purpose of this study was to examine the foreign language curricula in Japanese high schools for the purpose of gaining insight into alternative views of foreign language education. Teachers, administrators, and staff at two high schools in Miyagi Prefecture were interviewed. Teachers were asked about testing, placement procedures, standards,…

  12. Heritage Language Acquisition and Maintenance: Home Literacy Practices of Japanese-Speaking Families in Canada

    ERIC Educational Resources Information Center

    Nomura, Takako; Caidi, Nadia

    2013-01-01

    Introduction: In this study, we examine the case of Japanese-speaking families in Canada and their experiences with teaching a heritage language at home, along with the uses and perceived usefulness of public library resources, collections, and services in the process. Methods: We interviewed fourteen mothers who speak Japanese to their children.…

  13. Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon

    PubMed Central

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2014-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age of onset of first language acquisition and the quality and quantity of linguistic input for deaf individuals are highly heterogeneous, which is rarely the case for hearing learners of spoken languages. Little is known about how these modality and developmental factors affect real-time lexical processing. In this study, we ask how these factors impact real-time recognition of American Sign Language (ASL) signs using a novel adaptation of the visual world paradigm in deaf adults who learned sign from birth (Experiment 1), and in deaf individuals who were late learners of ASL (Experiment 2). Results revealed that although both groups of signers demonstrated rapid, incremental processing of ASL signs, only native signers demonstrated early and robust activation of sub-lexical features of signs during real-time recognition. Our findings suggest that the organization of the mental lexicon into units of both form and meaning is a product of infant language learning and not the sensory and motor modality through which the linguistic signal is sent and received. PMID:25528091

  14. Signed language and human action processing: evidence for functional constraints on the human mirror-neuron system.

    PubMed

    Corina, David P; Knapp, Heather Patterson

    2008-12-01

    In the quest to further understand the neural underpinning of human communication, researchers have turned to studies of naturally occurring signed languages used in Deaf communities. The comparison of the commonalities and differences between spoken and signed languages provides an opportunity to determine core neural systems responsible for linguistic communication independent of the modality in which a language is expressed. The present article examines such studies, and in addition asks what we can learn about human languages by contrasting formal visual-gestural linguistic systems (signed languages) with more general human action perception. To understand visual language perception, it is important to distinguish the demands of general human motion processing from the highly task-dependent demands associated with extracting linguistic meaning from arbitrary, conventionalized gestures. This endeavor is particularly important because theorists have suggested close homologies between perception and production of actions and functions of human language and social communication. We review recent behavioral, functional imaging, and neuropsychological studies that explore dissociations between the processing of human actions and signed languages. These data suggest incomplete overlap between the mirror-neuron systems proposed to mediate human action and language.

  15. Psychometric properties of a sign language version of the Mini International Neuropsychiatric Interview (MINI).

    PubMed

    Øhre, Beate; Saltnes, Hege; von Tetzchner, Stephen; Falkum, Erik

    2014-05-22

    There is a need for psychiatric assessment instruments that enable reliable diagnoses in persons with hearing loss who have sign language as their primary language. The objective of this study was to assess the validity of the Norwegian Sign Language (NSL) version of the Mini International Neuropsychiatric Interview (MINI). The MINI was translated into NSL. Forty-one signing patients consecutively referred to two specialised psychiatric units were assessed with a diagnostic interview by clinical experts and with the MINI. Inter-rater reliability was assessed with Cohen's kappa and "observed agreement". There was 65% agreement between MINI diagnoses and clinical expert diagnoses. Kappa values indicated fair to moderate agreement, and observed agreement was above 76% for all diagnoses. The MINI diagnosed more co-morbid conditions than did the clinical expert interview (mean diagnoses: 1.9 versus 1.2). Kappa values indicated moderate to substantial agreement, and "observed agreement" was above 88%. The NSL version performs similarly to other MINI versions and demonstrates adequate reliability and validity as a diagnostic instrument for assessing mental disorders in persons who have sign language as their primary and preferred language.
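
    For readers unfamiliar with the agreement statistics reported here, the short Python sketch below computes observed agreement and Cohen's kappa for two raters' present/absent judgements. The ten-patient data set is hypothetical and is not taken from the study; it only illustrates how the reported statistics are calculated.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # Chance-corrected agreement between two raters' categorical judgements.
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[c] * freq_b[c] for c in set(freq_a) | set(freq_b)) / n ** 2
        return (observed - expected) / (1 - expected)

    # hypothetical diagnoses (1 = disorder present, 0 = absent) for ten patients
    mini   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
    expert = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
    print(round(cohens_kappa(mini, expert), 2))  # 0.6 on this toy data, i.e. moderate agreement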

  16. The emergence of embedded structure: insights from Kafr Qasem Sign Language

    PubMed Central

    Kastner, Itamar; Meir, Irit; Sandler, Wendy; Dachkovsky, Svetlana

    2014-01-01

    This paper introduces data from Kafr Qasem Sign Language (KQSL), an as-yet undescribed sign language, and identifies the earliest indications of embedding in this young language. Using semantic and prosodic criteria, we identify predicates that form a constituent with a noun, functionally modifying it. We analyze these structures as instances of embedded predicates, exhibiting what can be regarded as very early stages in the development of subordinate constructions, and argue that these structures may bear directly on questions about the development of embedding and subordination in language in general. Deutscher (2009) argues persuasively that nominalization of a verb is the first step—and the crucial step—toward syntactic embedding. It has also been suggested that prosodic marking may precede syntactic marking of embedding (Mithun, 2009). However, the relevant data from the stage at which embedding first emerges have not previously been available. KQSL might be the missing piece of the puzzle: a language in which a noun can be modified by an additional predicate, forming a proposition within a proposition, sustained entirely by prosodic means. PMID:24917837

  17. Second Language Acquisition across Modalities: Production Variability in Adult L2 Learners of American Sign Language

    ERIC Educational Resources Information Center

    Hilger, Allison I.; Loucks, Torrey M. J.; Quinto-Pozos, David; Dye, Matthew W. G.

    2015-01-01

    A study was conducted to examine production variability in American Sign Language (ASL) in order to gain insight into the development of motor control in a language produced in another modality. Production variability was characterized through the spatiotemporal index (STI), which represents production stability in whole utterances and is a…

  18. Sign Language Recognition System using Neural Network for Digital Hardware Implementation

    NASA Astrophysics Data System (ADS)

    Vargas, Lorena P.; Barba, Leiner; Torres, C. O.; Mattos, L.

    2011-01-01

    This work presents an image pattern recognition system that uses a neural network to identify sign language for deaf people. The system stores several images showing the specific symbols of this kind of language, which are used to train a multilayer neural network with a backpropagation algorithm. Initially, the images are preprocessed to adapt them and to improve the discrimination performance of the network; this step includes filtering, noise reduction and elimination, and edge detection. The system is evaluated using signs that do not include movement in their representation.
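
    A compact way to picture the pipeline sketched above (preprocess images into edge features, then train a multilayer network with backpropagation) is the following Python sketch. The 10x10 synthetic "images", the single hidden layer of 16 units, and the Sobel-style edge step are illustrative assumptions; the system's actual architecture, image sizes, and preprocessing chain are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(1)

    def edge_map(img):
        # Rough Sobel-style edge magnitude; stands in for the fuller preprocessing
        # chain (filtering, noise removal, edge detection) described above.
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
        gx = np.array([[np.sum(img[i:i + 3, j:j + 3] * kx)
                        for j in range(img.shape[1] - 2)]
                       for i in range(img.shape[0] - 2)])
        mag = np.abs(gx)
        return mag / (mag.max() + 1e-9)

    def make_sample(label):
        # Toy 10x10 "image" of one of two static handshapes (class-dependent block).
        img = rng.random((10, 10))
        img[2:8, 2:8] += 2.0 * label
        return edge_map(img).ravel(), label

    X, y = map(np.array, zip(*[make_sample(i % 2) for i in range(200)]))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # one hidden layer of 16 units, trained with plain batch backpropagation
    W1 = rng.normal(0, 0.1, (X.shape[1], 16)); b1 = np.zeros(16)
    W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)
    for _ in range(500):
        h = sigmoid(X @ W1 + b1)                     # hidden activations
        p = sigmoid(h @ W2 + b2).ravel()             # predicted class probability
        grad_out = (p - y)[:, None]                  # gradient of logistic loss at output
        grad_hid = (grad_out @ W2.T) * h * (1 - h)   # backpropagated hidden-layer gradient
        W2 -= 0.1 * (h.T @ grad_out) / len(X); b2 -= 0.1 * grad_out.mean(0)
        W1 -= 0.1 * (X.T @ grad_hid) / len(X); b1 -= 0.1 * grad_hid.mean(0)
    print("training accuracy:", ((p > 0.5) == y).mean())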

  19. Impacts of Visual Sonority and Handshape Markedness on Second Language Learning of American Sign Language.

    PubMed

    Williams, Joshua T; Newman, Sharlene D

    2016-04-01

    The roles of visual sonority and handshape markedness in sign language acquisition and production were investigated. In Experiment 1, learners were taught sign-nonobject correspondences that varied in sign movement sonority and handshape markedness. Results from a sign-picture matching task revealed that high sonority signs were more accurately matched, especially when the sign contained a marked handshape. In Experiment 2, learners produced these familiar signs in addition to novel signs, which differed based on sonority and markedness. Results from a key-release reaction time reproduction task showed that learners tended to produce high sonority signs much more quickly than low sonority signs, especially when the sign contained an unmarked handshape. This effect was only present in familiar signs. Sign production accuracy rates revealed that high sonority signs were more accurate than low sonority signs. Similarly, signs with unmarked handshapes were produced more accurately than those with marked handshapes. Together, results from Experiments 1 and 2 suggested that signs that contain high sonority movements are more easily processed, both perceptually and productively, and handshape markedness plays a differential role in perception and production.

  20. Japanese Classroom Behavior: A Micro-Analysis of Self-Reports versus Classroom Observations--With Implications for Language Teachers

    ERIC Educational Resources Information Center

    Bohn, Mariko T.

    2004-01-01

    This article examines the influence of Japanese cultural values, beliefs, and educational style on Japanese students learning English as a second language in an American classroom. In contrast to the Japanese students' high motivation to learn English, their classroom behavior and roles reflect their own cultural perspectives rather than the…

  1. The British Sign Language Variant of Stokoe Notation: Report on a Type-Design Project.

    ERIC Educational Resources Information Center

    Thoutenhoofd, Ernst

    2003-01-01

    Explores the outcome of a publicly-funded research project titled "Redesign of the British Sign Language (BSL) Notation System with a New Font for Use in ICT." The aim of the project was to redesign the British Sign Language variant of Stokoe notation for practical use in information technology systems and software, such as lexical…

  2. Visual Iconicity Across Sign Languages: Large-Scale Automated Video Analysis of Iconic Articulators and Locations

    PubMed Central

    Östling, Robert; Börstell, Carl; Courtaux, Servane

    2018-01-01

    We use automatic processing of 120,000 sign videos in 31 different sign languages to show a cross-linguistic pattern for two types of iconic form–meaning relationships in the visual modality. First, we demonstrate that the degree of inherent plurality of concepts, based on individual ratings by non-signers, strongly correlates with the number of hands used in the sign forms encoding the same concepts across sign languages. Second, we show that certain concepts are iconically articulated around specific parts of the body, as predicted by the associational intuitions of non-signers. The implications of our results are both theoretical and methodological. With regard to theoretical implications, we corroborate previous research by demonstrating and quantifying, using a much larger body of material than previously available, the iconic nature of languages in the visual modality. As for the methodological implications, we show that automatic methods are, in fact, useful for performing large-scale analysis of sign language data to a high level of accuracy, as indicated by our manual error analysis. PMID:29867684

  3. Exploring the Value of Bilingual Language Assistants with Japanese English as a Foreign Language Learners

    ERIC Educational Resources Information Center

    Macaro, Ernesto; Nakatani, Yasuo; Hayashi, Yuko; Khabbazbashi, Nahal

    2014-01-01

    We report on a small-scale exploratory study of Japanese students' reactions to the use of a bilingual language assistant on an EFL study-abroad course in the UK and we give an insight into the possible effect of using bilingual assistants on speaking production. First-year university students were divided into three groups all taught by a…

  4. Referential Strategy Training for Second Language Reading Comprehension of Japanese Texts.

    ERIC Educational Resources Information Center

    Kitajima, Ryu

    1997-01-01

    Examines whether strategy training orienting second language (L2) students' attention toward referential processes improves their comprehension of Japanese narrative. Findings revealed that experimental students comprehended the story at the macro level significantly better than control students, suggesting that the strategy training is beneficial…

  5. Students who are deaf and hard of hearing and use sign language: considerations and strategies for developing spoken language and literacy skills.

    PubMed

    Nussbaum, Debra; Waddy-Smith, Bettie; Doyle, Jane

    2012-11-01

    There is a core body of knowledge, experience, and skills integral to facilitating auditory, speech, and spoken language development when working with the general population of students who are deaf and hard of hearing. There are additional issues, strategies, and challenges inherent in speech habilitation/rehabilitation practices essential to the population of deaf and hard of hearing students who also use sign language. This article will highlight philosophical and practical considerations related to practices used to facilitate spoken language development and associated literacy skills for children and adolescents who sign. It will discuss considerations for planning and implementing practices that acknowledge and utilize a student's abilities in sign language, and address how to link these skills to developing and using spoken language. Included will be considerations for children from early childhood through high school with a broad range of auditory access, language, and communication characteristics.

  6. What Is Business Japanese? Designing a Japanese Course for Business Communication.

    ERIC Educational Resources Information Center

    Koike, Shohei

    Experiences in developing "Business Japanese" courses for the undergraduate major in Language and International Trade at Eastern Michigan University are described. In 1987, six new courses in Japanese were proposed so that Japanese could be offered as a language specialty in the program. Issues considered in defining business Japanese…

  7. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    PubMed

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-04-19

    Sign language recognition (SLR) can provide a helpful tool for communication between deaf people and the external world. This paper proposes a component-based, vocabulary-extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word is treated as a combination of five common sign components (hand shape, axis, orientation, rotation, and trajectory), and sign classification is implemented through recognition of these five components. The framework consists of two major parts. The first part obtains the component-based form of sign gestures and establishes the code table of the target sign gesture set using data from a reference subject. The second part, designed for new users, trains component classifiers on a training set suggested by the reference subject and classifies unknown gestures with a code matching method. Five subjects participated in this study, and recognition experiments with different sizes of training set were conducted on a target set of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the framework can achieve large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one third of the gestures in the target set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, climbing to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50-60 gestures (about half of the target set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which should facilitate the implementation of a practical SLR system.
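
    The code-matching idea in the second part of the framework can be illustrated with a few lines of Python: each vocabulary word is stored as a five-component code, and an unknown gesture is assigned to the entry whose code disagrees with the component-classifier outputs in the fewest positions. The component labels and the three-word code table below are hypothetical; the published framework defines its own coding of hand shape, axis, orientation, rotation, and trajectory.

    # hypothetical code table: each sign word as a 5-tuple of component codes
    # (hand shape, axis, orientation, rotation, trajectory)
    code_table = {
        "thank_you": ("B", "x", "palm_in", "none", "arc"),
        "goodbye":   ("5", "y", "palm_out", "wave", "none"),
        "friend":    ("1", "x", "palm_in", "twist", "line"),
    }

    def hamming(code_a, code_b):
        # Number of mismatching components between two sign codes.
        return sum(a != b for a, b in zip(code_a, code_b))

    def classify(predicted_components, table):
        # Code matching: pick the vocabulary entry whose code is closest to the
        # component-classifier outputs (ties broken arbitrarily).
        return min(table, key=lambda word: hamming(predicted_components, table[word]))

    # suppose the five component classifiers mislabel the trajectory of a gesture:
    observed = ("B", "x", "palm_in", "none", "line")
    print(classify(observed, code_table))  # "thank_you" despite one wrong component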

  8. The Signs B [Image Omitted] and B-Bent [Image Omitted] in Israeli Sign Language According to the Theory of Phonology as Human Behavior

    ERIC Educational Resources Information Center

    Fuks, Orit; Tobin, Yishai

    2008-01-01

    The purpose of the present research is to examine which of the two factors: (1) the iconic-semiotic factor; or (2) the human-phonetic factor is more relevant in explaining the appearance and distribution of the hand shape B-bent in Israeli Sign Language (ISL). The B-bent shape has been the subject of much attention in sign language research…

  9. Language Interdependence between American Sign Language and English: A Review of Empirical Studies

    ERIC Educational Resources Information Center

    Rusher, Melissa Ausbrooks

    2012-01-01

    This study provides a contemporary definition of American Sign Language/English bilingual education (AEBE) and outlines an essential theoretical framework. Included is a history and evolution of the methodology. The author also summarizes the general findings of twenty-six (26) empirical studies conducted in the United States that directly or…

  10. Effects of Hearing Status and Sign Language Use on Working Memory

    PubMed Central

    Sarchet, Thomastine; Trani, Alexandra

    2016-01-01

    Deaf individuals have been found to score lower than hearing individuals across a variety of memory tasks involving both verbal and nonverbal stimuli, particularly those requiring retention of serial order. Deaf individuals who are native signers, meanwhile, have been found to score higher on visual-spatial memory tasks than on verbal-sequential tasks and higher on some visual-spatial tasks than hearing nonsigners. However, hearing status and preferred language modality (signed or spoken) frequently are confounded in such studies. That situation is resolved in the present study by including deaf students who use spoken language and sign language interpreting students (hearing signers) as well as deaf signers and hearing nonsigners. Three complex memory span tasks revealed overall advantages for hearing signers and nonsigners over both deaf signers and deaf nonsigners on 2 tasks involving memory for verbal stimuli (letters). There were no differences among the groups on the task involving visual-spatial stimuli. The results are consistent with and extend recent findings concerning the effects of hearing status and language on memory and are discussed in terms of language modality, hearing status, and cognitive abilities among deaf and hearing individuals. PMID:26755684

  11. Japanese Students' Perceptions of Digital Game Use for English-Language Learning in Higher Education

    ERIC Educational Resources Information Center

    Bolliger, Doris U.; Mills, Daniel; White, Jeremy; Kohyama, Megumi

    2015-01-01

    Researchers investigated perceptions of Japanese college students toward the use of digital games in English-language learning. The study was conducted at one large private university in Japan. Undergraduate students who were enrolled in 14 English-language courses were invited to complete a paper-based survey during class time. The survey…

  12. Heritage Language Education without Inheriting Hegemonic Ideologies: Shifting Perspectives on "Korea" in a Weekend Japanese-Language School in the United States

    ERIC Educational Resources Information Center

    Doerr, Neriko Musha; Lee, Kiri

    2016-01-01

    Learning a heritage language can be celebrated to enhance marginalized groups' self-esteem, but a heritage can also encompass ideologies prevalent in the groups' original homeland. Based on ethnographic fieldwork (2007-2011) at a weekend Japanese-language school in the United States, this article investigates how ideologies on race politics…

  13. A FAQ-Based e-Learning Environment to Support Japanese Language Learning

    ERIC Educational Resources Information Center

    Liu, Yuqin; Yin, Chengjiu; Ogata, Hiroaki; Qiao, Guojun; Yano, Yoneo

    2011-01-01

    In traditional classes, having many questions from learners is important because these questions indicate difficult points for learners and for teachers. This paper proposes a FAQ-based e-Learning environment to support Japanese language learning that focuses on learner questions. This knowledge sharing system enables learners to interact and…

  14. Japanese Language as an Organizational Barrier for International Students to Access to University Services: A Case of Aoyama Gakuin University

    ERIC Educational Resources Information Center

    Hiratsuka, Hiroyoshi

    2016-01-01

    In 2011, Aoyama Gakuin University (AGU) started a government-funded degree program (taught in English) to accept international students with limited or no Japanese language proficiency. However, the students faced obstacles in accessing all of the university resources provided. In this article, I investigated Japanese language as an organizational…

  15. Exploring the Efficacy of Online American Sign Language Instruction

    ERIC Educational Resources Information Center

    Radford, Curt L.

    2012-01-01

    Advances in technology have significantly influenced educational delivery options, particularly in the area of American Sign Language (ASL) instruction. As a result, ASL online courses are currently being explored in higher education. The review of literature remains relatively unexplored regarding the effectiveness of learning ASL online. In…

  16. Recognition of Arabic Sign Language Alphabet Using Polynomial Classifiers

    NASA Astrophysics Data System (ADS)

    Assaleh, Khaled; Al-Rousan, M.

    2005-12-01

    Building an accurate automatic sign language recognition system is of great importance in facilitating efficient communication with deaf people. In this paper, we propose the use of polynomial classifiers as a classification engine for the recognition of the Arabic sign language (ArSL) alphabet. Polynomial classifiers have several advantages over other classifiers in that they do not require iterative training and they are highly computationally scalable with the number of classes. Based on polynomial classifiers, we have built an ArSL system and measured its performance using real ArSL data collected from deaf people. We show that the proposed system provides superior recognition results when compared with previously published results using ANFIS-based classification on the same dataset and feature extraction methodology. The comparison is shown in terms of the number of misclassified test patterns. The reduction in the rate of misclassified patterns was very significant. In particular, we achieved a 36% reduction of misclassifications on the training data and 57% on the test data.
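
    To make the "no iterative training" property concrete, the sketch below fits a one-versus-all degree-2 polynomial classifier with a single regularized least-squares solve and classifies by the largest discriminant. The six-dimensional synthetic features and the four-class toy alphabet are assumptions for illustration; the paper's actual feature extraction and polynomial basis are not reproduced here.

    import numpy as np

    def poly_expand(X):
        # Degree-2 polynomial expansion [1, x_i, x_i * x_j]; one common basis,
        # not necessarily the exact one used in the paper.
        n, d = X.shape
        cross = [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
        return np.hstack([np.ones((n, 1)), X, np.column_stack(cross)])

    def fit(X, labels, n_classes, lam=1e-3):
        # Single closed-form solve (no iterative training):
        # W = (P^T P + lam I)^-1 P^T Y with one-hot targets Y.
        P = poly_expand(X)
        Y = np.eye(n_classes)[labels]
        return np.linalg.solve(P.T @ P + lam * np.eye(P.shape[1]), P.T @ Y)

    def predict(X, W):
        return np.argmax(poly_expand(X) @ W, axis=1)

    # toy stand-in for ArSL alphabet features (e.g., glove or vision measurements)
    rng = np.random.default_rng(0)
    centers = rng.normal(size=(4, 6))              # 4 "letters", 6 features each
    labels = np.repeat(np.arange(4), 50)
    X = centers[labels] + 0.3 * rng.normal(size=(200, 6))
    W = fit(X, labels, n_classes=4)
    print("training accuracy:", (predict(X, W) == labels).mean())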

  17. Factors associated with recognition of the signs of dating violence by Japanese junior high school students.

    PubMed

    Nagamatsu, Miyuki; Hamada, Yukiko; Hara, Kenichi

    2016-01-01

    This study investigated factors associated with the ability of Japanese junior high school students to recognize the signs of dating violence. During a period of 20 months (from June 2011 to January 2013), a survey was distributed to 3340 students aged 13-15 years in the second and third grades at 18 junior high schools in a Japanese prefecture. The survey examined gender, recognition of the signs of dating violence, knowledge of dating violence, self-esteem, attitudes toward sexual activity, attitudes toward an equal dating relationship, and relationships with school teachers. Multiple linear regression analyses were performed to identify predictors of the ability of boy and girl respondents to recognize the signs of physical and psychological dating violence. Binary multiple logistic regression analysis was also performed to identify predictors of their ability to recognize the signs of sexual dating violence. The Ethics Committee of Saga University Medical School approved the study protocol. A total of 3050 (91.3%) students participated in this study (1547 boys and 1503 girls). Gender differences were noted with regard to the scores for some of the variables measured. The results indicated that boys who had more knowledge of dating violence, focused on an equal dating relationship, and had a positive relationship with their teachers showed a greater ability to recognize the signs of dating violence. In addition, boys with a conservative attitude toward sexual activity showed a greater ability to recognize the signs of physical and sexual violence. Furthermore, girls who had more knowledge of dating violence, held a conservative attitude toward sexual activity, or focused on an equal dating relationship showed a greater ability to recognize the signs of dating violence. These findings suggest that education programs to prevent dating violence should promote understanding of dating violence with consideration of gender…

  18. The "handedness" of language: Directional symmetry breaking of sign usage in words.

    PubMed

    Ashraf, Md Izhar; Sinha, Sitabhra

    2018-01-01

    Language, which allows complex ideas to be communicated through symbolic sequences, is a characteristic feature of our species and manifested in a multitude of forms. Using large written corpora for many different languages and scripts, we show that the occurrence probability distributions of signs at the left and right ends of words have a distinct heterogeneous nature. Characterizing this asymmetry using quantitative inequality measures, viz. information entropy and the Gini index, we show that the beginning of a word is less restrictive in sign usage than the end. This property is not simply attributable to the use of common affixes as it is seen even when only word roots are considered. We use the existence of this asymmetry to infer the direction of writing in undeciphered inscriptions that agrees with the archaeological evidence. Unlike traditional investigations of phonotactic constraints which focus on language-specific patterns, our study reveals a property valid across languages and writing systems. As both language and writing are unique aspects of our species, this universal signature may reflect an innate feature of the human cognitive phenomenon.
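
    The two inequality measures used in the study can be computed directly from sign (here, letter) frequency counts at either end of words, as in the Python sketch below. The ten-word list is an arbitrary toy corpus, not the corpora analysed in the paper, so the printed numbers illustrate only how the measures are obtained.

    import math
    from collections import Counter

    def entropy(counts):
        # Shannon entropy (bits) of a sign-frequency distribution.
        total = sum(counts.values())
        return -sum(c / total * math.log2(c / total) for c in counts.values())

    def gini(counts):
        # Gini index of inequality of the same distribution.
        vals = sorted(counts.values())
        n, total = len(vals), sum(vals)
        cum = sum((i + 1) * v for i, v in enumerate(vals))
        return (2 * cum) / (n * total) - (n + 1) / n

    words = ["language", "sign", "symbol", "script", "corpus", "entropy",
             "writing", "word", "asymmetry", "alphabet"]
    first = Counter(w[0] for w in words)   # signs at the left end of words
    last  = Counter(w[-1] for w in words)  # signs at the right end of words
    print("initial: H=%.2f bits, Gini=%.2f" % (entropy(first), gini(first)))
    print("final:   H=%.2f bits, Gini=%.2f" % (entropy(last), gini(last)))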

  19. HAPPEN CAN'T HEAR: An Analysis of Code-Blends in Hearing, Native Signers of American Sign Language

    ERIC Educational Resources Information Center

    Bishop, Michele

    2011-01-01

    Hearing native signers often learn sign language as their first language and acquire features that are characteristic of sign languages but are not present in equivalent ways in English (e.g., grammatical facial expressions and the structured use of space for setting up tokens and surrogates). Previous research has indicated that bimodal…

  20. Constructing an Online Test Framework, Using the Example of a Sign Language Receptive Skills Test

    ERIC Educational Resources Information Center

    Haug, Tobias; Herman, Rosalind; Woll, Bencie

    2015-01-01

    This paper presents the features of an online test framework for a receptive skills test that has been adapted, based on a British template, into different sign languages. The online test includes features that meet the needs of the different sign language versions. Features such as usability of the test, automatic saving of scores, and score…

  1. The question of sign-language and the utility of signs in the instruction of the deaf: two papers by Alexander Graham Bell (1898).

    PubMed

    Bell, Alexander Graham

    2005-01-01

    Alexander Graham Bell is often portrayed as either hero or villain of deaf individuals and the Deaf community. His writings, however, indicate that he was neither, and was not as clearly definite in his beliefs about language as is often supposed. The following two articles, reprinted from The Educator (1898), Vol. V, pp. 3-4 and pp. 38-44, capture Bell's thinking about sign language and its use in the classroom. Contrary to frequent claims, Bell does not demand "oral" training for all deaf children--even if he thinks it is the superior alternative--but does advocate for it for "the semi-deaf" and "the semi-mute." "In regard to the others," he writes, "I am not so sure." Although he clearly voices his support for oral methods and fingerspelling (the Rochester method) over sign language, Bell acknowledges the use and utility of signing in a carefully-crafted discussion that includes both linguistics and educational philosophy. In separating the language used at home from that in school and on the playground, Bell reveals a far more complex view of language learning by deaf children than he is often granted. (M. Marschark).

  2. We Need to Communicate! Helping Hearing Parents of Deaf Children Learn American Sign Language

    ERIC Educational Resources Information Center

    Weaver, Kimberly A.; Starner, Thad

    2011-01-01

    Language immersion from birth is crucial to a child's language development. However, language immersion can be particularly challenging for hearing parents of deaf children to provide as they may have to overcome many difficulties while learning American Sign Language (ASL). We are in the process of creating a mobile application to help hearing…

  3. An Investigation of the Need for Sign Language Assessment in Deaf Education

    ERIC Educational Resources Information Center

    Mann, Wolfgang; Prinz, Philip M.

    2006-01-01

    The attitudes of educators of the deaf and other professionals in deaf education concerning assessment of the use of American Sign Language (ASL) and other sign systems was investigated. A questionnaire was distributed to teachers in a residential school for the deaf in California. In addition to questions regarding the availability of sign…

  4. The Relationship between American Sign Language Vocabulary and the Development of Language-Based Reasoning Skills in Deaf Children

    ERIC Educational Resources Information Center

    Henner, Jonathan

    2016-01-01

    The language-based analogical reasoning abilities of Deaf children are a controversial topic. Researchers lack agreement about whether Deaf children possess the ability to reason using language-based analogies, or whether this ability is limited by a lack of access to vocabulary, both written and signed. This dissertation examines factors that…

  5. The Relationship between Kenyan Sign Language and English Literacy

    ERIC Educational Resources Information Center

    Aura, Lillie Josephine; Venville, Grady; Marais, Ida

    2016-01-01

    This paper presents results of an investigation into the relationship between Kenyan Sign Language (KSL) and English literacy skills. It is derived from research undertaken towards an MEd degree awarded by The University of Western Australia in 2011. The study employed a correlational survey strategy. Sixty upper primary deaf students from four…

  6. On Selected Phonological Patterns in Saudi Arabian Sign Language

    ERIC Educational Resources Information Center

    Tomita, Nozomi; Kozak, Viola

    2012-01-01

    This paper focuses on two selected phonological patterns that appear unique to Saudi Arabian Sign Language (SASL). For both sections of this paper, the overall methodology is the same as that discussed in Stephen and Mathur (this volume), with some additional modifications tailored to the specific studies discussed here, which will be expanded…

  7. Practical low-cost visual communication using binary images for deaf sign language.

    PubMed

    Manoranjan, M D; Robinson, J A

    2000-03-01

    Deaf sign language transmitted by video requires a temporal resolution of 8 to 10 frames/s for effective communication. Conventional videoconferencing applications, when operated over low bandwidth telephone lines, provide very low temporal resolution of pictures, of the order of less than a frame per second, resulting in jerky movement of objects. This paper presents a practical solution for sign language communication, offering adequate temporal resolution of images using moving binary sketches or cartoons, implemented on standard personal computer hardware with low-cost cameras and communicating over telephone lines. To extract cartoon points an efficient feature extraction algorithm adaptive to the global statistics of the image is proposed. To improve the subjective quality of the binary images, irreversible preprocessing techniques, such as isolated point removal and predictive filtering, are used. A simple, efficient and fast recursive temporal prefiltering scheme, using histograms of successive frames, reduces the additive and multiplicative noise from low-cost cameras. An efficient three-dimensional (3-D) compression scheme codes the binary sketches. Subjective tests performed on the system confirm that it can be used for sign language communication over telephone lines.
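
    As a rough sketch of the processing chain described above, the Python fragment below applies a simple recursive temporal prefilter to successive frames and then extracts a binary "cartoon" by thresholding gradient magnitude at a level adaptive to the global image statistics. The exponential-average prefilter and the mean-plus-k-standard-deviations threshold are stand-ins chosen for illustration; the paper's histogram-based prefilter, isolated-point removal, and 3-D compression stages are not reproduced here.

    import numpy as np

    def binary_sketch(frame, k=1.5):
        # Crude cartoon extraction: gradient magnitude thresholded at
        # mean + k * std, i.e. a threshold adaptive to global image statistics.
        gy, gx = np.gradient(frame.astype(float))
        mag = np.hypot(gx, gy)
        return mag > (mag.mean() + k * mag.std())

    def temporal_prefilter(prev_filtered, frame, alpha=0.6):
        # Simple recursive prefilter to suppress low-cost-camera noise before
        # sketch extraction; a stand-in for the histogram-based scheme in the paper.
        return alpha * frame + (1 - alpha) * prev_filtered

    # toy two-frame "video" of a bright square on a noisy background
    rng = np.random.default_rng(0)
    frames = []
    for _ in range(2):
        f = 20 * rng.random((64, 64))
        f[20:44, 20:44] += 120
        frames.append(f)

    filtered = frames[0]
    for f in frames[1:]:
        filtered = temporal_prefilter(filtered, f)
    sketch = binary_sketch(filtered)
    print("cartoon points to transmit:", int(sketch.sum()), "of", sketch.size)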

  8. Using the Hands to Represent Objects in Space: Gesture as a Substrate for Signed Language Acquisition.

    PubMed

    Janke, Vikki; Marshall, Chloë R

    2017-01-01

    An ongoing issue of interest in second language research concerns what transfers from a speaker's first language to their second. For learners of a sign language, gesture is a potential substrate for transfer. Our study provides a novel test of gestural production by eliciting silent gesture from novices in a controlled environment. We focus on spatial relationships, which in sign languages are represented in a very iconic way using the hands, and which one might therefore predict to be easy for adult learners to acquire. However, a previous study by Marshall and Morgan (2015) revealed that this was only partly the case: in a task that required them to express the relative locations of objects, hearing adult learners of British Sign Language (BSL) could represent objects' locations and orientations correctly, but had difficulty selecting the correct handshapes to represent the objects themselves. If hearing adults are indeed drawing upon their gestural resources when learning sign languages, then their difficulties may have stemmed from their having in manual gesture only a limited repertoire of handshapes to draw upon, or, alternatively, from having too broad a repertoire. If the first hypothesis is correct, the challenge for learners is to extend their handshape repertoire, but if the second is correct, the challenge is instead to narrow down to the handshapes appropriate for that particular sign language. 30 sign-naïve hearing adults were tested on Marshall and Morgan's task. All used some handshapes that were different from those used by native BSL signers and learners, and the set of handshapes used by the group as a whole was larger than that employed by native signers and learners. Our findings suggest that a key challenge when learning to express locative relations might be reducing from a very large set of gestural resources, rather than supplementing a restricted one, in order to converge on the conventionalized classifier system that forms part of the

  9. "So What Is the Appeal?" The Phenomenon of Japanese as a Foreign Language in Hong Kong

    ERIC Educational Resources Information Center

    Humphreys, Gillian; Miyazoe-Wong, Yuko

    2007-01-01

    In spite of long-standing political tensions between Japan and the People's Republic of China, Japanese remains a highly popular language to learn in Hong Kong. This is evidenced by the growth in number of Japanese-related courses and programmes offered at schools and universities in the Special Administrative Region. Although Japan is a dominant…

  10. Using the "Common European Framework of Reference for Languages" to Teach Sign Language to Parents of Deaf Children

    ERIC Educational Resources Information Center

    Snoddon, Kristin

    2015-01-01

    No formal Canadian curriculum presently exists for teaching American Sign Language (ASL) as a second language to parents of deaf and hard of hearing children. However, this group of ASL learners is in need of more comprehensive, research-based support, given the rapid expansion in Canada of universal neonatal hearing screening and the…

  11. Handling Japanese without a Japanese Operating System.

    ERIC Educational Resources Information Center

    Hatasa, Kazumi; And Others

    1992-01-01

    The Macintosh HyperCard environment has become a popular platform for Japanese language courseware because of its flexibility and ease of programming. This project created Japanese bitmap font files for JIS Levels 1 and 2 and wrote XFCNs for font manipulation, Japanese kana input, and answer correction. (12 references) (Author/LB)

  12. Language between Bodies: A Cognitive Approach to Understanding Linguistic Politeness in American Sign Language

    ERIC Educational Resources Information Center

    Roush, Daniel R.

    2011-01-01

    This article proposes an answer to the primary question of how the American Sign Language (ASL) community in the United States conceptualizes (im)politeness and its related notions. It begins with a review of evolving theoretical issues in research on (im)politeness and related methodological problems with studying (im)politeness in natural…

  13. The ABCs of New Zealand Sign Language: Aerial Spelling.

    ERIC Educational Resources Information Center

    Forman, Wayne

    2003-01-01

    Aerial spelling is the term given for the way many people with deafness in New Zealand (NZ) manually represent letters of the alphabet. This article examines the nature and role of aerial spelling in New Zealand Sign Language, particularly that form used by older members of the NZ deaf community. (Contains references.) (Author/CR)

  14. Sign Language Culture as Part of Multiculturalism in Hungary

    ERIC Educational Resources Information Center

    Sarolta, Simigne Fenyo

    2011-01-01

    The objective of the present study is to investigate sign language culture as part of multiculturalism in Hungary. The study consists of two parts. Referring to the 13 national and linguistic minorities living in the territory of Hungary, the first part gives a short account of the narrower interpretation of multiculturalism according to which it…

  15. Morphological Innovation in the Acquisition of American Sign Language.

    ERIC Educational Resources Information Center

    van Hoek, Karen; And Others

    A study examined aspects of the acquisition of spatialized morphology and syntax in American Sign Language (ASL) learned natively by deaf children of deaf parents. Children aged 2 to 8 were shown story books to elicit narratives, and the resulting use of verbs contained morphological forms not appearing in adult grammar. Analysis of the creative…

  16. Gesture and Signing in Support of Expressive Language Development

    ERIC Educational Resources Information Center

    Baker-Ramos, Leslie K.

    2017-01-01

    The purpose of this teacher inquiry is to explore the effects of signing and gesturing on the expressive language development of non-verbal children. The first phase of my inquiry begins with the observations of several non-verbal students with various etiologies in three different educational settings. The focus of these observations is to…

  17. Chemistry through the language barrier. How to scan chemical articles in foreign languages with emphasis on Russian and Japanese

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, E.E.

    1970-01-01

    The thesis of this book is that one can learn a surprising amount of information from scientific articles in a foreign language. Use is made of symbols, numbers, etc., (which appear in familiar Roman script) and words that are similar in many languages (e.g., reaction, reazione, reaccion, reactie, reaccao, reaktion, reaksjon, reaktionen, reakcio, rektio, reaktsiya, reakcja, and reakce). Most European languages (Germanic, Latin, et al.) are included, with special emphasis on Russian and Japanese. The technique is illustrated with examples from organic chemistry, but the approach should be applicable to scientific writing in any subject area. (RWR)

  18. Assessing Health Literacy in Deaf American Sign Language Users.

    PubMed

    McKee, Michael M; Paasche-Orlow, Michael K; Winters, Paul C; Fiscella, Kevin; Zazove, Philip; Sen, Ananda; Pearson, Thomas

    2015-01-01

    Communication and language barriers isolate Deaf American Sign Language (ASL) users from mass media, health care messages, and health care communication, which, when coupled with social marginalization, places them at a high risk for inadequate health literacy. Our objectives were to translate, adapt, and develop an accessible health literacy instrument in ASL and to assess the prevalence and correlates of inadequate health literacy among Deaf ASL users and hearing English speakers using a cross-sectional design. A total of 405 participants (166 Deaf and 239 hearing) were enrolled in the study. The Newest Vital Sign was adapted, translated, and developed into an ASL version (ASL-NVS). We found that 48% of Deaf participants had inadequate health literacy, and Deaf individuals were 6.9 times more likely than hearing participants to have inadequate health literacy. The new ASL-NVS, available on a self-administered computer platform, demonstrated good correlation with reading literacy. The prevalence of Deaf ASL users with inadequate health literacy is substantial, warranting further interventions and research.

  19. Assessing Health Literacy in Deaf American Sign Language Users

    PubMed Central

    McKee, Michael M.; Paasche-Orlow, Michael; Winters, Paul C.; Fiscella, Kevin; Zazove, Philip; Sen, Ananda; Pearson, Thomas

    2015-01-01

    Communication and language barriers isolate Deaf American Sign Language (ASL) users from mass media, healthcare messages, and health care communication, which when coupled with social marginalization, places them at a high risk for inadequate health literacy. Our objectives were to translate, adapt, and develop an accessible health literacy instrument in ASL and to assess the prevalence and correlates of inadequate health literacy among Deaf ASL users and hearing English speakers using a cross-sectional design. A total of 405 participants (166 Deaf and 239 hearing) were enrolled in the study. The Newest Vital Sign was adapted, translated, and developed into an ASL version of the NVS (ASL-NVS). Forty-eight percent of Deaf participants had inadequate health literacy, and Deaf individuals were 6.9 times more likely than hearing participants to have inadequate health literacy. The new ASL-NVS, available on a self-administered computer platform, demonstrated good correlation with reading literacy. The prevalence of Deaf ASL users with inadequate health literacy is substantial, warranting further interventions and research. PMID:26513036

  20. Interactional Competence in Japanese as an Additional Language. Pragmatics & Interaction. Volume 4

    ERIC Educational Resources Information Center

    Greer, Tim, Ed.; Ishida, Midori, Ed.; Tateyama, Yumiko, Ed.

    2017-01-01

    In the research literature on interactional competence in talk among second language speakers and their coparticipants, this volume of "Pragmatics & Interaction" is the first to focus on interaction in Japanese. The chapters examine the use and development of interactional practices in a wide range of social settings, from everyday…

  1. Development of Geography and Geology Terminology in British Sign Language

    NASA Astrophysics Data System (ADS)

    Meara, Rhian; Cameron, Audrey; Quinn, Gary; O'Neill, Rachel

    2016-04-01

    The BSL Glossary Project, run by the Scottish Sensory Centre at the University of Edinburgh focuses on developing scientific terminology in British Sign Language for use in the primary, secondary and tertiary education of deaf and hard of hearing students within the UK. Thus far, the project has developed 850 new signs and definitions covering Chemistry, Physics, Biology, Astronomy and Mathematics. The project has also translated examinations into BSL for students across Scotland. The current phase of the project has focused on developing terminology for Geography and Geology subjects. More than 189 new signs have been developed in these subjects including weather, rivers, maps, natural hazards and Geographical Information Systems. The signs were developed by a focus group with expertise in Geography and Geology, Chemistry, Ecology, BSL Linguistics and Deaf Education all of whom are deaf fluent BSL users.

  2. A Barking Dog That Never Bites? The British Sign Language (Scotland) Bill

    ERIC Educational Resources Information Center

    De Meulder, Maartje

    2015-01-01

    This article describes and analyses the pathway to the British Sign Language (Scotland) Bill and the strategies used to reach it. Data collection has been done by means of interviews with key players, analysis of official documents, and participant observation. The article discusses the bill in relation to the Gaelic Language (Scotland) Act 2005…

  3. The Evolution of Networked Computing in the Teaching of Japanese as a Foreign Language.

    ERIC Educational Resources Information Center

    Harrison, Richard

    1998-01-01

    Reviews the evolution of Internet-based projects in Japanese computer-assisted language learning and suggests future directions in which the field may develop, based on emerging network technology and learning theory. (Author/VWL)

  4. Realities, Rewards, and Risks of Heritage-Language Education: Perspectives from Japanese Immigrant Parents in a Midwestern Community

    ERIC Educational Resources Information Center

    Endo, R.

    2013-01-01

    This ethnographic case study describes how three Japanese immigrant parents in a midsize urban community in the Midwest viewed heritage-language education in relation to their children's socioemotional development as bicultural Americans. The literature review offers a comparative and historical analysis of Japanese schools in the diaspora to…

  5. [Understanding the symbolic values of Japanese onomatopoeia: comparison of Japanese and Chinese speakers].

    PubMed

    Haryu, Etsuko; Zhao, Lihua

    2007-10-01

    Do non-native speakers of the Japanese language understand the symbolic values of Japanese onomatopoeia matching a voiced/unvoiced consonant with a big/small sound made by a big/small object? In three experiments, participants who were native speakers of Japanese, Japanese-learning Chinese, or Chinese without knowledge of the Japanese language were shown two pictures. One picture was of a small object making a small sound, such as a small vase being broken, and the other was of a big object making a big sound, such as a big vase being broken. Participants were presented with two novel onomatopoetic words with voicing contrasts, e.g.,/dachan/vs./tachan/, and were told that each word corresponded to one of the two pictures. They were then asked to match the words to the corresponding pictures. Chinese without knowledge of Japanese performed only at chance level, whereas Japanese and Japanese-learning Chinese successfully matched a voiced/unvoiced consonant with a big/small object respectively. The results suggest that the key to understanding the symbolic values of voicing contrasts in Japanese onomatopoeia is some basic knowledge that is intrinsic to the Japanese language.

  6. Within- and across-language spectral and temporal variability of vowels in different phonetic and prosodic contexts: Russian and Japanese

    NASA Astrophysics Data System (ADS)

    Gilichinskaya, Yana D.; Hisagi, Miwako; Law, Franzo F.; Berkowitz, Shari; Ito, Kikuyo

    2005-04-01

    Contextual variability of vowels in three languages with large vowel inventories was examined previously. Here, variability of vowels in two languages with small inventories (Russian, Japanese) was explored. Vowels were produced by three female speakers of each language in four contexts: (Vba) disyllables and in 3-syllable nonsense words (gaC1VC2a) embedded within carrier sentences; contexts included bilabial stops (bVp) in normal rate sentences and alveolar stops (dVt) in both normal and rapid rate sentences. Dependent variables were syllable durations and formant frequencies at syllable midpoint. Results showed very little variation across consonant and rate conditions in formants for /i/ in both languages. Japanese short /u, o, a/ showed fronting (F2 increases) in alveolar context relative to labial context (1.3-2.0 Barks), which was more pronounced in rapid sentences. Fronting of Japanese long vowels was less pronounced (0.3 to 0.9 Barks). Japanese long/short vowel ratios varied with speaking style (syllables versus sentences) and speaking rate. All Russian vowels except /i/ were fronted in alveolar vs labial context (1.1-3.1 Barks) but showed little change in either spectrum or duration with speaking rate. Comparisons of these patterns of variability with American English, French and German vowel results will be discussed.
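
    For readers unfamiliar with the Bark scale in which the fronting effects are reported, the sketch below converts formant frequencies from Hz to Barks using Traunmüller's (1990) approximation and computes a context shift. The F2 values are hypothetical, and the study does not state which Hz-to-Bark conversion it used.

    def hz_to_bark(f_hz):
        # Traunmüller's (1990) Hz-to-Bark approximation.
        return 26.81 * f_hz / (1960.0 + f_hz) - 0.53

    # hypothetical F2 midpoint values for a short vowel in labial vs. alveolar context
    f2_labial, f2_alveolar = 1300.0, 1750.0
    shift = hz_to_bark(f2_alveolar) - hz_to_bark(f2_labial)
    print(f"fronting of {shift:.1f} Barks")  # about 2.0 Barks for these toy values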

  7. Access to Sign Language Interpreters in the Criminal Justice System.

    ERIC Educational Resources Information Center

    Miller, Katrina R.

    2001-01-01

    This study surveyed 46 professional sign language interpreters working in criminal justice settings and evaluated 22 cases to evaluate access issues for individuals with hearing impairments. Recommendations to increase the accessibility of interpreting services included providing ongoing awareness training to criminal justice personnel and…

  8. The Link between Form and Meaning in British Sign Language: Effects of Iconicity for Phonological Decisions

    ERIC Educational Resources Information Center

    Thompson, Robin L.; Vinson, David P.; Vigliocco, Gabriella

    2010-01-01

    Signed languages exploit the visual/gestural modality to create iconic expression across a wide range of basic conceptual structures in which the phonetic resources of the language are built up into an analogue of a mental image (Taub, 2001). Previously, we demonstrated a processing advantage when iconic properties of signs were made salient in a…

  9. Gesture in Multiparty Interaction: A Study of Embodied Discourse in Spoken English and American Sign Language

    ERIC Educational Resources Information Center

    Shaw, Emily P.

    2013-01-01

    This dissertation is an examination of gesture in two game nights: one in spoken English between four hearing friends and another in American Sign Language between four Deaf friends. Analyses of gesture have shown there exists a complex integration of manual gestures with speech. Analyses of sign language have implicated the body as a medium…

  10. Foreign Language Learners' Motivation and Its Effects on Their Achievement: Implications for Effective Teaching of Students Studying Japanese at Universiti Brunei Darussalam

    ERIC Educational Resources Information Center

    Keaney, Minako; Mundia, Lawrence

    2014-01-01

    An increasing number of students at the University of Brunei Darussalam are studying the Japanese language. However, research on the relationship between learners' motivation and their achievement has not been given sufficient attention in Japanese foreign language education compared to English in Brunei. The present study, which utilized a…

  11. The “handedness” of language: Directional symmetry breaking of sign usage in words

    PubMed Central

    2018-01-01

    Language, which allows complex ideas to be communicated through symbolic sequences, is a characteristic feature of our species and manifested in a multitude of forms. Using large written corpora for many different languages and scripts, we show that the occurrence probability distributions of signs at the left and right ends of words have a distinct heterogeneous nature. Characterizing this asymmetry using quantitative inequality measures, viz. information entropy and the Gini index, we show that the beginning of a word is less restrictive in sign usage than the end. This property is not simply attributable to the use of common affixes as it is seen even when only word roots are considered. We use the existence of this asymmetry to infer the direction of writing in undeciphered inscriptions that agrees with the archaeological evidence. Unlike traditional investigations of phonotactic constraints which focus on language-specific patterns, our study reveals a property valid across languages and writing systems. As both language and writing are unique aspects of our species, this universal signature may reflect an innate feature of the human cognitive phenomenon. PMID:29342176

  12. Introducing Genre into Japanese-as-a-Foreign-Language: Toward a Genre-Specific Approach to Elementary/Intermediate Writing

    ERIC Educational Resources Information Center

    Shinji, Kawamitsu

    2015-01-01

    Despite the social turn in views of language and the increasing attention to an application of genre theory in teaching languages, the field of Japanese-as-a-Foreign-Language (JFL) has not yet found genre a valuable resource for approaching learners' writing ability. Writing is still practiced as a psycholinguistic space to check learners'…

  13. Lost in Translation: Strategies Japanese Language Learners Use in Communicating Culturally Specific L1 Expressions in English

    ERIC Educational Resources Information Center

    Inoue, Noriyuki; Molina, Sarina Chugani

    2011-01-01

    Communicating in a second language could be seen as a process requiring the deconstruction and reconstruction of cultural meanings. If this is the case, how do second language (L2) learners express cultural meanings of their first language (L1) expressions that do not have semantically equivalent L2 expressions? Twenty-nine Japanese students…

  14. Fellow Language Learners as Producers of Knowledge and Understandings: A Case of a Tertiary Japanese Linguistics Course

    ERIC Educational Resources Information Center

    Minagawa, Harumi

    2017-01-01

    This paper reports students' experiences of a coursework task in a Japanese linguistics course that embraces certain aspects of collaborative learning--aspects that are not practised widely in Japanese language learning situations. These involve the students looking at themselves as well as their fellow students as producers of knowledge and…

  15. Edmodo as a Tool for the Global Connection between Japanese and American College Students in Language Learning

    ERIC Educational Resources Information Center

    Okumura, Shinji

    2017-01-01

    This study investigated how English learners at a university in Japan perceive connections with students in the US through an educational social network platform, called Edmodo. The instructor of English at the Japanese university cooperated with a Japanese language instructor at an American university and they incorporated Edmodo into their…

  16. Gender Identity in a Second Language: The Use of First Person Pronouns by Male Learners of Japanese

    ERIC Educational Resources Information Center

    Brown, Lucien; Cheek, Elizabeth

    2017-01-01

    This is a qualitative sociocultural study examining how five advanced-level learners of Japanese from the United States use gendered first person pronouns to negotiate their identities. Japanese does not have a ubiquitous pronoun such as English "I." Instead, the language contains forms that are marked for formality and gender, including…

  17. Why American Sign Language Gloss Must Matter.

    PubMed

    Supalla, Samuel J; Cripps, Jody H; Byrne, Andrew P

    2017-01-01

    Responding to an article by Grushkin on how deaf children best learn to read, published, along with the present article, in an American Annals of the Deaf special issue, the authors review American Sign Language gloss. Topics include how ASL gloss enables deaf children to learn to read in their own language and simultaneously experience a transition to written English, and what gloss looks like and how it underlines deaf children's learning and mastery of English literacy through ASL. Rebuttal of Grushkin's argument includes data describing a deaf child's engagement in reading aloud (entirely in ASL) with a gloss text, which occurred without the breakdown implied by Grushkin. The authors characterize Grushkin's argument that deaf children need to learn to read through a conventional ASL writing system as limiting, asserting that ASL gloss contributes more by providing a path for learning and mastering English literacy.

  18. A comparison of English and Japanese taste languages: taste descriptive methodology, codability and the umami taste.

    PubMed

    O'Mahony, M; Ishii, R

    1986-05-01

    Everyday taste descriptions for a range of stimuli were obtained from selected groups of American and Japanese subjects, using a variety of stimuli, stimulus presentation procedures and response conditions. In English there was a tendency to use a quadripartite classification system: 'sweet', 'sour', 'salty' and 'bitter'. The Japanese had a different strategy, adding a fifth label: 'Ajinomoto', referring to the taste of monosodium glutamate. This label was generally replaced by umami--the scientific term--by Japanese who were workers or trained tasters involved with glutamate manufacture. Cultural differences in taste language have consequences for taste psychophysicists who impose a quadripartite restriction on allowable taste descriptions. Stimulus presentation by filter-paper or aqueous solution elicited the same response trends. Language codability was only an indicator of degree of taste mixedness/singularity if used statistically with samples of sufficient size; it had little value as an indicator for individual subjects.

  19. Bridge of Signs: Can Sign Language Empower Non-Deaf Children to Triumph over Their Communication Disabilities?

    ERIC Educational Resources Information Center

    Toth, Anne

    2009-01-01

    This pilot research project examined the use of sign language as a communication bridge for non-Deaf children between the ages of 0-6 years who had been diagnosed with, or whose communication difficulties suggested, the presence of such disorders as Autism, Down Syndrome, Fetal Alcohol Spectrum Disorder (FASD), and/or learning disabilities.…

  20. How Deaf American Sign Language/English Bilingual Children Become Proficient Readers: An Emic Perspective

    ERIC Educational Resources Information Center

    Mounty, Judith L.; Pucci, Concetta T.; Harmon, Kristen C.

    2014-01-01

    A primary tenet underlying American Sign Language/English bilingual education for deaf students is that early access to a visual language, developed in conjunction with language planning principles, provides a foundation for literacy in English. The goal of this study is to obtain an emic perspective on bilingual deaf readers transitioning from…

  1. Variation in handshape and orientation in British Sign Language: The case of the ‘1’ hand configuration

    PubMed Central

    Fenlon, Jordan; Schembri, Adam; Rentelis, Ramas; Cormier, Kearsy

    2013-01-01

    This paper investigates phonological variation in British Sign Language (BSL) signs produced with a ‘1’ hand configuration in citation form. Multivariate analyses of 2084 tokens reveal that handshape variation in these signs is constrained by linguistic factors (e.g., the preceding and following phonological environment, grammatical category, indexicality, lexical frequency). The only significant social factor was region. For the subset of signs where orientation was also investigated, only grammatical function was important (the surrounding phonological environment and social factors were not significant). The implications for an understanding of pointing signs in signed languages are discussed. PMID:23805018

  2. Response bias reveals enhanced attention to inferior visual field in signers of American Sign Language.

    PubMed

    Dye, Matthew W G; Seymour, Jenessa L; Hauser, Peter C

    2016-04-01

    Deafness results in cross-modal plasticity, whereby visual functions are altered as a consequence of a lack of hearing. Here, we present a reanalysis of data originally reported by Dye et al. (PLoS One 4(5):e5640, 2009) with the aim of testing additional hypotheses concerning the spatial redistribution of visual attention due to deafness and the use of a visuogestural language (American Sign Language). By looking at the spatial distribution of errors made by deaf and hearing participants performing a visuospatial selective attention task, we sought to determine whether there was evidence for (1) a shift in the hemispheric lateralization of visual selective function as a result of deafness, and (2) a shift toward attending to the inferior visual field in users of a signed language. While no evidence was found for or against a shift in lateralization of visual selective attention as a result of deafness, a shift in the allocation of attention from the superior toward the inferior visual field was inferred in native signers of American Sign Language, possibly reflecting an adaptation to the perceptual demands imposed by a visuogestural language.

  3. The Neural Correlates of Highly Iconic Structures and Topographic Discourse in French Sign Language as Observed in Six Hearing Native Signers

    ERIC Educational Resources Information Center

    Courtin, C.; Herve, P. -Y.; Petit, L.; Zago, L.; Vigneau, M.; Beaucousin, V.; Jobard, G.; Mazoyer, B.; Mellet, E.; Tzourio-Mazoyer, N.

    2010-01-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four-dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language specific manner via the use of signing-space and…

  4. The Nihongo Tutorial System: An Intelligent Tutoring System for Technical Japanese Language Instruction.

    ERIC Educational Resources Information Center

    Maciejewski, Anthony A.; Leung, Nelson K.

    1992-01-01

    The Nihongo Tutorial System is designed to assist English-speaking scientists and engineers in acquiring reading proficiency in Japanese technical literature. It provides individualized lessons that match interest area/language ability with available materials that are encoded with syntactic, phonetic, and morphological information. (14…

  5. Using Signs to Facilitate Vocabulary in Children with Language Delays

    ERIC Educational Resources Information Center

    Lederer, Susan Hendler; Battaglia, Dana

    2015-01-01

    The purpose of this article is to explore recommended practices in choosing and using key word signs (i.e., simple single-word gestures for communication) to facilitate first spoken words in hearing children with language delays. Developmental, theoretical, and empirical supports for this practice are discussed. Practical recommendations for…

  6. Promoting International Posture through History as Content and Language Integrated Learning (CLIL) in the Japanese Context

    ERIC Educational Resources Information Center

    Lockley, Thomas

    2015-01-01

    This article uses the conceptual framework of second language willingness to communicate (L2 WTC), and in particular the contributory construct of international posture (IP; Yashima, 2002), to report on a content and language integrated learning (CLIL) course taught in the Japanese university context. The research follows up an exploratory,…

  7. A Sociolinguistic and Sociocultural Approach to Attitudinal Dispositions of Graduated Students toward the Business Japanese Language

    ERIC Educational Resources Information Center

    Özsen, Tolga; Özbek, Aydin

    2016-01-01

    Effective use of nonverbal and verbal communication in Japanese, such as gestures, facial expressions, silence, and the use of grammatical or lexical honorifics, plays a significant role in determining whether foreign language learners succeed in obtaining their intended employment. This study examines the second language (L2) learning of politeness and social…

  8. Mexican sign language recognition using normalized moments and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Solís-V., J.-Francisco; Toxqui-Quitl, Carina; Martínez-Martínez, David; H.-G., Margarita

    2014-09-01

    This work presents a framework designed for Mexican Sign Language (MSL) recognition. A data set of 24 static MSL signs, with 5 versions of each, was captured using a digital camera under incoherent light conditions. Digital image processing was used to segment the hand gestures; a uniform background was chosen so that gloved hands or special markers were not needed. Feature extraction was performed by calculating normalized geometric moments of the gray-scale sign images, and an Artificial Neural Network then performed the recognition, evaluated with 10-fold cross-validation in Weka; the best result achieved a recognition rate of 95.83%.
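
    As a rough illustration of the pipeline described above, the sketch below computes normalized central geometric moments of a gray-scale image and notes how they could feed a small neural network evaluated with 10-fold cross-validation (scikit-learn standing in for the Weka network). The moment order, the network architecture, and the synthetic demo image are assumptions; the abstract does not give the paper's exact configuration.

```python
import numpy as np

def normalized_moments(img, max_order=3):
    """Normalized central geometric moments eta_pq of a gray-scale image."""
    img = img.astype(float)
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xbar, ybar = (x * img).sum() / m00, (y * img).sum() / m00
    feats = []
    for p in range(max_order + 1):
        for q in range(max_order + 1):
            if 2 <= p + q <= max_order:
                mu_pq = (((x - xbar) ** p) * ((y - ybar) ** q) * img).sum()
                feats.append(mu_pq / m00 ** ((p + q) / 2 + 1))  # scale-invariant
    return np.array(feats)

# Synthetic stand-in for a segmented sign image (the real data: 24 signs x 5 versions).
demo = np.zeros((64, 64))
demo[20:44, 16:40] = 255
print(normalized_moments(demo))

# Classification stage (assumed scikit-learn stand-in for the Weka network):
# from sklearn.neural_network import MLPClassifier
# from sklearn.model_selection import cross_val_score
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
# print(cross_val_score(clf, X, y, cv=10).mean())  # 10-fold cross-validation
```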

  9. Differences in Language Skills: Heritage Language Learner Subgroups and Foreign Language Learners

    ERIC Educational Resources Information Center

    Kondo-Brown, Kimi

    2005-01-01

    Using both proficiency tests and self-assessment measures, this study investigated (a) whether 3 subgroups of Japanese heritage language (JHL) learners would demonstrate language behaviors distinctively different from those of traditional Japanese as a foreign language (JFL) learners, and (b) which domains of language use and skills would…

  10. Cerebral organization of oral and signed language responses: case study evidence from amytal and cortical stimulation studies.

    PubMed

    Mateer, C A; Rapport, R L; Kettrick, C

    1984-01-01

    A normally hearing left-handed patient familiar with American Sign Language (ASL) was assessed under sodium amytal conditions and with left cortical stimulation in both oral speech and signed English. Lateralization was mixed but complementary in each language mode: right hemisphere perfusion severely disrupted motoric aspects of both types of language expression, whereas left hemisphere perfusion specifically disrupted features of grammatical and semantic usage in each mode of expression. Both semantic and syntactic aspects of oral and signed responses were altered during left posterior temporal-parietal stimulation. Findings are discussed in terms of the neurological organization of ASL and linguistic organization in cases of early left hemisphere damage.

  11. American Sign Language Comprehension Test: A Tool for Sign Language Researchers.

    PubMed

    Hauser, Peter C; Paludneviciene, Raylene; Riddle, Wanda; Kurz, Kim B; Emmorey, Karen; Contreras, Jessica

    2016-01-01

    The American Sign Language Comprehension Test (ASL-CT) is a 30-item multiple-choice test that measures ASL receptive skills and is administered through a website. This article describes the development and psychometric properties of the test based on a sample of 80 college students including deaf native signers, hearing native signers, deaf non-native signers, and hearing ASL students. The results revealed that the ASL-CT has good internal reliability (α = 0.834). Discriminant validity was established by demonstrating that deaf native signers performed significantly better than deaf non-native signers and hearing native signers. Concurrent validity was established by demonstrating that test results positively correlated with another measure of ASL ability (r = .715) and that hearing ASL students' performance positively correlated with the level of ASL courses they were taking (r = .726). Researchers can use the ASL-CT to characterize an individual's ASL comprehension skills, to establish a minimal skill level as an inclusion criterion for a study, to group study participants by ASL skill (e.g., proficient vs. nonproficient), or to provide a measure of ASL skill as a dependent variable.

  12. The Effect of Sign Language Rehearsal on Deaf Subjects' Immediate and Delayed Recall of English Word Lists.

    ERIC Educational Resources Information Center

    Bonvillian, John D.; And Others

    1987-01-01

    The relationship between sign language rehearsal and written free recall was examined by having deaf college students rehearse the sign language equivalents of printed English words. Studies of both immediate and delayed memory suggested that word recall increased as a function of total rehearsal frequency and frequency of appearance in rehearsal…

  13. Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience.

    PubMed

    Fang, Yuxing; Chen, Quanjing; Lingnau, Angelika; Han, Zaizhu; Bi, Yanchao

    2016-01-01

    The observation of other people's actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of IFG and IPL for deaf participants during non-linguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people's actions, we examined the effects of tasks (action understanding and passive viewing) and effectors (arm actions vs. leg actions), as well as sign language experience in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found a stronger activation during an action recognition task in comparison to a low-level visual control task in IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence demonstrating that the response obtained in IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.

  14. An Investigation into the Relationship of Foreign Language Learning Motivation and Sign Language Use among Deaf and Hard of Hearing Hungarians

    ERIC Educational Resources Information Center

    Kontra, Edit H.; Csizer, Kata

    2013-01-01

    The aim of this study is to point out the relationship between foreign language learning motivation and sign language use among hearing impaired Hungarians. In the article we concentrate on two main issues: first, to what extent hearing impaired people are motivated to learn foreign languages in a European context; second, to what extent sign…

  15. Emergency Department utilization among Deaf American Sign Language users.

    PubMed

    McKee, Michael M; Winters, Paul C; Sen, Ananda; Zazove, Philip; Fiscella, Kevin

    2015-10-01

    Deaf American Sign Language (ASL) users comprise a linguistic minority population with poor health care access due to communication barriers and low health literacy. Potentially, these health care barriers could increase Emergency Department (ED) use. The objective was to compare ED use between deaf and non-deaf patients using a retrospective cohort drawn from medical records. The sample was derived from 400 randomly selected charts (200 deaf ASL users and 200 hearing English speakers) from an outpatient primary care health center with a high volume of deaf patients. Abstracted data included patient demographics, insurance, health behavior, and ED use in the past 36 months. Deaf patients were more likely to be never-smokers and to be insured through Medicaid. In an adjusted analysis, deaf individuals were significantly more likely to use the ED (odds ratio [OR], 1.97; 95% confidence interval [CI], 1.11-3.51) over the prior 36 months. Deaf American Sign Language users appear to be at greater odds of elevated ED utilization when compared to the general hearing population. Efforts to further understand the drivers of increased ED utilization among deaf ASL users are much needed.

  16. Distinctive Feature Extraction for Indian Sign Language (ISL) Gesture using Scale Invariant Feature Transform (SIFT)

    NASA Astrophysics Data System (ADS)

    Patil, Sandeep Baburao; Sinha, G. R.

    2017-02-01

    Limited awareness of the needs of deaf and hard-of-hearing people in India widens the communication gap between the deaf community and the hearing population. Sign languages are developed so that deaf and hard-of-hearing people can convey their messages by producing distinct sign patterns. The scale invariant feature transform (SIFT) was introduced by David Lowe to perform reliable matching between different images of the same object. This paper implements the various phases of the scale invariant feature transform to extract distinctive features from Indian Sign Language gestures. The experimental results report the processing time for each phase and the number of features extracted for 26 ISL gestures.
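
    The SIFT phases referenced above (scale-space extrema detection, keypoint localization, orientation assignment, descriptor generation) are available off the shelf in OpenCV; the short sketch below shows how the 128-dimensional descriptors for a single gesture image could be obtained. The file name is a placeholder, and using OpenCV rather than a from-scratch implementation is an assumption, not the paper's stated method.

```python
import cv2  # OpenCV >= 4.4, where SIFT is included in the main module

def sift_features(image_path):
    """Return SIFT keypoints and 128-D descriptors for one gesture image."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    return keypoints, descriptors

# Hypothetical file name; in the paper, 26 ISL gestures were processed this way.
# kps, desc = sift_features("isl_gesture_A.png")
# print(len(kps), "keypoints, descriptor matrix shape:", desc.shape)
```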

  17. A Crosslinguistic, Crosscultural Analysis of Metaphors in Two Italian Sign Language (LIS) Registers

    ERIC Educational Resources Information Center

    Russo, Tommaso

    2005-01-01

    This article deals with two main topics: the interplay of iconicity and metaphors in signed language discourse and the relevance of sociocultural knowledge for a full understanding of LIS metaphors. In metaphors, the iconic features of signs play a role in the creative process of determining a mental fit between two different domains. Iconicity…

  18. Neural organization of linguistic short-term memory is sensory modality-dependent: evidence from signed and spoken language.

    PubMed

    Pa, Judy; Wilson, Stephen M; Pickell, Herbert; Bellugi, Ursula; Hickok, Gregory

    2008-12-01

    Despite decades of research, there is still disagreement regarding the nature of the information that is maintained in linguistic short-term memory (STM). Some authors argue for abstract phonological codes, whereas others argue for more general sensory traces. We assess these possibilities by investigating linguistic STM in two distinct sensory-motor modalities, spoken and signed language. Hearing bilingual participants (native in English and American Sign Language) performed equivalent STM tasks in both languages during functional magnetic resonance imaging. Distinct, sensory-specific activations were seen during the maintenance phase of the task for spoken versus signed language. These regions have been previously shown to respond to nonlinguistic sensory stimulation, suggesting that linguistic STM tasks recruit sensory-specific networks. However, maintenance-phase activations common to the two languages were also observed, implying some form of common process. We conclude that linguistic STM involves sensory-dependent neural networks, but suggest that sensory-independent neural networks may also exist.

  19. Lexical Variation and Change in British Sign Language

    PubMed Central

    Stamp, Rose; Schembri, Adam; Fenlon, Jordan; Rentelis, Ramas; Woll, Bencie; Cormier, Kearsy

    2014-01-01

    This paper presents results from a corpus-based study investigating lexical variation in BSL. An earlier study investigating variation in BSL numeral signs found that younger signers were using a decreasing variety of regionally distinct variants, suggesting that levelling may be taking place. Here, we report findings from a larger investigation looking at regional lexical variants for colours, countries, numbers and UK placenames elicited as part of the BSL Corpus Project. Age, school location and language background were significant predictors of lexical variation, with younger signers using a more levelled variety. This change appears to be happening faster in particular sub-groups of the deaf community (e.g., signers from hearing families). Also, we find that for the names of some UK cities, signers from outside the region use a different sign than those who live in the region. PMID:24759673

  20. Deaf Children Attending Different School Environments: Sign Language Abilities and Theory of Mind

    ERIC Educational Resources Information Center

    Tomasuolo, Elena; Valeri, Giovanni; Di Renzo, Alessio; Pasqualetti, Patrizio; Volterra, Virginia

    2013-01-01

    The present study examined whether full access to sign language as a medium for instruction could influence performance in Theory of Mind (ToM) tasks. Three groups of Italian participants (age range: 6-14 years) participated in the study: Two groups of deaf signing children and one group of hearing-speaking children. The two groups of deaf…

  1. Face Recognition Is Shaped by the Use of Sign Language

    ERIC Educational Resources Information Center

    Stoll, Chloé; Palluel-Germain, Richard; Caldara, Roberto; Lao, Junpeng; Dye, Matthew W. G.; Aptel, Florent; Pascalis, Olivier

    2018-01-01

    Previous research has suggested that early deaf signers differ in face processing. Which aspects of face processing are changed and the role that sign language may have played in that change are however unclear. Here, we compared face categorization (human/non-human) and human face recognition performance in early profoundly deaf signers, hearing…

  2. A Prototype Greek Text to Greek Sign Language Conversion System

    ERIC Educational Resources Information Center

    Kouremenos, Dimitris; Fotinea, Stavroula-Evita; Efthimiou, Eleni; Ntalianis, Klimis

    2010-01-01

    In this article, a prototype Greek text to Greek Sign Language (GSL) conversion system is presented. The system is integrated into an educational platform that addresses the needs of teaching GSL grammar and was developed within the SYNENNOESE project (Efthimiou "et al." 2004a. Developing an e-learning platform for the Greek sign…

  3. Issues of Language Choice, Ethics and Equity: Japanese Retirees Living in Malaysia as Their Second Home

    ERIC Educational Resources Information Center

    Stapa, Siti Hamin; Musaev, Talaibek; Hieda, Natsue; Amzah, Normalis

    2013-01-01

    This paper will discuss two issues related to Japanese retirees adopting Malaysia as their second home. The first is that of the preferred language choice of the retirees. To collect data for language choice a self-report questionnaire was administered and an interview was conducted. The findings suggest that the majority of the retirees chose…

  4. Strategies for North American Missionaries' Relational Language-Culture Learning in the Japanese Context

    ERIC Educational Resources Information Center

    Manabe-Kim, Rie

    2012-01-01

    This study focused on presenting the fieldwork findings derived from studying North-American missionaries' relational dynamics with the Japanese people, and the strategies that impacted their language-culture learning. This study also focused on applying the fieldwork findings towards the creation of a coaching model designed to help missionaries…

  5. Psychological Attributes in Foreign Language Reading: An Explorative Study of Japanese College Students

    ERIC Educational Resources Information Center

    Mikami, Hitoshi; Leung, Chi Yui; Yoshikawa, Lisa

    2016-01-01

    This study explores the internal structure of psychological attributes (i.e., motivation, belief and emotion) related to foreign language reading (FLR) (hereafter FLR attributes) and checks the utility of existing FLR attribute measurements for the specific learner group (i.e., Japanese university students studying English as their foreign…

  6. Cross-Modal Bilingualism: Language Contact as Evidence of Linguistic Transfer in Sign Bilingual Education

    ERIC Educational Resources Information Center

    Menendez, Bruno

    2010-01-01

    New positive attitudes towards language interaction in the realm of bilingualism open new horizons for sign bilingual education. Plaza-Pust and Morales-Lopez have innovatively reconceptualised a new cross-disciplinary approach to sign bilingualism, based on both sociolinguistics and psycholinguistics. According to this framework, cross-modal…

  7. Comprehending Sentences With the Body: Action Compatibility in British Sign Language?

    PubMed

    Vinson, David; Perniss, Pamela; Fox, Neil; Vigliocco, Gabriella

    2017-05-01

    Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language (BSL) sentences, some of which explicitly encode direction of motion, versus written English, where motion is only implied. We find no evidence of action simulation in BSL comprehension (Experiments 1-3), but we find effects of action simulation in comprehension of written English sentences by deaf native BSL signers (Experiment 4). These results provide constraints on the nature of mental simulations involved in comprehending action sentences referring to transfer events, suggesting that the richer contextual information provided by BSL sentences versus written or spoken English may reduce the need for action simulation in comprehension, at least when the event described does not map completely onto the signer's own body.

  8. Categorical Coding of Manual & English Alphabet Characters by Beginning Students of American Sign Language.

    ERIC Educational Resources Information Center

    Hoemann, Harry W.; Koenig, Teresa J.

    1990-01-01

    Analysis of the performance of beginning American Sign Language students, who had only recently learned the manual alphabet, on a task in which proactive interference would build up rapidly on successive trials, supported the view that different languages have separate memory stores. (Author/CB)

  9. The Strategies Used in Japanese Advertisement.

    ERIC Educational Resources Information Center

    Kurose, Yuki

    This paper investigates the possibility of using Japanese advertising language as a teaching tool in the second language classroom. First, it reviews the aims of advertising and the advantages of learning advertising language in the classroom based on previous research. Next, it discusses language strategies used in Japanese advertising,…

  10. Content validation: clarity/relevance, reliability and internal consistency of enunciative signs of language acquisition.

    PubMed

    Crestani, Anelise Henrich; Moraes, Anaelena Bragança de; Souza, Ana Paula Ramos de

    2017-08-10

    The aim was to analyze the validation of enunciative signs of language acquisition constructed for children aged 3 to 12 months. The signs were built based on mechanisms of language acquisition from an enunciative perspective and on clinical experience with language disorders. The signs were submitted to judgments of clarity and relevance by a sample of six experts, doctors in linguistics with knowledge of psycholinguistics and clinical language practice. In the reliability validation, two judges/evaluators applied the instruments to videos of 20% of the total sample of mother-infant dyads, using the inter-evaluator method. Internal consistency was assessed on the total sample, which consisted of 94 mother-infant dyads for the Phase 1 content (3 to 6 months) and 61 mother-infant dyads for the Phase 2 content (7 to 12 months). The data were collected through analysis of mother-infant interaction based on filming of the dyads and application of the parameters to be validated according to the child's age. Data were organized in a spreadsheet and then transferred to statistical software for analysis. The judgments of clarity/relevance indicated no modifications to be made to the instruments. The reliability test showed almost perfect agreement between judges (0.8 ≤ kappa ≤ 1.0); only item 2 of Phase 1 showed substantial agreement (0.6 ≤ kappa ≤ 0.79). The internal consistency for Phase 1 had alpha = 0.84, and for Phase 2, alpha = 0.74. This demonstrates the reliability of the instruments. The results suggest adequate content validity of the instruments created for both age groups, demonstrating the relevance of the content of the enunciative signs of language acquisition.
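
    The two statistics quoted above are standard. Assuming binary presence/absence ratings, the sketch below shows how inter-rater agreement (Cohen's kappa, read against the usual 0.6-0.8 "substantial" and 0.8-1.0 "almost perfect" bands) and internal consistency (Cronbach's alpha) could be computed; the ratings and item scores are made up for illustration and are not the study's data.

```python
import numpy as np

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    p_obs = np.mean(r1 == r2)
    p_exp = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

def cronbach_alpha(item_scores):
    """item_scores: rows = dyads, columns = instrument items."""
    X = np.asarray(item_scores, dtype=float)
    k = X.shape[1]
    return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / X.sum(axis=1).var(ddof=1))

# Made-up ratings (1 = sign observed in the video, 0 = not observed).
judge_a = [1, 1, 0, 1, 1, 0, 1, 1]
judge_b = [1, 1, 0, 1, 0, 0, 1, 1]
print("kappa =", round(cohen_kappa(judge_a, judge_b), 2))

items = [[1, 1, 1, 1], [1, 1, 0, 1], [0, 0, 0, 1], [1, 1, 1, 1], [0, 0, 1, 0], [1, 1, 1, 0]]
print("alpha =", round(cronbach_alpha(items), 2))
```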

  11. Teaching Children with Language Delays to Say or Sign "More": Promises and Potential Pitfalls

    ERIC Educational Resources Information Center

    Lederer, Susan Hendler

    2018-01-01

    Teaching young children with language delays to say or sign the word "more" has had strong support from the literature since the 1970s (Bloom & Lahey, 1978; Holland, 1975; Lahey & Bloom, 1977; Lederer, 2002). Semantically, teaching children the word/sign "more" is supported by research on early vocabulary development…

  12. A Sensitive Period for the Acquisition of Complex Morphology: Evidence from American Sign Language.

    ERIC Educational Resources Information Center

    Galvan, Dennis

    A study investigated acquisition of three independent yet simultaneously produced morphological systems in American Sign Language (ASL): the linguistic use of space, use of classifiers, and inflections for aspect, all information incorporated into the production of a sign. Subjects were 30 deaf children with severe or profound prelingual hearing…

  13. Dissociating Linguistic and Non-Linguistic Gesture Processing: Electrophysiological Evidence from American Sign Language

    ERIC Educational Resources Information Center

    Grosvald, Michael; Gutierrez, Eva; Hafer, Sarah; Corina, David

    2012-01-01

    A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language…

  14. The Subsystem of Numerals in Catalan Sign Language: Description and Examples from a Psycholinguistic Study

    ERIC Educational Resources Information Center

    Fuentes, Mariana; Tolchinsky, Liliana

    2004-01-01

    Linguistic descriptions of sign languages are important to the recognition of their linguistic status. These languages are an essential part of the cultural heritage of the communities that create and use them and vital in the education of deaf children. They are also the reference point in language acquisition studies. Ours is exploratory…

  15. The relationships between Japanese interpersonal conflict styles and their language expressions.

    PubMed

    Moriizumi, Satoshi; Takai, Jiro

    2010-01-01

    The present study investigated the influence of interpersonal conflict management styles on language expressions and the differences in expressions in same-sex relational categories based on specific in-group-out-group classifications. Questionnaires were administered to 367 university students in Japan. After reading a scenario, participants reported on actual language use and gave ratings on an interpersonal conflict management scale. The results revealed that Japanese change their expressions, along with psychological styles, depending on the relational target. They also indicated psychological constructs were related to their equivalent expressions. The results suggested that future research should take into consideration the potential differences in behavior and interaction posture inherent in various relational and situational categories.

  16. Validation of the Acoustic Voice Quality Index in the Japanese Language.

    PubMed

    Hosokawa, Kiyohito; Barsties, Ben; Iwahashi, Toshihiko; Iwahashi, Mio; Kato, Chieri; Iwaki, Shinobu; Sasai, Hisanori; Miyauchi, Akira; Matsushiro, Naoki; Inohara, Hidenori; Ogawa, Makoto; Maryn, Youri

    2017-03-01

    The Acoustic Voice Quality Index (AVQI) is a multivariate construct for quantifying overall voice quality based on the analysis of continuous speech and a sustained vowel. The stability and validity of the AVQI are well established in several language families. However, the Japanese language has distinct characteristics with respect to several parameters of articulatory and phonatory physiology. The aim of this retrospective study was to confirm the criterion-related concurrent validity of the AVQI, as well as its responsiveness to change and its diagnostic accuracy for voice assessment in the Japanese-speaking population. A total of 336 voice recordings, which included 69 pairs of recordings made before and after therapeutic interventions, were eligible for the study. The auditory-perceptual judgment of overall voice quality was evaluated by five experienced raters, and the concurrent validity, responsiveness to change, and diagnostic accuracy of the AVQI were estimated. Concurrent validity and responsiveness to change with respect to overall voice quality were indicated by high correlation coefficients (0.828 and 0.767, respectively). Receiver operating characteristic analysis revealed excellent diagnostic accuracy for discriminating between dysphonic and normophonic voices (area under the curve: 0.905). The best AVQI threshold of 3.15 corresponded to a sensitivity of 72.5% and a specificity of 95.2%, with positive and negative likelihood ratios of 15.1 and 0.29, respectively. We demonstrated the validity of the AVQI as a tool for assessing overall voice quality and voice therapy outcomes in the Japanese-speaking population.
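
    The likelihood ratios quoted above follow directly from the reported sensitivity and specificity at the 3.15 cut-off, and the arithmetic is easy to verify with a minimal sketch:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive/negative likelihood ratios of a binary diagnostic threshold."""
    lr_pos = sensitivity / (1 - specificity)   # odds multiplier for a positive result
    lr_neg = (1 - sensitivity) / specificity   # odds multiplier for a negative result
    return lr_pos, lr_neg

# Sensitivity and specificity reported for the AVQI threshold of 3.15.
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.725, specificity=0.952)
print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")   # about 15.1 and 0.29, as reported
```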

  17. A Qualitative Case Study of Japanese University Students and Personal Smartphone Use in English as a Foreign Language Classes

    ERIC Educational Resources Information Center

    Forsythe, Edward M., III

    2017-01-01

    Japanese university English instructors are increasingly requiring students to use their personal smartphones for English as a Foreign Language (EFL) classroom activities. Because of this, it has been recommended that studies be conducted to ascertain Japanese university students' perceptions of using smartphones in EFL language…

  18. A Developmental Shift from Similar to Language-Specific Strategies in Verb Acquisition: A Comparison of English, Spanish, and Japanese

    ERIC Educational Resources Information Center

    Maguire, Mandy J.; Hirsh-Pasek, Kathy; Golinkoff, Roberta Michnick; Imai, Mutsumi; Haryu, Etsuko; Vanegas, Sandra; Okada, Hiroyuki; Pulverman, Rachel; Sanchez-Davis, Brenda

    2010-01-01

    The world's languages draw on a common set of event components for their verb systems. Yet, these components are differentially distributed across languages. At what age do children begin to use language-specific patterns to narrow possible verb meanings? English-, Japanese-, and Spanish-speaking adults, toddlers, and preschoolers were shown…

  19. Japanese Media in English.

    ERIC Educational Resources Information Center

    Tanaka, Sachiko Oda

    1995-01-01

    Describes the use of English in the media in Japan, focusing on the role and history of English-language newspapers, radio, and television programs, as well as the proliferation of English-language films shown in Japanese cinemas. Discusses the implications of English in the Japanese media. (20 references) (MDM)

  20. Sign Language Recognition and Translation: A Multidisciplined Approach from the Field of Artificial Intelligence

    ERIC Educational Resources Information Center

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based…

  1. Functional changes in people with different hearing status and experiences of using Chinese sign language: an fMRI study.

    PubMed

    Li, Qiang; Xia, Shuang; Zhao, Fei; Qi, Ji

    2014-01-01

    The purpose of this study was to assess functional changes in the cerebral cortex in people with different sign language experience and hearing status while they observed and imitated Chinese Sign Language (CSL), using functional magnetic resonance imaging (fMRI). Fifty participants took part in the study and were divided into four groups according to their hearing status and experience of using sign language: a prelingually deaf signer group (PDS), a normal-hearing non-signer group (HnS), a native signer group with normal hearing (HNS), and an acquired signer group with normal hearing (HLS). fMRI data were acquired from all subjects while they performed block-designed tasks that involved observing and imitating sign language stimuli. Nine activation areas were found in response to either the observation or the imitation CSL task, and three activated areas were found only during the imitation task. Of those, the PDS group had significantly greater activation, in terms of the cluster size of activated voxels, in the bilateral superior parietal lobule, cuneate lobe and lingual gyrus than the HnS, HNS and HLS groups in response to either the observation or the imitation CSL task. The PDS group also showed significantly greater activation in the bilateral inferior frontal gyrus, which was also found in the HNS and HLS groups but not in the HnS group. This indicates that deaf signers have better sign language proficiency, engaging more actively with the phonetic and semantic elements. In addition, activation of the bilateral superior temporal gyrus and inferior parietal lobule was found only in the PDS and HNS groups, and not in the other two groups, which indicates that the area for sign language processing appears to be sensitive to the age of language acquisition.

  2. Advances to the development of a basic Mexican sign-to-speech and text language translator

    NASA Astrophysics Data System (ADS)

    Garcia-Bautista, G.; Trujillo-Romero, F.; Diaz-Gonzalez, G.

    2016-09-01

    Sign Language (SL) is the basic alternative communication method among deaf people. However, most hearing people have trouble understanding SL, which makes communication with deaf people very difficult and excludes them from daily activities. In this work we present an automatic, basic, real-time sign language translator capable of recognizing a basic list of Mexican Sign Language (MSL) signs, comprising 10 meaningful words, the letters (A-Z), and the numbers (1-10), and translating them into speech and text. The signs were collected from a group of 35 MSL signers and performed in front of a Microsoft Kinect™ sensor. The hand gesture recognition system uses the RGB-D camera to build and store point clouds, color data, and skeleton-tracking information. We propose a method to obtain representative hand-trajectory pattern information, using a Euclidean segmentation method to obtain the hand shape and a hierarchical centroid descriptor as the feature extraction method for images of numbers and letters. A pattern recognition method based on a back-propagation Artificial Neural Network (ANN) is used to interpret the hand gestures. Finally, we use k-fold cross-validation for the training and testing stages. Our results achieve an accuracy of 95.71% on words, 98.57% on numbers and 79.71% on letters. In addition, an interactive user interface was designed to present the results in voice and text format.
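
    Of the components listed above, the hierarchical-centroid descriptor is the easiest to sketch. The version below recursively splits a binary hand silhouette at its centroid column and records the normalized centroid positions; the paper does not spell out its exact variant or depth, so this formulation, and the toy silhouette, are assumptions rather than the authors' implementation.

```python
import numpy as np

def hierarchical_centroid(binary, depth=3):
    """One common formulation of the hierarchical-centroid descriptor:
    recursively split the silhouette at its centroid column and record the
    normalized centroid positions, yielding 2**depth - 1 features."""
    feats = []

    def split(cols, level):
        if level == depth or cols.size == 0:
            return
        c = int(round(cols.mean()))          # centroid column of this segment
        feats.append(c / binary.shape[1])    # normalize by image width
        split(cols[cols < c], level + 1)     # left half
        split(cols[cols >= c], level + 1)    # right half

    _, xs = np.nonzero(binary)
    split(xs, 0)
    feats += [0.0] * (2 ** depth - 1 - len(feats))  # pad to a fixed length
    return np.array(feats)

# Toy silhouette standing in for a segmented hand image from the Kinect pipeline.
img = np.zeros((64, 64), dtype=np.uint8)
img[10:50, 20:30] = 1
img[10:20, 30:45] = 1
print(hierarchical_centroid(img))
```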

  3. Evidence for Website Claims about the Benefits of Teaching Sign Language to Infants and Toddlers with Normal Hearing

    ERIC Educational Resources Information Center

    Nelson, Lauri H.; White, Karl R.; Grewe, Jennifer

    2012-01-01

    The development of proficient communication skills in infants and toddlers is an important component to child development. A popular trend gaining national media attention is teaching sign language to babies with normal hearing whose parents also have normal hearing. Thirty-three websites were identified that advocate sign language for hearing…

  4. Eye gaze during comprehension of American Sign Language by native and beginning signers.

    PubMed

    Emmorey, Karen; Thompson, Robin; Colvin, Rachael

    2009-01-01

    An eye-tracking experiment investigated where deaf native signers (N = 9) and hearing beginning signers (N = 10) look while comprehending a short narrative and a spatial description in American Sign Language produced live by a fluent signer. Both groups fixated primarily on the signer's face (more than 80% of the time) but differed with respect to fixation location. Beginning signers fixated on or near the signer's mouth, perhaps to better perceive English mouthing, whereas native signers tended to fixate on or near the eyes. Beginning signers shifted gaze away from the signer's face more frequently than native signers, but the pattern of gaze shifts was similar for both groups. When a shift in gaze occurred, the sign narrator was almost always looking at his or her hands and was most often producing a classifier construction. We conclude that joint visual attention and attention to mouthing (for beginning signers), rather than linguistic complexity or processing load, affect gaze fixation patterns during sign language comprehension.

  5. Computer-Assisted Language Learning for Japanese on the Macintosh: An Update of What's Available.

    ERIC Educational Resources Information Center

    Darnall, Cliff; And Others

    This paper outlines a presentation on available Macintosh computer software for learning Japanese. The software systems described are categorized by their emphasis on speaking, writing, or reading, with a special section on software for young learners. Software that emphasizes spoken language includes "Berlitz for Business…

  6. How Do Teachers and Learners Perceive Corrective Feedback in the Japanese Language Classroom?

    ERIC Educational Resources Information Center

    Yoshida, Reiko

    2010-01-01

    This study examined Japanese language teachers' and learners' perceptions of corrective feedback (CF), focusing on the cases in which the learners responded to the teachers' CF. Data were collected from the second-year course of an Australian university for 1 semester by classroom observation and audio recording and stimulated recall interviews.…

  7. The Impact of Input Quality on Early Sign Development in Native and Non-Native Language Learners

    ERIC Educational Resources Information Center

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-01-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the…

  8. Negotiating Sequential Boundaries and Learning Opportunities: A Case from a Japanese Language Classroom

    ERIC Educational Resources Information Center

    Mori, Junko

    2004-01-01

    Using the methodological framework of conversation analysis (CA) as a central tool for analysis, this study examines a peer interactive task that occurred in a Japanese as a foreign language classroom. During the short segment of interaction, the students shifted back and forth between the development of an assigned task and the management of…

  9. Japanese as a Second Language Assessment in Japan: Current Issues and Future Directions

    ERIC Educational Resources Information Center

    Hatasa, Yukiko; Watanabe, Tomoko

    2017-01-01

    This article reviews assessment practices of Japanese as a second language as taught in Japan since the 1980s. It begins with an explanation of the social and political conditions that have impacted assessment practices in Japan and then addresses current assessment practices and issues. This analysis first examines large-scale tests developed in…

  10. Scaffolding Strategies for Wiki-Based Collaboration: Action Research in a Multicultural Japanese Language Program

    ERIC Educational Resources Information Center

    Jung, Insung; Suzuki, Yoko

    2015-01-01

    Wikis can be used to encourage and support collaborative constructivist learning. However, their effectiveness depends upon the use of scaffolding strategies to guide the students in their use. This action research investigated three scaffolding strategies for wiki-based multicultural Japanese language learning: worked examples, grouping and peer…

  11. A qualitative exploration of trial-related terminology in a study involving Deaf British Sign Language users.

    PubMed

    Young, Alys; Oram, Rosemary; Dodds, Claire; Nassimi-Green, Catherine; Belk, Rachel; Rogers, Katherine; Davies, Linda; Lovell, Karina

    2016-04-27

    Internationally, few clinical trials have involved Deaf people who use a signed language and none have involved BSL (British Sign Language) users. Appropriate terminology in BSL for key concepts in clinical trials that are relevant to recruitment and participant information materials, to support informed consent, do not exist. Barriers to conceptual understanding of trial participation and sources of misunderstanding relevant to the Deaf community are undocumented. A qualitative, community participatory exploration of trial terminology including conceptual understanding of 'randomisation', 'trial', 'informed choice' and 'consent' was facilitated in BSL involving 19 participants in five focus groups. Data were video-recorded and analysed in source language (BSL) using a phenomenological approach. Six necessary conditions for developing trial information to support comprehension were identified. These included: developing appropriate expressions and terminology from a community basis, rather than testing out previously derived translations from a different language; paying attention to language-specific features which support best means of expression (in the case of BSL expectations of specificity, verb directionality, handshape); bilingual influences on comprehension; deliberate orientation of information to avoid misunderstanding not just to promote accessibility; sensitivity to barriers to discussion about intelligibility of information that are cultural and social in origin, rather than linguistic; the importance of using contemporary language-in-use, rather than jargon-free or plain language, to support meaningful understanding. The study reinforces the ethical imperative to ensure trial participants who are Deaf are provided with optimum resources to understand the implications of participation and to make an informed choice. Results are relevant to the development of trial information in other signed languages as well as in spoken/written languages when

  12. American Sign Language and Deaf Culture Competency of Osteopathic Medical Students

    ERIC Educational Resources Information Center

    Lapinsky, Jessica; Colonna, Caitlin; Sexton, Patricia; Richard, Mariah

    2015-01-01

    The study examined the effectiveness of a workshop on Deaf culture and basic medical American Sign Language for increasing osteopathic student physicians' confidence and knowledge when interacting with ASL-using patients. Students completed a pretest in which they provided basic demographic information, rated their confidence levels, took a video…

  13. Acquiring Word Class Distinctions in American Sign Language: Evidence from Handshape

    ERIC Educational Resources Information Center

    Brentari, Diane; Coppola, Marie; Jung, Ashley; Goldin-Meadow, Susan

    2013-01-01

    Handshape works differently in nouns versus a class of verbs in American Sign Language (ASL) and thus can serve as a cue to distinguish between these two word classes. Handshapes representing characteristics of the object itself ("object" handshapes) and handshapes representing how the object is handled ("handling" handshapes)…

  14. An Avatar-Based Italian Sign Language Visualization System

    NASA Astrophysics Data System (ADS)

    Falletto, Andrea; Prinetto, Paolo; Tiotto, Gabriele

    In this paper, we present an experimental system that supports translation from Italian into the Italian Sign Language (ISL) of the deaf and its visualization through a virtual character. Our objective is to develop a complete platform that is useful for any application and reusable on several platforms, including the Web, digital television and offline text translation. The system relies on a database that stores both a corpus of Italian words and words coded in the ISL notation system. An interface for data entry is implemented, which allows future extensions and integrations.

  15. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model (XML) of Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a video database whose aim is to support the development of tools that promote genuine social inclusion of the deaf.
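
    The abstract does not reproduce the XML schema itself, so the sketch below only illustrates what a phonology-based entry for a single sign might look like; the element names and the phonological values (handshape, orientation, location, movement) are hypothetical and are not taken from the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: one <sign> element with per-hand phonological parameters.
sign = ET.Element("sign", gloss="CASA")            # gloss chosen only for illustration
for role in ("dominant", "non-dominant"):
    hand = ET.SubElement(sign, "hand", role=role)
    ET.SubElement(hand, "handshape").text = "flat-B"
    ET.SubElement(hand, "orientation").text = "palm-inward"
ET.SubElement(sign, "location").text = "neutral-space"
ET.SubElement(sign, "movement", type="contact", repetition="1")

print(ET.tostring(sign, encoding="unicode"))
```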

  16. First language acquisition differs from second language acquisition in prelingually deaf signers: Evidence from sensitivity to grammaticality judgement in British Sign Language

    PubMed Central

    Cormier, Kearsy; Schembri, Adam; Vinson, David; Orfanidou, Eleni

    2012-01-01

    Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that accuracy of grammaticality judgement decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammatical judgement in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life. PMID:22578601

  17. Real-Time Processing of ASL Signs: Delayed First Language Acquisition Affects Organization of the Mental Lexicon

    ERIC Educational Resources Information Center

    Lieberman, Amy M.; Borovsky, Arielle; Hatrak, Marla; Mayberry, Rachel I.

    2015-01-01

    Sign language comprehension requires visual attention to the linguistic signal and visual attention to referents in the surrounding world, whereas these processes are divided between the auditory and visual modalities for spoken language comprehension. Additionally, the age-onset of first language acquisition and the quality and quantity of…

  18. Educational Resources and Implementation of a Greek Sign Language Synthesis Architecture

    ERIC Educational Resources Information Center

    Karpouzis, K.; Caridakis, G.; Fotinea, S.-E.; Efthimiou, E.

    2007-01-01

    In this paper, we present how creation and dynamic synthesis of linguistic resources of Greek Sign Language (GSL) may serve to support development and provide content to an educational multitask platform for the teaching of GSL in early elementary school classes. The presented system utilizes standard virtual character (VC) animation technologies…

  19. Signing Science! Andy And Tonya Are Just Like Me! They Wear Hearing Aids And Know My Language!?

    ERIC Educational Resources Information Center

    Vesel, Judy

    2005-01-01

    Are these students talking about their classmates? No, they are describing the Signing Avatar characters--3-D figures who appear on the EnViSci Network Web site and sign the resources and activities in American Sign Language (ASL) or Signed English (SE). During the 2003-04 school year, students in schools for the deaf and hard of hearing…

  20. Tactile Signing with One-Handed Perception

    ERIC Educational Resources Information Center

    Mesch, Johanna

    2013-01-01

    Tactile signing among persons with deaf-blindness is not homogenous; rather, like other forms of language, it exhibits variation, especially in turn taking. Early analyses of tactile Swedish Sign Language, tactile Norwegian Sign Language, and tactile French Sign Language focused on tactile communication with four hands, in which partially blind or…

  1. Family-Centered Practices and American Sign Language (ASL): Challenges and Recommendations

    ERIC Educational Resources Information Center

    Hardin, Belinda J.; Blanchard, Sheresa Boone; Kemmery, Megan A.; Appenzeller, Margo; Parker, Samuel D.

    2014-01-01

    Families with children who are deaf face many important decisions, especially the mode(s) of communication their children will use. The purpose of this focus group study was to better understand the experiences and recommendations of families who chose American Sign Language (ASL) as their primary mode of communication and to identify strategies…

  2. Interactive Application in Spanish Sign Language for a Public Transport Environment

    ERIC Educational Resources Information Center

    Viera-Santana, José Guillermo; Hernández-Haddad, Juan C.; Rodríguez-Esparragón, Dionisio; Castillo-Ortiz, Jesús

    2014-01-01

    People with hearing disabilities find it difficult to access information and communication in public places. In light of this, we consider the design of a communication system based on Spanish Sign Language (SSL) to help overcome this barrier in heavily frequented public environments, where much of the…

  3. First language acquisition differs from second language acquisition in prelingually deaf signers: evidence from sensitivity to grammaticality judgement in British Sign Language.

    PubMed

    Cormier, Kearsy; Schembri, Adam; Vinson, David; Orfanidou, Eleni

    2012-07-01

    Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that accuracy of grammaticality judgement decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammatical judgement in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. A Case of Specific Language Impairment in a Deaf Signer of American Sign Language.

    PubMed

    Quinto-Pozos, David; Singleton, Jenny L; Hauser, Peter C

    2017-04-01

    This article describes the case of a deaf native signer of American Sign Language (ASL) with a specific language impairment (SLI). School records documented normal cognitive development but atypical language development. Data include school records; interviews with the child, his mother, and school professionals; ASL and English evaluations; and a comprehensive neuropsychological and psychoeducational evaluation, and they span an approximate period of 7.5 years (11;10-19;6) including scores from school records (11;10-16;5) and a 3.5-year period (15;10-19;6) during which we collected linguistic and neuropsychological data. Results revealed that this student has average intelligence, intact visual perceptual skills, visuospatial skills, and motor skills but demonstrates challenges with some memory and sequential processing tasks. Scores from ASL testing signaled language impairment and marked difficulty with fingerspelling. The student also had significant deficits in English vocabulary, spelling, reading comprehension, reading fluency, and writing. Accepted SLI diagnostic criteria exclude deaf individuals from an SLI diagnosis, but the authors propose modified criteria in this work. The results of this study have practical implications for professionals including school psychologists, speech language pathologists, and ASL specialists. The results also support the theoretical argument that SLI can be evident regardless of the modality in which it is communicated. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Japanese Language and Culture: 10-3Y, 20-3Y, 30-3Y. 3-Year Program Guide to Implementation

    ERIC Educational Resources Information Center

    Alberta Education, 2009

    2009-01-01

    This guide to implementation is intended to support the Japanese Language and Culture 10-3Y, 20-3Y, 30-3Y Program of Studies. It was developed primarily for teachers, yet it includes information that may be useful for administrators and other stakeholders in their efforts to plan for and implement the new Japanese program of studies. Familiarity…

  6. The influence of visual feedback and register changes on sign language production: A kinematic study with deaf signers

    PubMed Central

    EMMOREY, KAREN; GERTSBERG, NELLY; KORPICS, FRANCO; WRIGHT, CHARLES E.

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign Language (ASL) signs within a carrier phrase under five conditions: blindfolded, wearing tunnel-vision goggles, normal (citation) signing, shouting, and informal signing. Three-dimensional movement trajectories were obtained using an Optotrak Certus system. Informally produced signs were shorter with less vertical movement. Shouted signs were displaced forward and to the right and were produced within a larger volume of signing space, with greater velocity, greater distance traveled, and a longer duration. Tunnel vision caused signers to produce less movement within the vertical dimension of signing space, but blind and citation signing did not differ significantly on any measure, except duration. Thus, signers do not “sign louder” when they cannot see themselves, but they do alter their sign production when vision is restricted. We hypothesize that visual feedback serves primarily to fine-tune the size of signing space rather than as input to a comprehension-based monitor. PMID:20046943
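
    The kinematic measures reported here (duration, distance traveled, velocity, vertical movement) can be derived from 3-D marker trajectories. The sketch below shows one simple way to compute such measures from an (n, 3) position array sampled at a fixed rate; it is an assumption-laden illustration, not the authors' Optotrak analysis pipeline.

```python
# Illustrative sketch only (not the study's analysis code): deriving simple
# kinematic measures from a 3-D marker trajectory sampled at a fixed rate.
import numpy as np

def kinematic_summary(xyz, sample_rate_hz):
    """xyz: (n_samples, 3) array of marker positions in mm; returns basic measures."""
    xyz = np.asarray(xyz, dtype=float)
    dt = 1.0 / sample_rate_hz
    step = np.diff(xyz, axis=0)                      # per-sample displacement
    step_len = np.linalg.norm(step, axis=1)          # mm traveled per sample
    return {
        "duration_s": (len(xyz) - 1) * dt,
        "distance_mm": float(step_len.sum()),        # total path length
        "peak_velocity_mm_s": float(step_len.max() / dt),
        "vertical_extent_mm": float(np.ptp(xyz[:, 2])),  # range along the vertical axis
    }

# Example: a short synthetic trajectory sampled at 100 Hz.
traj = np.cumsum(np.random.default_rng(0).normal(scale=2.0, size=(200, 3)), axis=0)
print(kinematic_summary(traj, sample_rate_hz=100))
```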

  7. The influence of visual feedback and register changes on sign language production: A kinematic study with deaf signers.

    PubMed

    Emmorey, Karen; Gertsberg, Nelly; Korpics, Franco; Wright, Charles E

    2009-01-01

    Speakers monitor their speech output by listening to their own voice. However, signers do not look directly at their hands and cannot see their own face. We investigated the importance of a visual perceptual loop for sign language monitoring by examining whether changes in visual input alter sign production. Deaf signers produced American Sign Language (ASL) signs within a carrier phrase under five conditions: blindfolded, wearing tunnel-vision goggles, normal (citation) signing, shouting, and informal signing. Three-dimensional movement trajectories were obtained using an Optotrak Certus system. Informally produced signs were shorter with less vertical movement. Shouted signs were displaced forward and to the right and were produced within a larger volume of signing space, with greater velocity, greater distance traveled, and a longer duration. Tunnel vision caused signers to produce less movement within the vertical dimension of signing space, but blind and citation signing did not differ significantly on any measure, except duration. Thus, signers do not "sign louder" when they cannot see themselves, but they do alter their sign production when vision is restricted. We hypothesize that visual feedback serves primarily to fine-tune the size of signing space rather than as input to a comprehension-based monitor.

  8. Recognition of sign language with an inertial sensor-based data glove.

    PubMed

    Kim, Kyung-Won; Lee, Mi-So; Soon, Bo-Ram; Ryu, Mun-Ho; Kim, Je-Nam

    2015-01-01

    Communication between people with normal hearing and people with hearing impairment is difficult. Recently, a variety of studies on sign language recognition have benefited from developments in information technology. This study presents a sign language recognition system using a data glove composed of 3-axis accelerometers, magnetometers, and gyroscopes. The data obtained by the glove are transmitted to a host application (a Windows program running on a PC), converted into angle data, displayed in the host application, and verified by rendering three-dimensional models on the display. An experiment was performed with five subjects (three females and two males), and a performance set comprising the numbers one to nine was repeated five times. The system achieves a 99.26% movement detection rate and approximately 98% recognition of each finger's state. The proposed system is expected to become more portable and useful when the algorithm is applied in smartphone applications, for example in emergency situations.
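
    The abstract states that raw glove data are converted into angle data but does not spell out the algorithm. A common choice for inertial sensors is a complementary filter that blends integrated gyroscope rates with accelerometer tilt estimates; the sketch below illustrates that approach under assumed axis conventions and is not the authors' implementation.

```python
# Hedged sketch: the paper converts glove sensor data to angles but the exact
# algorithm is not given here; a complementary filter is one common choice.
import math

def complementary_filter(accel_samples, gyro_samples, dt, alpha=0.98):
    """Estimate pitch (deg) from (ax, ay, az) accelerometer samples in g and
    gyroscope rates about the pitch axis in deg/s; alpha weights the gyro
    integration against the accelerometer tilt estimate."""
    pitch = 0.0
    estimates = []
    for (ax, ay, az), gyro_rate in zip(accel_samples, gyro_samples):
        accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
        pitch = alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates

# Example: a sensor held steady at ~45 degrees with a quiet gyroscope;
# the estimate converges toward the 45-degree accelerometer reading.
accel = [(-0.707, 0.0, 0.707)] * 100     # gravity split between x and z axes
gyro = [0.0] * 100                       # deg/s
print(complementary_filter(accel, gyro, dt=0.01)[-1])
```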

  9. Linguistic Relativity in Japanese and English: Is Language the Primary Determinant in Object Classification?

    ERIC Educational Resources Information Center

    Mazuka, Reiko; Friedman, Ronald S.

    2000-01-01

    Tested claims by Lucy (1992a, 1992b) that differences between the number marking systems used by Yucatec Maya and English lead speakers of these languages to differentially attend to either the material composition or the shape of objects. Replicated Lucy's critical objects' classification experiments using speakers of English and Japanese.…

  10. The Role of Parental Support and Family Variables in L1 and L2 Vocabulary Development of Japanese Heritage Language Students in the United States

    ERIC Educational Resources Information Center

    Mori, Yoshiko; Calder, Toshiko M.

    2017-01-01

    This study investigated the role of parental support and selected family variables in the first (L1) and second language (L2) vocabulary development of Japanese heritage language (JHL) high school students in the United States. Eighty-two JHL students ages 15-18 from eight hoshuukoo (i.e., supplementary academic schools for Japanese-speaking…

  11. Semantic fluency in deaf children who use spoken and signed language in comparison with hearing peers

    PubMed Central

    Jones, A.; Fastelli, A.; Atkinson, J.; Botting, N.; Morgan, G.

    2017-01-01

    Background: Deafness has an adverse impact on children's ability to acquire spoken languages. Signed languages offer a more accessible input for deaf children, but because the vast majority are born to hearing parents who do not sign, their early exposure to sign language is limited. Deaf children as a whole are therefore at high risk of language delays.

    Aims: We compared deaf and hearing children's performance on a semantic fluency task. Optimal performance on this task requires a systematic search of the mental lexicon, the retrieval of words within a subcategory and, when that subcategory is exhausted, switching to a new subcategory. We compared retrieval patterns between groups, and also compared the responses of deaf children who used British Sign Language (BSL) with those who used spoken English. We investigated how semantic fluency performance related to children's expressive vocabulary and executive function skills, and retested semantic fluency in the majority of the children nearly 2 years later, in order to investigate how much progress they had made in that time.

    Methods & Procedures: Participants were deaf children aged 6–11 years (N = 106, comprising 69 users of spoken English, 29 users of BSL and 8 users of Sign Supported English (SSE)) compared with hearing children (N = 120) of the same age who used spoken English. Semantic fluency was tested for the category ‘animals’. We coded for errors, clusters (e.g., ‘pets’, ‘farm animals’) and switches. Participants also completed the Expressive One-Word Picture Vocabulary Test and a battery of six non-verbal executive function tasks. In addition, we collected follow-up semantic fluency data for 70 deaf and 74 hearing children, nearly 2 years after they were first tested.

    Outcomes & Results: Deaf children, whether using spoken or signed language, produced fewer items in the semantic fluency task than hearing children, but they showed similar patterns of responses for items
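
    Cluster and switch scoring of fluency responses can be reduced to counting runs of items that share a subcategory. The sketch below assumes each response has already been hand-coded with a subcategory label; the coding scheme and example items are illustrative, not the study's materials.

```python
# Minimal sketch (assumed coding scheme): count clusters and switches in a
# semantic-fluency response list where each item is tagged with a subcategory.
def score_fluency(responses):
    """responses: list of (word, subcategory) pairs, e.g. ('dog', 'pets').
    A switch is a transition between different consecutive subcategories;
    a cluster is a maximal run of items sharing a subcategory."""
    if not responses:
        return {"items": 0, "clusters": 0, "switches": 0}
    switches = sum(1 for prev, cur in zip(responses, responses[1:]) if prev[1] != cur[1])
    return {"items": len(responses), "clusters": switches + 1, "switches": switches}

example = [("dog", "pets"), ("cat", "pets"), ("cow", "farm"),
           ("sheep", "farm"), ("lion", "zoo")]
print(score_fluency(example))  # {'items': 5, 'clusters': 3, 'switches': 2}
```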

  12. Examining the contribution of motor movement and language dominance to increased left lateralization during sign generation in native signers.

    PubMed

    Gutierrez-Sigut, Eva; Payne, Heather; MacSweeney, Mairéad

    2016-08-01

    The neural systems supporting speech and sign processing are very similar, although not identical. In a previous fTCD study of hearing native signers (Gutierrez-Sigut, Daws, et al., 2015) we found stronger left lateralization for sign than speech. Given that this increased lateralization could not be explained by hand movement alone, the contribution of motor movement versus 'linguistic' processes to the strength of hemispheric lateralization during sign production remains unclear. Here we directly contrast lateralization strength of covert versus overt signing during phonological and semantic fluency tasks. To address the possibility that hearing native signers' elevated lateralization indices (LIs) were due to performing a task in their less dominant language, here we test deaf native signers, whose dominant language is British Sign Language (BSL). Signers were more strongly left lateralized for overt than covert sign generation. However, the strength of lateralization was not correlated with the amount of time producing movements of the right hand. Comparisons with previous data from hearing native English speakers suggest stronger laterality indices for sign than speech in both covert and overt tasks. This increased left lateralization may be driven by specific properties of sign production such as the increased use of self-monitoring mechanisms or the nature of phonological encoding of signs. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production.

    PubMed

    Gutierrez-Sigut, Eva; Daws, Richard; Payne, Heather; Blott, Jonathan; Marshall, Chloë; MacSweeney, Mairéad

    2015-12-01

    Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL). Participants exhibited stronger lateralization during BSL than English production. In Experiment 2 we investigated whether this increased lateralization index could be due exclusively to the higher motoric demands of sign production. Sign naïve participants performed a phonological fluency task in English and a non-sign repetition task. Participants were left lateralized in the phonological fluency task but there was no consistent pattern of lateralization for the non-sign repetition in these hearing non-signers. The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
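
    A lateralization index (LI) in fTCD work is commonly derived from the left-minus-right difference in relative blood-flow velocity during task periods. The simplified sketch below illustrates that idea with synthetic traces; it omits the epoch averaging, heart-cycle correction, and peak-window selection used in standard fTCD analyses and is not the authors' pipeline.

```python
# Simplified illustration, not the study's pipeline: a basic fTCD-style
# lateralization index as the mean left-minus-right percent velocity change
# (relative to a pre-task baseline) during the task window.
import numpy as np

def lateralization_index(left, right, baseline_slice, task_slice):
    """left/right: 1-D velocity traces from each middle cerebral artery.
    Positive LI indicates a stronger left-hemisphere response."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    left_rel = 100.0 * (left - left[baseline_slice].mean()) / left[baseline_slice].mean()
    right_rel = 100.0 * (right - right[baseline_slice].mean()) / right[baseline_slice].mean()
    return float((left_rel[task_slice] - right_rel[task_slice]).mean())

# Example with synthetic traces: left velocity rises more during the task.
t = np.arange(300)
left = 50 + np.where(t >= 100, 3.0, 0.0)
right = 52 + np.where(t >= 100, 1.0, 0.0)
print(lateralization_index(left, right, slice(0, 100), slice(100, 300)))
```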

  14. Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production

    PubMed Central

    Gutierrez-Sigut, Eva; Daws, Richard; Payne, Heather; Blott, Jonathan; Marshall, Chloë; MacSweeney, Mairéad

    2016-01-01

    Neuroimaging studies suggest greater involvement of the left parietal lobe in sign language compared to speech production. This stronger activation might be linked to the specific demands of sign encoding and proprioceptive monitoring. In Experiment 1 we investigate hemispheric lateralization during sign and speech generation in hearing native users of English and British Sign Language (BSL). Participants exhibited stronger lateralization during BSL than English production. In Experiment 2 we investigated whether this increased lateralization index could be due exclusively to the higher motoric demands of sign production. Sign naïve participants performed a phonological fluency task in English and a non-sign repetition task. Participants were left lateralized in the phonological fluency task but there was no consistent pattern of lateralization for the non-sign repetition in these hearing non-signers. The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands. PMID:26605960

  15. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters.

    PubMed

    Rempel, David; Camilleri, Matt J; Lee, David L

    2015-10-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input.
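
    The reported association between posture features and high discomfort comes from a nominal logistic regression. The sketch below fits a logistic regression to made-up binary posture predictors (flexed wrist, discordant adjacent fingers, extended fingers) purely to illustrate the modelling step; the data, feature coding, and software differ from the study's.

```python
# Hedged sketch with simulated data (not the study's ratings): relating binary
# posture features to a high-discomfort outcome via logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
# Hypothetical predictors: flexed wrist, discordant adjacent fingers, extended fingers.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
# Simulate higher discomfort odds when any of the three features is present.
logit = -1.5 + 1.2 * X[:, 0] + 0.9 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
for name, coef in zip(["flexed_wrist", "discordant_fingers", "extended_fingers"],
                      model.coef_[0]):
    print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")
```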

  16. The Design of Hand Gestures for Human-Computer Interaction: Lessons from Sign Language Interpreters

    PubMed Central

    Rempel, David; Camilleri, Matt J.; Lee, David L.

    2015-01-01

    The design and selection of 3D modeled hand gestures for human-computer interaction should follow principles of natural language combined with the need to optimize gesture contrast and recognition. The selection should also consider the discomfort and fatigue associated with distinct hand postures and motions, especially for common commands. Sign language interpreters have extensive and unique experience forming hand gestures and many suffer from hand pain while gesturing. Professional sign language interpreters (N=24) rated discomfort for hand gestures associated with 47 characters and words and 33 hand postures. Clear associations of discomfort with hand postures were identified. In a nominal logistic regression model, high discomfort was associated with gestures requiring a flexed wrist, discordant adjacent fingers, or extended fingers. These and other findings should be considered in the design of hand gestures to optimize the relationship between human cognitive and physical processes and computer gesture recognition systems for human-computer input. PMID:26028955

  17. Bilingual Dual Coding in Japanese Returnee Students.

    ERIC Educational Resources Information Center

    Taura, Hideyuki

    1998-01-01

    Investigates effects of second-language acquisition age, length of exposure to the second language, and Japanese language specificity on the bilingual dual coding hypothesis proposed by Paivio and Desrochers (1980). Balanced Japanese-English bilingual returnee (having resided in an English-speaking country) subjects were presented with pictures to…

  18. Repeated Reading for Japanese Language Learners: Effects on Reading Speed, Comprehension, and Comprehension Strategies

    ERIC Educational Resources Information Center

    Gorsuch, Greta; Taguchi, Etsuo; Umehara, Hiroaki

    2015-01-01

    A perennial challenge to second language educators and learners is getting sufficient input in settings where the L2 is not widely used, in this case beginning-level American university students learning Japanese. Reading is a significant means of getting L2 input, with recent calls for attention to reading and authentic texts as curriculum…

  19. Reading in Asian Languages: Making Sense of Written Texts in Chinese, Japanese, and Korean

    ERIC Educational Resources Information Center

    Goodman, Kenneth S., Ed.; Wang, Shaomei, Ed.; Iventosch, Mieko, Ed.; Goodman, Yetta M., Ed.

    2011-01-01

    "Reading in Asian Languages" is rich with information about how literacy works in the non-alphabetic writing systems (Chinese, Japanese, Korean) used by hundreds of millions of people and refutes the common Western belief that such systems are hard to learn or to use. The contributors share a comprehensive view of reading as construction…

  20. The neural correlates of highly iconic structures and topographic discourse in French Sign Language as observed in six hearing native signers.

    PubMed

    Courtin, C; Hervé, P-Y; Petit, L; Zago, L; Vigneau, M; Beaucousin, V; Jobard, G; Mazoyer, B; Mellet, E; Tzourio-Mazoyer, N

    2010-09-01

    "Highly iconic" structures in Sign Language enable a narrator to act, switch characters, describe objects, or report actions in four-dimensions. This group of linguistic structures has no real spoken-language equivalent. Topographical descriptions are also achieved in a sign-language specific manner via the use of signing-space and spatial-classifier signs. We used functional magnetic resonance imaging (fMRI) to compare the neural correlates of topographic discourse and highly iconic structures in French Sign Language (LSF) in six hearing native signers, children of deaf adults (CODAs), and six LSF-naïve monolinguals. LSF materials consisted of videos of a lecture excerpt signed without spatially organized discourse or highly iconic structures (Lect LSF), a tale signed using highly iconic structures (Tale LSF), and a topographical description using a diagrammatic format and spatial-classifier signs (Topo LSF). We also presented texts in spoken French (Lect French, Tale French, Topo French) to all participants. With both languages, the Topo texts activated several different regions that are involved in mental navigation and spatial working memory. No specific correlate of LSF spatial discourse was evidenced. The same regions were more activated during Tale LSF than Lect LSF in CODAs, but not in monolinguals, in line with the presence of signing-space structure in both conditions. Motion processing areas and parts of the fusiform gyrus and precuneus were more active during Tale LSF in CODAs; no such effect was observed with French or in LSF-naïve monolinguals. These effects may be associated with perspective-taking and acting during personal transfers. 2010 Elsevier Inc. All rights reserved.