Sample records for level computer language

  1. Computer Languages: A Practical Guide to the Chief Programming Languages.

    ERIC Educational Resources Information Center

    Sanderson, Peter C.

    All the most commonly-used high-level computer languages are discussed in this book. An introductory discussion provides an overview of the basic components of a digital computer, the general planning of a computer programming problem, and the various types of computer languages. Each chapter is self-contained, emphasizes those features of a…

  2. High level language for measurement complex control based on the computer E-100I

    NASA Technical Reports Server (NTRS)

    Zubkov, B. V.

    1980-01-01

    A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure of including these programs in the suggested high level language is described.

  3. Frances: A Tool for Understanding Computer Architecture and Assembly Language

    ERIC Educational Resources Information Center

    Sondag, Tyler; Pokorny, Kian L.; Rajan, Hridesh

    2012-01-01

    Students in all areas of computing require knowledge of the computing device including software implementation at the machine level. Several courses in computer science curricula address these low-level details such as computer architecture and assembly languages. For such courses, there are advantages to studying real architectures instead of…

  4. Exploring English as a Foreign Language (EFL) Teacher Trainers' Perspectives on Challenges to Promoting Computer Literacy of EFL Teachers

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2014-01-01

    Computer literacy is a significant component of language teachers' computer-assisted language learning (CALL) knowledge. Despite its importance, limited research has been undertaken to analyze factors which might influence language teachers' computer literacy levels. This qualitative study explored the perspectives of 39 Iranian EFL teacher…

  5. Computer Literacy of Iranian Teachers of English as a Foreign Language: Challenges and Obstacles

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2014-01-01

    Basically, one of the requirements for the implementation of computer-assisted language learning (CALL) is English as a foreign language (EFL) teachers' ability to use computers effectively. Educational authorities and planners should identify EFL teachers' computer literacy levels and make attempts to improve the teachers' computer competence.…

  6. Computer Simulation and ESL Reading.

    ERIC Educational Resources Information Center

    Wu, Mary A.

    It is noted that although two approaches to second language instruction--the communicative approach emphasizing genuine language use and computer assisted instruction--have come together in the form of some lower level reading instruction materials for English as a second language (ESL), advanced level ESL reading materials using computer…

  7. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. Emerging Approach of Natural Language Processing in Opinion Mining: A Review

    NASA Astrophysics Data System (ADS)

    Kim, Tai-Hoon

    Natural language processing (NLP) is a subfield of artificial intelligence and computational linguistics. It studies the problems of automated generation and understanding of natural human languages. This paper outlines a framework for using computer and natural language techniques to help learners at various levels learn foreign languages in a Computer-based Learning environment. We propose some ideas for using the computer as a practical tool for learning a foreign language where most of the courseware is generated automatically. We then describe how to build Computer-Based Learning tools, discuss their effectiveness, and conclude with some possibilities for using on-line resources.

  9. What Is a Programming Language?

    ERIC Educational Resources Information Center

    Wold, Allen

    1983-01-01

    Explains what a computer programming language is in general, the differences between machine language, assembler languages, and high-level languages, and the functions of compilers and interpreters. High-level languages mentioned in the article are: BASIC, FORTRAN, COBOL, PILOT, LOGO, LISP, and SMALLTALK. (EAO)

  10. Marr's levels and the minimalist program.

    PubMed

    Johnson, Mark

    2017-02-01

    A simple change to a cognitive system at Marr's computational level may entail complex changes at the other levels of description of the system. The implementational level complexity of a change, rather than its computational level complexity, may be more closely related to the plausibility of a discrete evolutionary event causing that change. Thus the formal complexity of a change at the computational level may not be a good guide to the plausibility of an evolutionary event introducing that change. For example, while the Minimalist Program's Merge is a simple formal operation (Berwick & Chomsky, 2016), the computational mechanisms required to implement the language it generates (e.g., to parse the language) may be considerably more complex. This has implications for the theory of grammar: theories of grammar which involve several kinds of syntactic operations may be no less evolutionarily plausible than a theory of grammar that involves only one. A deeper understanding of human language at the algorithmic and implementational levels could strengthen the Minimalist Program's account of the evolution of language.

  11. Computer Language For Optimization Of Design

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Lucas, Stephen H.

    1991-01-01

    SOL is a computer language geared to the solution of design problems. Includes the mathematical modeling and logical capabilities of a computer language like FORTRAN; also includes the additional power of nonlinear mathematical programming methods at the language level. The SOL compiler takes SOL-language statements and generates equivalent FORTRAN code and system calls. Provides syntactic and semantic checking for recovery from errors and provides detailed reports containing cross-references to show where each variable is used. Implemented on VAX/VMS computer systems. Requires the VAX FORTRAN compiler to produce an executable program.
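
    The record above describes embedding nonlinear mathematical programming at the language level. As a rough, hedged illustration of the kind of constrained design problem such a language states directly, the Python/SciPy sketch below minimizes a hypothetical mass objective subject to a hypothetical stress-margin constraint; it is not SOL syntax.

    ```python
    # Sketch of a constrained design-optimization problem of the kind a
    # SOL-style language expresses at the language level (hypothetical
    # objective and constraint; not SOL syntax).
    from scipy.optimize import minimize

    def mass(x):
        # hypothetical objective: structural mass from two sizing variables
        return 2.0 * x[0] + 3.0 * x[1]

    def stress_margin(x):
        # hypothetical inequality constraint: must stay >= 0 for feasibility
        return x[0] * x[1] - 1.0

    result = minimize(
        mass,
        x0=[1.0, 1.0],
        bounds=[(0.1, 10.0), (0.1, 10.0)],
        constraints=[{"type": "ineq", "fun": stress_margin}],
    )
    print(result.x, result.fun)
    ```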

  12. The paradigm compiler: Mapping a functional language for the connection machine

    NASA Technical Reports Server (NTRS)

    Dennis, Jack B.

    1989-01-01

    The Paradigm Compiler implements a new approach to compiling programs written in high level languages for execution on highly parallel computers. The general approach is to identify the principal data structures constructed by the program and to map these structures onto the processing elements of the target machine. The mapping is chosen to maximize performance as determined through compile time global analysis of the source program. The source language is Sisal, a functional language designed for scientific computations, and the target language is Paris, the published low level interface to the Connection Machine. The data structures considered are multidimensional arrays whose dimensions are known at compile time. Computations that build such arrays usually offer opportunities for highly parallel execution; they are data parallel. The Connection Machine is an attractive target for these computations, and the parallel for construct of the Sisal language is a convenient high level notation for data parallel algorithms. The principles and organization of the Paradigm Compiler are discussed.
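
    As an informal illustration of the data-parallel idea above (mapping an array computation onto processing elements), the Python sketch below splits an array into chunks and evaluates a per-element kernel on separate worker processes; it is a generic sketch, not the Paradigm/Sisal/Paris tool chain.

    ```python
    # Generic data-parallel sketch: map chunks of an array onto worker
    # processes (illustration only; not the Paradigm compiler or Sisal/Paris).
    from concurrent.futures import ProcessPoolExecutor
    import numpy as np

    def kernel(chunk):
        # per-element work, applied independently to each chunk
        return np.sqrt(chunk) + 1.0

    if __name__ == "__main__":
        data = np.arange(1_000_000, dtype=np.float64)
        chunks = np.array_split(data, 8)      # "map" the array onto 8 workers
        with ProcessPoolExecutor(max_workers=8) as pool:
            result = np.concatenate(list(pool.map(kernel, chunks)))
        print(result[:5])
    ```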

  13. Computer Programming Languages for Health Care

    PubMed Central

    O'Neill, Joseph T.

    1979-01-01

    This paper advocates the use of standard high level programming languages for medical computing. It recommends that U.S. Government agencies having health care missions implement coordinated policies that encourage the use of existing standard languages and the development of new ones, thereby enabling them and the medical computing community at large to share state-of-the-art application programs. Examples are based on a model that characterizes language and language translator influence upon the specification, development, test, evaluation, and transfer of application programs.

  14. Design and Construction of Computer-Assisted Instructional Material: A Handbook for Reading/Language Arts Teachers.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    Intended for reading and language arts teachers at all educational levels, this guide presents information to be used by teachers in constructing their own computer assisted educational software using the BASIC programming language and Apple computers. Part 1 provides an overview of the components of traditional tutorial and drill-and-practice…

  15. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  16. A Cross-Cultural Study on the Attitudes of English Language Students towards Computer-Assisted Language Learning

    ERIC Educational Resources Information Center

    Tafazoli, Dara; Gómez Parra, Mª Elena; Huertas Abril, Cristina A.

    2018-01-01

    The purpose of this study was to compare Iranian and non-Iranian English language students' attitudes towards Computer-Assisted Language Learning (CALL). Furthermore, the relations of gender, education level, and age to their attitudes are investigated. A convergent mixed methods design was used for analyzing both quantitative and…

  17. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kruetz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1994-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point, to name two major advantages.
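
    To make the layering described above concrete, the toy Python sketch below expands broad operators such as MOVE and GRASP into lower-level command strings. The operator names come from the abstract, but the translation rules and output format are invented for illustration; this is not the patented spatial-operator-algebra system.

    ```python
    # Toy translation of a robot-oriented applications language (MOVE, GRASP)
    # into lower-level commands (invented rules; not the patented system).
    LOW_LEVEL_RULES = {
        "MOVE": lambda target: [f"PLAN_PATH {target}", f"EXECUTE_PATH {target}"],
        "GRASP": lambda target: [f"ALIGN_GRIPPER {target}", "CLOSE_GRIPPER"],
    }

    def translate(program):
        """Expand each high-level statement into its low-level command sequence."""
        low_level = []
        for op, target in program:
            low_level.extend(LOW_LEVEL_RULES[op](target))
        return low_level

    print(translate([("MOVE", "BIN_A"), ("GRASP", "PART_7")]))
    ```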

  18. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kreutz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1996-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point, to name two major advantages.

  19. STAR - A computer language for hybrid AI applications

    NASA Technical Reports Server (NTRS)

    Borchardt, G. C.

    1986-01-01

    Constructing Artificial Intelligence application systems which rely on both symbolic and non-symbolic processing places heavy demands on the communication of data between dissimilar languages. This paper describes STAR (Simple Tool for Automated Reasoning), a computer language for the development of AI application systems which supports the transfer of data structures between a symbolic level and a non-symbolic level defined in languages such as FORTRAN, C and PASCAL. The organization of STAR is presented, followed by the description of an application involving STAR in the interpretation of airborne imaging spectrometer data.
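
    The transfer of data between a symbolic level and compiled languages such as FORTRAN, C, and PASCAL that STAR supports can be loosely illustrated with a foreign-function call from Python; the sketch below uses the standard C math library as a stand-in and is not STAR itself.

    ```python
    # Loose illustration of handing a value from a high-level/symbolic layer to
    # a compiled, non-symbolic routine via a foreign-function interface
    # (standard C math library as a stand-in; not STAR).
    import ctypes
    import ctypes.util

    libm = ctypes.CDLL(ctypes.util.find_library("m"))  # compiled C math library
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double

    angle = 0.5                   # value produced at the "symbolic" level
    print(libm.cos(angle))        # evaluated at the compiled level
    ```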

  20. Students' Motivation toward Computer-Based Language Learning

    ERIC Educational Resources Information Center

    Genc, Gulten; Aydin, Selami

    2011-01-01

    The present article examined some factors affecting the motivation level of the preparatory school students in using a web-based computer-assisted language-learning course. The sample group of the study consisted of 126 English-as-a-foreign-language learners at a preparatory school of a state university. After performing statistical analyses…

  1. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  2. A high level language for a high performance computer

    NASA Technical Reports Server (NTRS)

    Perrott, R. H.

    1978-01-01

    The proposed computational aerodynamic facility will join the ranks of the supercomputers due to its architecture and increased execution speed. At present, the languages used to program these supercomputers have been modifications of programming languages which were designed many years ago for sequential machines. A new programming language should be developed based on the techniques which have proved valuable for sequential programming languages and incorporating the algorithmic techniques required for these supercomputers. The design objectives for such a language are outlined.

  3. A comparison of hardware description languages. [describing digital systems structure and behavior to a computer]

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.

    1978-01-01

    Several high level languages which evolved over the past few years for describing and simulating the structure and behavior of digital systems on digital computers are assessed. The characteristics of the four prominent languages (CDL, DDL, AHPL, ISP) are summarized. A criterion for selecting a suitable hardware description language for use in an automatic integrated circuit design environment is provided.

  4. Top-down methodology for human factors research

    NASA Technical Reports Server (NTRS)

    Sibert, J.

    1983-01-01

    User-computer interaction as a conversation is discussed. The design of user interfaces that depends on viewing communication between a user and the computer as a conversation is presented. This conversation includes inputs to the computer (outputs from the user), outputs from the computer (inputs to the user), and the sequencing in both time and space of those outputs and inputs. The conversation is viewed from the user's side. Two languages are modeled: the one with which the user communicates with the computer and the language through which communication flows from the computer to the user. Both languages exist on three levels: the semantic, the syntactic, and the lexical. It is suggested that natural languages can also be considered in these terms.
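
    The three levels named in this record can be illustrated with a toy command interpreter in which each level is a separate stage; the command set below is invented for illustration.

    ```python
    # Toy illustration of the lexical, syntactic, and semantic levels of a
    # user-computer "conversation" (the command set is invented).
    def lexical(text):
        # lexical level: characters -> tokens
        return text.upper().split()

    def syntactic(tokens):
        # syntactic level: tokens -> (verb, arguments) structure
        if not tokens:
            raise ValueError("empty command")
        return tokens[0], tokens[1:]

    def semantic(verb, args):
        # semantic level: structure -> meaning/action
        actions = {"SHOW": lambda a: "displaying " + ", ".join(a),
                   "QUIT": lambda a: "session ended"}
        return actions.get(verb, lambda a: "unknown command " + verb)(args)

    verb, args = syntactic(lexical("show altitude velocity"))
    print(semantic(verb, args))
    ```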

  5. Predicting the Proficiency Level of Language Learners Using Lexical Indices

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Salsbury, Tom; McNamara, Danielle S.

    2012-01-01

    This study explores how second language (L2) texts written by learners at various proficiency levels can be classified using computational indices that characterize lexical competence. For this study, 100 writing samples taken from 100 L2 learners were analyzed using lexical indices reported by the computational tool Coh-Metrix. The L2 writing…

  6. Can Synchronous Computer-Mediated Communication (CMC) Help Beginning-Level Foreign Language Learners Speak?

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2012-01-01

    This study investigated the possibility that initial-level learners may acquire oral skills through synchronous computer-mediated communication (SCMC). Twelve Taiwanese French as a foreign language (FFL) students, divided into three groups, were required to conduct a variety of tasks in one of the three learning environments (video/audio, audio,…

  7. On the writing of programming systems for spacecraft computers.

    NASA Technical Reports Server (NTRS)

    Mathur, F. P.; Rohr, J. A.

    1972-01-01

    Consideration of the systems designed to generate programs for the increasingly complex digital computers being used on board unmanned deep-space probes. Such programming systems must accommodate the special-purpose features incorporated in the hardware. The use of higher-level language facilities in the programming system can significantly simplify the task. Computers for Mariner and for the Outer Planets Grand Tour are briefly described, as well as their programming systems. Aspects of the higher level languages are considered.

  8. The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.

    ERIC Educational Resources Information Center

    Hata, David M.

    1986-01-01

    Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…

  9. Rapid Profile: A Second Language Screening Procedure.

    ERIC Educational Resources Information Center

    Mackey, Alison; And Others

    1991-01-01

    Rapid Profile, developed by Manfred Pienemann of National Languages Institute of Australia/Language Acquisition Research Centre, is a computer-based procedure for screening speech samples collected from language learners to assess their level of language development as compared to standard patterns in the acquisition of the target language. Rapid…

  10. Adaptation of a Control Center Development Environment for Industrial Process Control

    NASA Technical Reports Server (NTRS)

    Killough, Ronnie L.; Malik, James M.

    1994-01-01

    In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.

  11. Memory reduction through higher level language hardware

    NASA Technical Reports Server (NTRS)

    Kerner, H.; Gellman, L.

    1972-01-01

    Application of large scale integration in computers to reduce size and manufacturing costs and to produce improvements in logic function is discussed. Use of FORTRAN 4 as the computer language for this purpose is described. The effectiveness of the method in storing information is illustrated.

  12. The Effect of Computer-Assisted Language Learning on Reading Comprehension in an Iranian EFL Context

    ERIC Educational Resources Information Center

    Saeidi, Mahnaz; Yusefi, Mahsa

    2012-01-01

    This study is an attempt to examine the effect of computer-assisted language learning (CALL) on reading comprehension in an Iranian English as a foreign language (EFL) context. It was hypothesized that CALL has an effect on reading comprehension. Forty female learners of English at intermediate level after administering a proficiency test were…

  13. Computer Assisted English Language Learning in Costa Rican Elementary Schools: An Experimental Study

    ERIC Educational Resources Information Center

    Alvarez-Marinelli, Horacio; Blanco, Marta; Lara-Alecio, Rafael; Irby, Beverly J.; Tong, Fuhui; Stanley, Katherine; Fan, Yinan

    2016-01-01

    This study presents first-year findings of a 25-week longitudinal project derived from a two-year longitudinal randomized trial study at the elementary school level in Costa Rica on effective computer-assisted language learning (CALL) approaches in an English as a foreign language (EFL) setting. A pre-test-post-test experimental group design was…

  14. Implementation is crucial but must be neurobiologically grounded. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    NASA Astrophysics Data System (ADS)

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L.

    2014-09-01

    From the perspective of language, Fitch's [1] claim that theories of cognitive computation should not be separated from those of implementation surely deserves applauding. Recent developments in the Cognitive Neuroscience of Language, leading to the new field of the Neurobiology of Language [2-4], emphasise precisely this point: rather than attempting to simply map cognitive theories of language onto the brain, we should aspire to understand how the brain implements language. This perspective resonates with many of the points raised by Fitch in his review, such as the discussion of unhelpful dichotomies (e.g., Nature versus Nurture). Cognitive dichotomies and debates have repeatedly turned out to be of limited usefulness when it comes to understanding language in the brain. The famous modularity-versus-interactivity and dual route-versus-connectionist debates are cases in point: in spite of hundreds of experiments using neuroimaging (or other techniques), or the construction of myriad computer models, little progress has been made in their resolution. This suggests that dichotomies proposed at a purely cognitive (or computational) level without consideration of biological grounding appear to be "asking the wrong questions" about the neurobiology of language. In accordance with these developments, several recent proposals explicitly consider neurobiological constraints while seeking to explain language processing at a cognitive level (e.g. [5-7]).

  15. Development of the Tensoral Computer Language

    NASA Technical Reports Server (NTRS)

    Ferziger, Joel; Dresselhaus, Eliot

    1996-01-01

    The research scientist or engineer wishing to perform large scale simulations or to extract useful information from existing databases is required to have expertise in the details of the particular database, the numerical methods and the computer architecture to be used. This poses a significant practical barrier to the use of simulation data. The goal of this research was to develop a high-level computer language called Tensoral, designed to remove this barrier. The Tensoral language provides a framework in which efficient generic data manipulations can be easily coded and implemented. First of all, Tensoral is general. The fundamental objects in Tensoral represent tensor fields and the operators that act on them. The numerical implementation of these tensors and operators is completely and flexibly programmable. New mathematical constructs and operators can be easily added to the Tensoral system. Tensoral is compatible with existing languages. Tensoral tensor operations co-exist in a natural way with a host language, which may be any sufficiently powerful computer language such as Fortran, C, or Vectoral. Tensoral is very high-level. Tensor operations in Tensoral typically act on entire databases (i.e., arrays) at one time and may, therefore, correspond to many lines of code in a conventional language. Tensoral is efficient. Tensoral is a compiled language. Database manipulations are simplified, optimized, and scheduled by the compiler, eventually resulting in efficient machine code to implement them.
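
    The claim above that a single Tensoral operation "may correspond to many lines of code in a conventional language" can be illustrated with a whole-array expression; the NumPy sketch below is only an analogy for that style, not Tensoral.

    ```python
    # Analogy for the whole-database (whole-array) style the record describes
    # (NumPy stands in; this is not Tensoral).
    import numpy as np

    u = np.random.rand(64, 64, 64)   # a 3-D field, e.g. one velocity component
    dx = 1.0 / 64

    # One whole-array statement: centred difference along axis 0 with periodic
    # wrap-around; the equivalent element-by-element loops would take many
    # lines in a conventional scalar language.
    dudx = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / (2 * dx)
    print(dudx.shape)
    ```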

  16. Vectorized algorithms for spiking neural network simulation.

    PubMed

    Brette, Romain; Goodman, Dan F M

    2011-06-01

    High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
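
    The vector-based style described above (the core of the Brian simulator) can be sketched as a leaky integrate-and-fire update applied to an entire population at once; the parameter values below are arbitrary, and this is not Brian's actual code.

    ```python
    # Vectorized leaky integrate-and-fire update over a whole population
    # (arbitrary parameters; not Brian's implementation).
    import numpy as np

    n, dt, tau = 1000, 0.1e-3, 20e-3          # neurons, time step (s), time constant (s)
    v_rest, v_reset, v_thresh = -70e-3, -65e-3, -50e-3
    v = np.full(n, v_rest)
    drive = 25e-3 * np.random.rand(n)         # arbitrary constant input per neuron

    for step in range(1000):
        v += dt / tau * (v_rest - v + drive)  # one vector op updates every neuron
        spiked = v >= v_thresh                # boolean mask of spiking neurons
        v[spiked] = v_reset                   # vectorized reset

    print(int(spiked.sum()), "neurons spiked on the last step")
    ```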

  17. Brain-computer interface with language model-electroencephalography fusion for locked-in syndrome.

    PubMed

    Oken, Barry S; Orhan, Umut; Roark, Brian; Erdogmus, Deniz; Fowler, Andrew; Mooney, Aimee; Peters, Betts; Miller, Meghan; Fried-Oken, Melanie B

    2014-05-01

    Some noninvasive brain-computer interface (BCI) systems are currently available for locked-in syndrome (LIS), but none have incorporated a statistical language model during text generation. The objective was to begin to address the communication needs of individuals with LIS using a noninvasive BCI that involves rapid serial visual presentation (RSVP) of symbols and a unique classifier with electroencephalography (EEG) and language model fusion. The RSVP Keyboard was developed with several unique features. Individual letters are presented at 2.5 per second. Computer classification of letters as targets or nontargets based on EEG is performed using machine learning that incorporates a language model for letter prediction via Bayesian fusion, enabling targets to be presented only 1 to 4 times. Nine participants with LIS and 9 healthy controls were enrolled. After screening, subjects first calibrated the system and then completed a series of balanced word generation mastery tasks that were designed with 5 incremental levels of difficulty, which increased by selecting phrases for which the utility of the language model decreased naturally. Six participants with LIS and 9 controls completed the experiment. All LIS participants successfully mastered spelling at level 1, and one subject achieved level 5. Six of 9 control participants achieved level 5. Individuals who have incomplete LIS may benefit from an EEG-based BCI system, which relies on EEG classification and a statistical language model. Steps to further improve the system are discussed.
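
    The EEG-plus-language-model fusion described above has the general Bayesian form posterior ∝ likelihood × prior over candidate letters. The Python sketch below shows that combination with made-up numbers; it is not the RSVP Keyboard's classifier.

    ```python
    # Generic sketch of fusing EEG evidence with a language-model prior over
    # candidate letters (made-up numbers; not the RSVP Keyboard classifier).
    import numpy as np

    letters = np.array(list("ETAO"))                    # tiny candidate set
    lm_prior = np.array([0.40, 0.25, 0.20, 0.15])       # p(letter | typed context)
    eeg_likelihood = np.array([0.10, 0.70, 0.10, 0.10]) # p(EEG | letter is target)

    posterior = lm_prior * eeg_likelihood
    posterior /= posterior.sum()                        # normalised Bayesian fusion

    best = letters[np.argmax(posterior)]
    print(dict(zip(letters, posterior.round(3))), "->", best)
    ```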

  18. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    ERIC Educational Resources Information Center

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  19. Interactive Supercomputing’s Star-P Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelman, Alan; Husbands, Parry; Leibman, Steve

    2006-09-19

    The thesis of this extended abstract is simple. High productivity comes from high level infrastructures. To measure this, we introduce a methodology that goes beyond the tradition of timing software in serial and tuned parallel modes. We perform a classroom productivity study involving 29 students who have written a homework exercise in a low level language (MPI message passing) and a high level language (Star-P with MATLAB client). Our conclusions indicate what perhaps should be of little surprise: (1) the high level language is always far easier on the students than the low level language. (2) The early versions of the high level language perform inadequately compared to the tuned low level language, but later versions substantially catch up. Asymptotically, the analogy must hold that message passing is to high level language parallel programming as assembler is to high level environments such as MATLAB, Mathematica, Maple, or even Python. We follow the Kepner method that correctly realizes that traditional speedup numbers without some discussion of the human cost of reaching these numbers can fail to reflect the true human productivity cost of high performance computing. Traditional data compares low level message passing with serial computation. With the benefit of a high level language system in place, in our case Star-P running with MATLAB client, and with the benefit of a large data pool: 29 students, each running the same code ten times on three evolutions of the same platform, we can methodically demonstrate the productivity gains. To date we are not aware of any high level system as extensive and interoperable as Star-P, nor are we aware of an experiment of this kind performed with this volume of data.

  20. Parallel language constructs for tensor product computations on loosely coupled architectures

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Vanrosendale, John

    1989-01-01

    Distributed memory architectures offer high levels of performance and flexibility, but have proven awkward to program. Current languages for nonshared memory architectures provide a relatively low level programming environment and are poorly suited to modular programming and to the construction of libraries. A set of language primitives designed to allow the specification of parallel numerical algorithms at a higher level is described. The focus is on tensor product array computations, a simple but important class of numerical algorithms. The problem of programming 1-D kernel routines, such as parallel tridiagonal solvers, is addressed first, and then how such parallel kernels can be combined to form parallel tensor product algorithms is examined.
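
    The composition of 1-D kernels into tensor product algorithms mentioned above amounts to sweeping the same one-dimensional routine along each axis of a multidimensional array; the NumPy sketch below uses a stand-in smoothing kernel in place of a tridiagonal solve and is not the record's proposed language primitives.

    ```python
    # Tensor-product composition: apply the same 1-D kernel along each axis
    # (stand-in smoothing kernel; not the record's language primitives).
    import numpy as np

    def smooth_1d(line):
        # stand-in 1-D kernel (a tridiagonal solve would slot in here)
        padded = np.pad(line, 1, mode="edge")
        return (padded[:-2] + 2 * padded[1:-1] + padded[2:]) / 4.0

    def tensor_product_apply(field, kernel_1d):
        # sweep the 1-D kernel along every axis in turn
        for axis in range(field.ndim):
            field = np.apply_along_axis(kernel_1d, axis, field)
        return field

    grid = np.random.rand(32, 32, 32)
    print(tensor_product_apply(grid, smooth_1d).shape)
    ```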

  1. The Contribution of CALL to Advanced-Level Foreign/Second Language Instruction

    ERIC Educational Resources Information Center

    Burston, Jack; Arispe, Kelly

    2016-01-01

    This paper evaluates the contribution of instructional technology to advanced-level foreign/second language learning (AL2) over the past thirty years. It is shown that the most salient feature of AL2 practice and associated Computer-Assisted Language Learning (CALL) research are their rarity and restricted nature. Based on an analysis of four…

  2. The Language of Man. Book 4.

    ERIC Educational Resources Information Center

    Littell, Joseph Fletcher, Ed.

    Book 4 of "The Language of Man" series contains articles which deal with semantics, levels of language (including informal, formal and technical language, jargon, and gobbledygook), the hidden persuaders (advertising of merchandise and political candidates), and communications of the future (including the computer and other mass media now being…

  3. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
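
    As a small, concrete example of the high-level Python-embedded style this record describes, the canonical ProjectQ one-qubit program is sketched below, based on ProjectQ's published examples; exact API details may vary between versions.

    ```python
    # Minimal ProjectQ-style program: a one-qubit circuit compiled down to the
    # backend targeted by MainEngine (based on ProjectQ's published examples;
    # API details may differ between versions).
    from projectq import MainEngine
    from projectq.ops import H, Measure

    eng = MainEngine()              # default engine list compiles to a simulator
    qubit = eng.allocate_qubit()

    H | qubit                       # put the qubit into superposition
    Measure | qubit                 # measure in the computational basis
    eng.flush()                     # send the compiled circuit to the backend

    print("measured:", int(qubit))  # 0 or 1 with equal probability
    ```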

  4. BEYSIK: Language description and handbook for programmers (system for the collective use of the Institute of Space Research, Academy of Sciences USSR)

    NASA Technical Reports Server (NTRS)

    Orlov, I. G.

    1979-01-01

    The BASIC algorithmic language is described, and a guide is presented for the programmer using the language interpreter. The high-level algorithmic language BASIC is a problem-oriented programming language intended for the solution of computational and engineering problems.

  5. The Bilingual Language Interaction Network for Comprehension of Speech

    ERIC Educational Resources Information Center

    Shook, Anthony; Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can…

  6. Use of an Automatic Problem Generator to Teach Basic Skills in a First Course in Assembly Language.

    ERIC Educational Resources Information Center

    Benander, Alan; And Others

    1989-01-01

    Discussion of the use of computer aided instruction (CAI) and instructional software in college level courses highlights an automatic problem generator, AUTOGEN, that was written for computer science students learning assembly language. Design of the software is explained, and student responses are reported. (nine references) (LRW)

  7. Forthon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grote, D. P.

    Forthon generates links between Fortran and Python. Python is a high level, object oriented, interactive and scripting language that allows a flexible and versatile interface to computational tools. The Forthon package generates the necessary wrapping code which allows access to the Fortran database and to the Fortran subroutines and functions. This provides a development package where the computationally intensive parts of a code can be written in efficient Fortran, and the high level controlling code can be written in the much more versatile Python language.

  8. AV Programs for Computer Know-How.

    ERIC Educational Resources Information Center

    Mandell, Phyllis Levy

    1985-01-01

    Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programming, computer graphics, robotics, and computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…

  9. A Computational Model of Linguistic Humor in Puns.

    PubMed

    Kao, Justine T; Levy, Roger; Goodman, Noah D

    2016-07-01

    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures, ambiguity and distinctiveness, derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human judgments of funniness. Moreover, within a set of puns, the distinctiveness measure distinguishes exceptionally funny puns from mediocre ones. Our work is the first, to our knowledge, to integrate a computational model of general language understanding and humor theory to quantitatively predict humor at a fine-grained level. We present it as an example of a framework for applying models of language processing to understand higher level linguistic and cognitive phenomena. © 2015 The Authors. Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
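
    The two measures named above can be given a generic form: ambiguity as the entropy of the distribution over meanings given the sentence, and distinctiveness as a divergence between the evidence distributions supporting each meaning. The Python sketch below computes both from made-up probabilities; it is a simplified reading of the paper's measures, not a reimplementation.

    ```python
    # Generic entropy/divergence versions of "ambiguity" and "distinctiveness"
    # (made-up probabilities; a simplified reading, not the paper's code).
    import numpy as np
    from scipy.stats import entropy

    # p(meaning | sentence) for the two readings of a pun-like sentence
    p_meaning = np.array([0.55, 0.45])
    ambiguity = entropy(p_meaning, base=2)   # high when both readings stay plausible

    # how strongly each word supports meaning 1 vs meaning 2 (normalised)
    support_m1 = np.array([0.50, 0.10, 0.30, 0.10])
    support_m2 = np.array([0.10, 0.50, 0.10, 0.30])
    distinctiveness = 0.5 * (entropy(support_m1, support_m2, base=2)
                             + entropy(support_m2, support_m1, base=2))

    print(round(ambiguity, 3), round(distinctiveness, 3))
    ```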

  10. The Bilingual Language Interaction Network for Comprehension of Speech*

    PubMed Central

    Marian, Viorica

    2013-01-01

    During speech comprehension, bilinguals co-activate both of their languages, resulting in cross-linguistic interaction at various levels of processing. This interaction has important consequences for both the structure of the language system and the mechanisms by which the system processes spoken language. Using computational modeling, we can examine how cross-linguistic interaction affects language processing in a controlled, simulated environment. Here we present a connectionist model of bilingual language processing, the Bilingual Language Interaction Network for Comprehension of Speech (BLINCS), wherein interconnected levels of processing are created using dynamic, self-organizing maps. BLINCS can account for a variety of psycholinguistic phenomena, including cross-linguistic interaction at and across multiple levels of processing, cognate facilitation effects, and audio-visual integration during speech comprehension. The model also provides a way to separate two languages without requiring a global language-identification system. We conclude that BLINCS serves as a promising new model of bilingual spoken language comprehension. PMID:24363602
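
    The dynamic, self-organizing maps from which BLINCS builds its levels of processing can be illustrated with a single self-organizing-map training step; the NumPy sketch below is a generic SOM update, not the BLINCS model itself.

    ```python
    # One generic self-organizing-map (SOM) update step of the kind used to
    # build BLINCS's levels (generic SOM; not the BLINCS model itself).
    import numpy as np

    rng = np.random.default_rng(0)
    grid_h, grid_w, dim = 10, 10, 8
    weights = rng.random((grid_h, grid_w, dim))   # weight vector per map unit
    coords = np.dstack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                   indexing="ij"))

    def som_step(weights, x, lr=0.1, sigma=1.5):
        # 1. find the best-matching unit (BMU) for input x
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # 2. pull every unit toward x, weighted by grid distance to the BMU
        grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
        influence = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))[..., None]
        return weights + lr * influence * (x - weights)

    weights = som_step(weights, rng.random(dim))
    print(weights.shape)
    ```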

  11. Self-Regulated Learning in Learning Environments with Pedagogical Agents that Interact in Natural Language

    ERIC Educational Resources Information Center

    Graesser, Arthur; McNamara, Danielle

    2010-01-01

    This article discusses the occurrence and measurement of self-regulated learning (SRL) both in human tutoring and in computer tutors with agents that hold conversations with students in natural language and help them learn at deeper levels. One challenge in building these computer tutors is to accommodate, encourage, and scaffold SRL because these…

  12. The Effect of De-Contextualized Multimedia Software on Taiwanese College Level Students' English Vocabulary Development

    ERIC Educational Resources Information Center

    Yan, Yaw-liang

    2010-01-01

    Computer technology has been applied widely as an educational tool in second language learning for a long time. There have been many studies discussing the application of computer technology to different aspects in second language learning. However, the learning effect of both de-contextualized multimedia software and sound gloss on second…

  13. Assessing Metacognitive Knowledge in Web-Based Call: A Neural Network Approach

    ERIC Educational Resources Information Center

    Yeh, Siou-Wen; Lo, Jia-Jiunn

    2005-01-01

    The assessment of learners' metacognitive knowledge level is crucial when developing computer-assisted language learning systems. Currently, many systems assess learners' metacognitive knowledge level with pre-instructional questionnaires or metacognitive interviews. However, learners with limited language proficiency may be at a disadvantage in…

  14. Formal specification of human-computer interfaces

    NASA Technical Reports Server (NTRS)

    Auernheimer, Brent

    1990-01-01

    A high-level formal specification of a human computer interface is described. Previous work is reviewed and the ASLAN specification language is described. Top-level specifications written in ASLAN for a library and a multiwindow interface are discussed.

  15. Educational and interpersonal uses of home computers by adolescents with and without specific language impairment.

    PubMed

    Durkin, Kevin; Conti-Ramsden, Gina; Walker, Allan; Simkin, Zoë

    2009-03-01

    Many uses of new media entail processing language content, yet little is known about the relationship between language ability and media use in young people. This study compares educational versus interpersonal uses of home computers in adolescents with and without a history of specific language impairment (SLI). Participants were 55 17-year-olds with SLI and 72 typically developing peers. Measures of frequency and ease of computer use were obtained as well as assessments of participants' psycholinguistic skills. Results showed a strong preference for interpersonal computer use in both groups. Virtually all participants engaged with interpersonal new media, finding them relatively easy to use. In contrast, one third of adolescents with SLI did not use educational applications during a typical week. Regression analyses revealed that lower frequency of educational use was associated with poorer language and literacy skills. However, in adolescents with SLI, this association was mediated by perceived ease of use. The findings show that language ability contributes to new media use and that adolescents with SLI are at a greater risk of low levels of engagement with educational technology.

  16. Automating software design system DESTA

    NASA Technical Reports Server (NTRS)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents a computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tool support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  17. Neurobiological roots of language in primate audition: common computational properties.

    PubMed

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias; Small, Steven L; Rauschecker, Josef P

    2015-03-01

    Here, we present a new perspective on an old question: how does the neurobiology of human language relate to brain systems in nonhuman primates? We argue that higher-order language combinatorics, including sentence and discourse processing, can be situated in a unified, cross-species dorsal-ventral streams architecture for higher auditory processing, and that the functions of the dorsal and ventral streams in higher-order language processing can be grounded in their respective computational properties in primate audition. This view challenges an assumption, common in the cognitive sciences, that a nonhuman primate model forms an inherently inadequate basis for modeling higher-level language functions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Type Specialization in Aldor

    NASA Astrophysics Data System (ADS)

    Dragan, Laurentiu; Watt, Stephen M.

    Computer algebra in scientific computation squarely faces the dilemma of natural mathematical expression versus efficiency. While higher-order programming constructs and parametric polymorphism provide a natural and expressive language for mathematical abstractions, they can come at a considerable cost. We investigate how deeply nested type constructions may be optimized to achieve performance similar to that of hand-tuned code written in lower-level languages.

  19. Computer-Mediated Input, Output and Feedback in the Development of L2 Word Recognition from Speech

    ERIC Educational Resources Information Center

    Matthews, Joshua; Cheng, Junyu; O'Toole, John Mitchell

    2015-01-01

    This paper reports on the impact of computer-mediated input, output and feedback on the development of second language (L2) word recognition from speech (WRS). A quasi-experimental pre-test/treatment/post-test research design was used involving three intact tertiary level English as a Second Language (ESL) classes. Classes were either assigned to…

  20. Developing a Multimedia, Computer-Based Spanish Placement Test

    ERIC Educational Resources Information Center

    Zabaleta, Francisco

    2007-01-01

    Placing students of a foreign language within a basic language program constitutes an ongoing problem, particularly for large university departments when they have many incoming freshmen and transfer students. This article outlines the author's experience designing and piloting a language placement test for a university level Spanish program. The…

  1. Anxiety in Language Testing: The APTIS Case

    ERIC Educational Resources Information Center

    Valencia Robles, Jeannette de Fátima

    2017-01-01

    The requirement of holding a diploma which certifies proficiency level in a foreign language is constantly increasing in academic and working environments. Computer-based testing has become a prevailing tendency for these and other educational purposes. Each year large numbers of students take online language tests everywhere in the world. In…

  2. A Counterexample Guided Abstraction Refinement Framework for Verifying Concurrent C Programs

    DTIC Science & Technology

    2005-05-24

    source code are routinely executed. The source code is written in languages ranging from C/C++/Java to ML/OCaml. These languages differ not only in...from the difficulty to model computer programs—due to the complexity of programming languages as compared to hardware description languages—to...intermediate specification language lying between high-level Statechart-like formalisms and transition systems. Actions are encoded as changes in

  3. Focus-on-Form and EFL Learners' Language Development in Synchronous Computer-Mediated Communication: Task-Based Interactions

    ERIC Educational Resources Information Center

    Eslami, Zohreh R.; Kung, Wan-Tsai

    2016-01-01

    The purpose of this study was to explore the occurrence of incidental focus-on-form and its effect on subsequent second language (L2) production of learners of different dyads in an online task-based language learning context. The participants included Taiwanese learners of English as a foreign language at different proficiency levels, and native…

  4. Individualized Teaching and Autonomous Learning: Developing EFL Learners' CLA in a Web-Based Language Skills Training System

    ERIC Educational Resources Information Center

    Lu, Zhihong; Wen, Fuan; Li, Ping

    2012-01-01

    Teaching listening and speaking in English in China has been given top priority on the post-secondary level. This has led to the question of how learners develop communicative language ability (CLA) effectively in computer-assisted language learning (CALL) environments. The authors demonstrate a self-developed language skill learning system with…

  5. P-KIMMO: A Prolog Implementation of the Two Level Model.

    ERIC Educational Resources Information Center

    Lee, Kang-Hyuk

    Implementation of a computer-based model for morphological analysis and synthesis of language, entitled P-KIMMO, is discussed. The model was implemented in Quintus Prolog on a Sun Workstation and exported to a Macintosh computer. This model has two levels of morphophonological representation, lexical and surface levels, associated by…

  6. The Layer-Oriented Approach to Declarative Languages for Biological Modeling

    PubMed Central

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language. PMID:22615554
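
    The mapping this record describes, from a domain-level declarative description of an ionic current to generated simulation code, can be sketched very roughly as a parameter dictionary turned into an ODE right-hand side; the sketch below uses a standard leak-current form and is not the paper's actual language or semantic transformations.

    ```python
    # Rough sketch of a declarative-to-computational mapping for an ionic
    # current (standard leak-current form; not the paper's language).
    from scipy.integrate import solve_ivp

    leak_current = {            # declarative, domain-level layer
        "type": "ohmic",
        "g": 0.3,               # conductance (mS/cm^2)
        "e_rev": -54.4,         # reversal potential (mV)
    }
    cm = 1.0                    # membrane capacitance (uF/cm^2)

    def make_rhs(current, i_ext=10.0):
        # computational layer generated from the declarative description
        def rhs(t, y):
            v = y[0]
            i_ion = current["g"] * (v - current["e_rev"])
            return [(i_ext - i_ion) / cm]
        return rhs

    sol = solve_ivp(make_rhs(leak_current), (0.0, 50.0), [-65.0], max_step=0.1)
    print(round(sol.y[0, -1], 2), "mV at t = 50 ms")
    ```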

  7. The layer-oriented approach to declarative languages for biological modeling.

    PubMed

    Raikov, Ivan; De Schutter, Erik

    2012-01-01

    We present a new approach to modeling languages for computational biology, which we call the layer-oriented approach. The approach stems from the observation that many diverse biological phenomena are described using a small set of mathematical formalisms (e.g. differential equations), while at the same time different domains and subdomains of computational biology require that models are structured according to the accepted terminology and classification of that domain. Our approach uses distinct semantic layers to represent the domain-specific biological concepts and the underlying mathematical formalisms. Additional functionality can be transparently added to the language by adding more layers. This approach is specifically concerned with declarative languages, and throughout the paper we note some of the limitations inherent to declarative approaches. The layer-oriented approach is a way to specify explicitly how high-level biological modeling concepts are mapped to a computational representation, while abstracting away details of particular programming languages and simulation environments. To illustrate this process, we define an example language for describing models of ionic currents, and use a general mathematical notation for semantic transformations to show how to generate model simulation code for various simulation environments. We use the example language to describe a Purkinje neuron model and demonstrate how the layer-oriented approach can be used for solving several practical issues of computational neuroscience model development. We discuss the advantages and limitations of the approach in comparison with other modeling language efforts in the domain of computational biology and outline some principles for extensible, flexible modeling language design. We conclude by describing in detail the semantic transformations defined for our language.

  8. Abstractions for DNA circuit design.

    PubMed

    Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew

    2012-03-07

    DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.
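
    The reaction kinetics such designs are analysed against can be sketched with a simple mass-action model of one displacement step (input + gate -> output + waste); the rate constant and concentrations below are arbitrary, and this is not the paper's strand-displacement language.

    ```python
    # Mass-action sketch of a single strand-displacement step,
    # input + gate -> output + waste (arbitrary rate and concentrations;
    # not the paper's programming language).
    from scipy.integrate import solve_ivp

    k = 1e4                      # bimolecular rate constant (1/M/s), arbitrary
    y0 = [1e-7, 2e-7, 0.0]       # [input, gate, output] in M

    def rhs(t, y):
        inp, gate, out = y
        rate = k * inp * gate
        return [-rate, -rate, rate]

    sol = solve_ivp(rhs, (0.0, 3600.0), y0, max_step=10.0)
    print(f"output after 1 h: {sol.y[2, -1]:.2e} M")
    ```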

  9. Discourse Markers in Italian as L2 in Face to Face vs. Computer Mediated Settings

    ERIC Educational Resources Information Center

    De Marco, Anna; Leone, Paola

    2013-01-01

    This pilot study aims to highlight a) differences in pragmatic function and distribution of discourse markers (DMs) in computer mediated and face to face (FtF) settings and b) any correlation of DM uses and language competence. The data have been collected by video-recording and analysing three speakers of Italian L2 (language level competence:…

  10. Bilingual Academic Computer and Technology Oriented Program: Project COM-TECH. Evaluation Section Report. OREA Report.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Plotkin, Donna

    Project COM-TECH offered bilingual individualized instruction, using an enrichment approach, to Spanish- and Haitian Creole-speaking students with varying levels of English and native language proficiency and academic preparation. The program provided supplementary instruction in English as a Second Language (ESL); Native Language Arts (NLA); and…

  11. Improving robustness and computational efficiency using modern C++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paterno, M.; Kowalkowski, J.; Green, C.

    2014-01-01

    For nearly two decades, the C++ programming language has been the dominant programming language for experimental HEP. The publication of ISO/IEC 14882:2011, the current version of the international standard for the C++ programming language, makes available a variety of language and library facilities for improving the robustness, expressiveness, and computational efficiency of C++ code. However, much of the C++ written by the experimental HEP community does not take advantage of the features of the language to obtain these benefits, either due to lack of familiarity with these features or concern that these features must somehow be computationally inefficient. In this paper, we address some of the features of modern C++, and show how they can be used to make programs that are both robust and computationally efficient. We compare and contrast simple yet realistic examples of some common implementation patterns in C, currently-typical C++, and modern C++, and show (when necessary, down to the level of generated assembly language code) the quality of the executable code produced by recent C++ compilers, with the aim of allowing the HEP community to make informed decisions on the costs and benefits of the use of modern C++.

  12. Developing Computer-Interactive Tape Exercises for Intermediate-Level Business French.

    ERIC Educational Resources Information Center

    Garnett, Mary Anne

    One college language teacher developed computer-interactive audiotape exercises for an intermediate-level class in business French. The project was undertaken because of a need for appropriate materials at that level. The use of authoring software permitted development of a variety of activity types, including multiple-choice, fill-in-the-blank,…

  13. Bibliography to Computer and History Instruction.

    ERIC Educational Resources Information Center

    Feichtl, Franz

    1994-01-01

    Presents a bibliography of 28 items related to the use of computers in history instruction. Focuses on materials dealing with secondary and college-level instruction. Includes 11 German-language items. (CFR)

  14. Cross-Compiler for Modeling Space-Flight Systems

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    Ripples is a computer program that makes it possible to specify arbitrarily complex space-flight systems in an easy-to-learn, high-level programming language and to have the specification automatically translated into LibSim, which is a text-based computing language in which such simulations are implemented. LibSim is a very powerful simulation language, but learning it takes considerable time, and it requires that models of systems and their components be described at a very low level of abstraction. To construct a model in LibSim, it is necessary to go through a time-consuming process that includes modeling each subsystem, including defining its fault-injection states, input and output conditions, and the topology of its connections to other subsystems. Ripples makes it possible to describe the same models at a much higher level of abstraction, thereby enabling the user to build models faster and with fewer errors. Ripples can be executed on a variety of computers and operating systems, and can be supplied in either source code or binary form. It must be run in conjunction with a Lisp compiler.

  15. Learning a Generative Probabilistic Grammar of Experience: A Process-Level Model of Language Acquisition

    ERIC Educational Resources Information Center

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-01-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given…

  16. Functional language and data flow architectures

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Patel, D. R.; Lang, T.

    1983-01-01

    This is a tutorial article about language and architecture approaches for highly concurrent computer systems based on the functional style of programming. The discussion concentrates on the basic aspects of functional languages, and sequencing models such as data-flow, demand-driven and reduction which are essential at the machine organization level. Several examples of highly concurrent machines are described.

  17. What about Me?: Individual Self-Assessment by Skill and Level of Language Instruction

    ERIC Educational Resources Information Center

    Brantmeier, Cindy; Vanderplank, Robert; Strube, Michael

    2012-01-01

    In an investigation with advanced language learners, Brantmeier [Brantmeier, C., 2006. "Advanced L2 learners and reading placement: self-assessment, computer based testing, and subsequent performance." "System" 34 (1), 15-35.] reports that self-assessment (SA) of second language (L2) reading ability, when measured with self-rated scales, is not an…

  18. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array, and a RISC core. The pixel-parallel PE array is responsible for transferring, storing, and processing raw image data in SIMD fashion with its own programming language. The RPs form a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a large amount of computation in few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the operation of the whole system and executes some high-level image processing algorithms. A simplified AHB bus serves as the system bus connecting the major components. A programming language and corresponding tool chain for this computational image sensor are also developed.
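
    As a hedged analogy for the pixel-parallel (SIMD) processing described above, the NumPy sketch below applies the same neighbour-difference and threshold operation to every pixel of a frame at once; the operation and sizes are invented for illustration.

        # Pixel-parallel (SIMD-style) low-level operation sketched with NumPy:
        # every "processing element" applies the same neighbour difference and
        # threshold to its pixel in lock-step. Purely illustrative.
        import numpy as np

        frame = np.random.randint(0, 256, size=(64, 64)).astype(np.int32)

        # Edge-like response: difference against the pixel one column to the left
        edges = np.abs(frame - np.roll(frame, 1, axis=1))

        # Global threshold applied across the whole array at once
        binary = (edges > 40).astype(np.uint8)
        print(binary.sum(), "pixels above threshold")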

  19. The computational structural mechanics testbed architecture. Volume 1: The language

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the first set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP, and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 1 presents the basic elements of the CLAMP language and is intended for all users.

  20. Evaluating Computer-Based Test Accommodations for English Learners

    ERIC Educational Resources Information Center

    Roohr, Katrina Crotts; Sireci, Stephen G.

    2017-01-01

    Test accommodations for English learners (ELs) are intended to reduce the language barrier and level the playing field, allowing ELs to better demonstrate their true proficiencies. Computer-based accommodations for ELs show promising results for leveling that field while also providing us with additional data to more closely investigate the…

  1. Computer-Mediated Online Language Learning Programmes vs. Tailor-Made Teaching Practices at University Level: A Foul Relationship or a Perfect Match?

    ERIC Educational Resources Information Center

    Brudermann, Cédric A.

    2015-01-01

    This paper explores the potential of digital learning environments to address current issues related to individualised instruction and the expansion of educational opportunities in English as a foreign language at university level. To do so, an applied linguistics-centred research endeavour was carried out. This reflection led to the…

  2. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    NASA Astrophysics Data System (ADS)

    Christiansen, Henning

    2004-09-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural science or humanities. It has been developed for a course that integrates theoretical material on computer languages and abstract machines with practical programming techniques. Prolog, used as a meta-language for describing language issues, is the central instrument in the approach: Formal descriptions become running prototypes that are easy and appealing to test and modify, and can be extended into analyzers, interpreters, and tools such as tracers and debuggers. Experience shows a high learning curve, especially when the principles are extended into a learning-by-doing approach in which the students develop such descriptions themselves from an informal introduction.

  3. Parallel language constructs for tensor product computations on loosely coupled architectures

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush; Van Rosendale, John

    1989-01-01

    A set of language primitives designed to allow the specification of parallel numerical algorithms at a higher level is described. The authors focus on tensor product array computations, a simple but important class of numerical algorithms. They consider first the problem of programming one-dimensional kernel routines, such as parallel tridiagonal solvers, and then look at how such parallel kernels can be combined to form parallel tensor product algorithms.
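
    The tensor product structure referred to above can be made concrete with a small NumPy sketch (not the paper's language primitives): a two-dimensional operator is assembled from one-dimensional tridiagonal operators via Kronecker products.

        # Tensor-product assembly of a 2-D operator from 1-D pieces: the 2-D
        # discrete Laplacian is kron(A, I) + kron(I, A) for a 1-D tridiagonal A.
        # A sketch of the underlying mathematics, not the proposed primitives.
        import numpy as np

        n = 4
        A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1-D tridiagonal
        I = np.eye(n)
        L2d = np.kron(A, I) + np.kron(I, A)                    # 2-D operator

        print(L2d.shape)   # (16, 16)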

  4. Lexical Analysis to Enhance Man/Machine Interaction: Simplifying and Improving the Creation of Software. Final Report.

    ERIC Educational Resources Information Center

    Hutchins, Sandra E.

    By analyzing the lexicology of natural language (English or other languages as they are commonly spoken or written), as compared to computer languages, this study explored the extent to which syntactic and semantic levels of linguistic analysis can be implemented and effectively used on microcomputers. In Phase I of the study, the Apple IIe with…

  5. Students as Producers and Collaborators: Exploring the Use of Padlets and Videos in MFL Teaching

    ERIC Educational Resources Information Center

    de Berg, Anna

    2016-01-01

    In today's digital age, Languages graduates need more specific skills than fluency in the foreign language and intercultural competence. Employers expect from all applicants a high level of computer literacy and a set of soft skills such as creativity or the ability to solve problems and work on team projects. Modern Foreign Language (MFL)…

  6. The Effect of Task Types on Foreign Language Learners' Social Presence in Synchronous Computer Mediated Communication (SCMC)

    ERIC Educational Resources Information Center

    Ko, Chao-Jung

    2016-01-01

    This study aims to clarify the relationship between task types and foreign language learners' social presence (SP) in text-based SCMC learning modes. The participants in this study comprised 38 high-intermediate level English as a foreign language (EFL) learners from different disciplines of a university in Taiwan. They were divided into two…

  7. Scheduling language and algorithm development study. Volume 1: Study summary and overview

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A high level computer programming language and a program library were developed to be used in writing programs for scheduling complex systems such as the space transportation system. The objectives and requirements of the study are summarized and unique features of the specified language and program library are described and related to the why of the objectives and requirements.

  8. Implications of the Java language on computer-based patient records.

    PubMed

    Pollard, D; Kucharz, E; Hammond, W E

    1996-01-01

    The growth of the utilization of the World Wide Web (WWW) as a medium for the delivery of computer-based patient records (CBPR) has created a new paradigm in which clinical information may be delivered. Until recently the authoring tools and environment for application development on the WWW have been limited to Hyper Text Markup Language (HTML) utilizing common gateway interface scripts. While, at times, this provides an effective medium for the delivery of CBPR, it is a less than optimal solution. The server-centric dynamics and low levels of interactivity do not provide for a robust application which is required in a clinical environment. The emergence of Sun Microsystems' Java language is a solution to the problem. In this paper we examine the Java language and its implications to the CBPR. A quantitative and qualitative assessment was performed. The Java environment is compared to HTML and Telnet CBPR environments. Qualitative comparisons include level of interactivity, server load, client load, ease of use, and application capabilities. Quantitative comparisons include data transfer time delays. The Java language has demonstrated promise for delivering CBPRs.

  9. DECUS Proceedings; Fall 1971, Papers and Presentations.

    ERIC Educational Resources Information Center

    1971

    Papers and presentations at the 1971 symposium of the Digital Equipment Computer Users Society (DECUS) are presented. The papers deal with medical and physiological applications, computer graphics, simulation education, small computer executive systems, management information tools, data acquisition systems, and high level languages. Although many…

  10. Computer enhancement through interpretive techniques

    NASA Technical Reports Server (NTRS)

    Foster, G.; Spaanenburg, H. A. E.; Stumpf, W. E.

    1972-01-01

    The improvement in the usage of the digital computer through the use of the technique of interpretation rather than the compilation of higher ordered languages was investigated by studying the efficiency of coding and execution of programs written in FORTRAN, ALGOL, PL/I and COBOL. FORTRAN was selected as the high level language for examining programs which were compiled, and A Programming Language (APL) was chosen for the interpretive language. It is concluded that APL is competitive, not because it and the algorithms being executed are well written, but rather because the batch processing is less efficient than has been admitted. There is not a broad base of experience founded on trying different implementation strategies which have been targeted at open competition with traditional processing methods.
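
    The essence of the interpretive technique under study is that each operation is decoded at run time rather than translated once by a compiler. The Python sketch below shows this for a small hypothetical expression language; it illustrates the technique only, not APL or the systems compared in the study.

        # Tiny interpreter: every expression node is decoded as it is executed,
        # the defining cost/benefit trade-off of interpretation. The expression
        # language here is hypothetical.

        def evaluate(expr, env):
            if isinstance(expr, (int, float)):
                return expr
            if isinstance(expr, str):
                return env[expr]
            op, lhs, rhs = expr                 # e.g. ("+", left, right)
            a, b = evaluate(lhs, env), evaluate(rhs, env)
            return {"+": a + b, "-": a - b, "*": a * b, "/": a / b}[op]

        program = ("*", ("+", "x", 2), "y")     # (x + 2) * y
        print(evaluate(program, {"x": 3, "y": 5}))   # 25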

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    Open Computing Language (OpenCL) is a high-level language that enables software programmers to explore Field Programmable Gate Arrays (FPGAs) for application acceleration. The Intel FPGA software development kit (SDK) for OpenCL allows a user to specify applications at a high level and explore the performance of low-level hardware acceleration. In this report, we present the FPGA performance and power consumption results of the single-precision floating-point vector add OpenCL kernel using the Intel FPGA SDK for OpenCL on the Nallatech 385A FPGA board. The board features an Arria 10 FPGA. We evaluate the FPGA implementations using the compute unit duplication and kernel vectorization optimization techniques. On the Nallatech 385A FPGA board, the maximum compute kernel bandwidth we achieve is 25.8 GB/s, approximately 76% of the peak memory bandwidth. The power consumption of the FPGA device when running the kernels ranges from 29W to 42W.

  12. Computer-Aided Transformation of PDE Models: Languages, Representations, and a Calculus of Operations

    DTIC Science & Technology

    2016-01-05

    discretizations. We maintain that what is clear at the mathematical level should be equally clear in computation. In this small STIR project, we separate the ... concerns of describing and discretizing such models by defining an input language representing PDE, including steady-state and transient, linear and ... solvers, such as [8, 9], focused on the solvers themselves and particular families of discretizations (e.g. finite elements), and now it is natural to

  13. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.

  14. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-06-01

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.

  15. An abstract specification language for Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1985-01-01

    Markov models can be used to compute the reliability of virtually any fault tolerant system. However, the process of delineating all of the states and transitions in a model of complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high level language is described in a nonformal manner and illustrated by example.
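
    As a worked numerical sketch of the kind of model such a language describes (with invented rates, and far simpler than any system requiring the abstract language), the Python snippet below integrates the state equations of a two-unit parallel system without repair and reports the probability of system failure by the mission time.

        # Small Markov reliability model: states are (2 units up, 1 unit up,
        # system failed); lam is the per-unit failure rate. All numbers are
        # invented for illustration.

        lam = 1e-4            # failures per hour, per unit
        dt, T = 0.1, 1000.0   # integration step and mission time, hours

        p = [1.0, 0.0, 0.0]   # start with both units working
        for _ in range(int(T / dt)):
            d0 = -2 * lam * p[0]
            d1 = 2 * lam * p[0] - lam * p[1]
            d2 = lam * p[1]
            p = [p[0] + d0 * dt, p[1] + d1 * dt, p[2] + d2 * dt]

        print("P(system failed by T) =", p[2])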

  16. Scheduling language and algorithm development study. Appendix: Study approach and activity summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The approach and organization of the study to develop a high level computer programming language and a program library are presented. The algorithm and problem modeling analyses are summarized. The approach used to identify and specify the capabilities required in the basic language is described. Results of the analyses used to define specifications for the scheduling module library are presented.

  17. An abstract language for specifying Markov reliability models

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1986-01-01

    Markov models can be used to compute the reliability of virtually any fault tolerant system. However, the process of delineating all of the states and transitions in a model of complex system can be devastatingly tedious and error-prone. An approach to this problem is presented utilizing an abstract model definition language. This high level language is described in a nonformal manner and illustrated by example.

  18. Trends in Programming Languages for Neuroscience Simulations

    PubMed Central

    Davison, Andrew P.; Hines, Michael L.; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing. PMID:20198154

  19. Trends in programming languages for neuroscience simulations.

    PubMed

    Davison, Andrew P; Hines, Michael L; Muller, Eilif

    2009-01-01

    Neuroscience simulators allow scientists to express models in terms of biological concepts, without having to concern themselves with low-level computational details of their implementation. The expressiveness, power and ease-of-use of the simulator interface is critical in efficiently and accurately translating ideas into a working simulation. We review long-term trends in the development of programmable simulator interfaces, and examine the benefits of moving from proprietary, domain-specific languages to modern dynamic general-purpose languages, in particular Python, which provide neuroscientists with an interactive and expressive simulation development environment and easy access to state-of-the-art general-purpose tools for scientific computing.

  20. Interpretive computer simulator for the NASA Standard Spacecraft Computer-2 (NSSC-2)

    NASA Technical Reports Server (NTRS)

    Smith, R. S.; Noland, M. S.

    1979-01-01

    An Interpretive Computer Simulator (ICS) for the NASA Standard Spacecraft Computer-II (NSSC-II) was developed as a code verification and testing tool for the Annular Suspension and Pointing System (ASPS) project. The simulator is written in the higher level language PASCAL and implemented on the CDC CYBER series computer system. It is supported by a meta-assembler, a linkage loader for the NSSC-II, and a utility library to meet the application requirements. The architectural design of the NSSC-II is that of an IBM System/360 (S/360) and supports all but four instructions of the S/360 standard instruction set. The structural design of the ICS is described with emphasis on the design differences between it and the NSSC-II hardware. The program flow is diagrammed, with the function of each procedure being defined; the instruction implementation is discussed in broad terms; and the instruction timings used in the ICS are listed. An example of the steps required to process an assembly level language program on the ICS is included. The example illustrates the control cards necessary to assemble, load, and execute assembly language code; the sample program to be executed; the executable load module produced by the loader; and the resulting output produced by the ICS.
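
    The heart of any interpretive computer simulator is a fetch-decode-execute loop over the target machine's instruction set. The Python sketch below shows such a loop for a tiny invented instruction set; it is not the S/360-derived instruction set of the NSSC-II.

        # Core of an interpretive simulator: fetch, decode, and execute each
        # instruction of a tiny hypothetical machine (not the NSSC-II set).

        def run(program):
            regs, pc = {"R0": 0, "R1": 0}, 0
            while pc < len(program):
                op, *args = program[pc]
                if op == "LOAD":          # LOAD reg, constant
                    regs[args[0]] = args[1]
                elif op == "ADD":         # ADD dst, src
                    regs[args[0]] += regs[args[1]]
                elif op == "HALT":
                    break
                pc += 1
            return regs

        print(run([("LOAD", "R0", 7), ("LOAD", "R1", 5), ("ADD", "R0", "R1"), ("HALT",)]))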

  1. ANTLR Tree Grammar Generator and Extensions

    NASA Technical Reports Server (NTRS)

    Craymer, Loring

    2005-01-01

    A computer program implements two extensions of ANTLR (Another Tool for Language Recognition), which is a set of software tools for translating source codes between different computing languages. ANTLR supports predicated-LL(k) lexer and parser grammars, a notation for annotating parser grammars to direct tree construction, and predicated tree grammars. [LL(k) signifies left-right, leftmost derivation with k tokens of look-ahead, referring to certain characteristics of a grammar.] One of the extensions is a syntax for tree transformations. The other extension is the generation of tree grammars from annotated parser or input tree grammars. These extensions can simplify the process of generating source-to-source language translators and they make possible an approach, called "polyphase parsing," to translation between computing languages. The typical approach to translator development is to identify high-level semantic constructs such as "expressions," "declarations," and "definitions" as fundamental building blocks in the grammar specification used for language recognition. The polyphase approach is to lump ambiguous syntactic constructs during parsing and then disambiguate the alternatives in subsequent tree transformation passes. Polyphase parsing is believed to be useful for generating efficient recognizers for C++ and other languages that, like C++, have significant ambiguities.

  2. Introduction to Computational Physics for Undergraduates

    NASA Astrophysics Data System (ADS)

    Zubairi, Omair; Weber, Fridolin

    2018-03-01

    This is an introductory textbook on computational methods and techniques intended for undergraduates at the sophomore or junior level in the fields of science, mathematics, and engineering. It provides an introduction to programming languages such as FORTRAN 90/95/2000 and covers numerical techniques such as differentiation, integration, root finding, and data fitting. The textbook also entails the use of the Linux/Unix operating system and other relevant software such as plotting programs, text editors, and mark up languages such as LaTeX. It includes multiple homework assignments.
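
    The numerical techniques listed above are easily illustrated; the short Python sketch below (the textbook itself works in FORTRAN 90/95/2000) demonstrates central-difference differentiation, trapezoidal integration, and bisection root finding on elementary functions.

        # Differentiation, integration, and root finding in a few lines.
        import math

        f = math.sin

        # Central-difference derivative of sin at x = 1 (exact value: cos(1))
        h = 1e-5
        deriv = (f(1 + h) - f(1 - h)) / (2 * h)

        # Trapezoidal integration of sin over [0, pi] (exact value: 2)
        n = 1000
        xs = [math.pi * i / n for i in range(n + 1)]
        integral = sum((f(xs[i]) + f(xs[i + 1])) / 2 * (xs[i + 1] - xs[i]) for i in range(n))

        # Bisection root of cos(x) on [0, 2] (exact root: pi/2)
        a, b = 0.0, 2.0
        for _ in range(50):
            m = (a + b) / 2
            a, b = (m, b) if math.cos(a) * math.cos(m) > 0 else (a, m)

        print(deriv, integral, (a + b) / 2)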

  3. Designing Distance Learning Tasks to Help Maximize Vocabulary Development

    ERIC Educational Resources Information Center

    Loucky, John Paul

    2012-01-01

    Task-based language learning using the benefits of online computer-assisted language learning (CALL) can be effective for rapid vocabulary expansion, especially when target vocabulary has been pre-arranged into bilingual categories under simpler, common Semantic Field Keywords. Results and satisfaction levels for both Chinese English majors and…

  4. Bridging Levels of Analysis: Learning, Information Theory, and the Lexicon

    ERIC Educational Resources Information Center

    Dye, Melody

    2017-01-01

    While information theory is typically considered in the context of modern computing and engineering, its core mathematical principles provide a potentially useful lens through which to consider human language. Like the artificial communication systems such principles were invented to describe, natural languages involve a sender and receiver, a…
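
    A core quantity in this line of work is the information content of the lexicon. As a toy illustration (with invented counts, not data from the article), the Python snippet below computes the Shannon entropy of a small word-frequency distribution.

        # Shannon entropy (bits per word) of a toy word-frequency distribution.
        import math

        counts = {"the": 50, "cat": 10, "sat": 10, "on": 20, "mat": 10}
        total = sum(counts.values())

        entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
        print(f"lexicon entropy: {entropy:.3f} bits per word")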

  5. Real-Time MENTAT programming language and architecture

    NASA Technical Reports Server (NTRS)

    Grimshaw, Andrew S.; Silberman, Ami; Liu, Jane W. S.

    1989-01-01

    Real-time MENTAT, a programming environment designed to simplify the task of programming real-time applications in distributed and parallel environments, is described. It is based on the same data-driven computation model and object-oriented programming paradigm as MENTAT. It provides an easy-to-use mechanism to exploit parallelism, language constructs for the expression and enforcement of timing constraints, and run-time support for scheduling and executing real-time programs. The real-time MENTAT programming language is an extended C++. The extensions are added to facilitate automatic detection of data flow and generation of data flow graphs, to express the timing constraints of individual granules of computation, and to provide scheduling directives for the runtime system. A high-level view of the real-time MENTAT system architecture and programming language constructs is provided.

  6. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  7. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.

  8. Functional Programming in Computer Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Loren James; Davis, Marion Kei

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
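
    Lambda calculus can be illustrated directly in any language with first-class functions. The Python sketch below encodes Church numerals and addition purely with functions; it illustrates the underlying idea only and has no connection to the STG compiler work described above.

        # Church numerals: arithmetic encoded entirely with functions, the
        # central idea of lambda calculus.

        zero = lambda f: lambda x: x
        succ = lambda n: lambda f: lambda x: f(n(f)(x))
        add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

        def to_int(n):
            return n(lambda k: k + 1)(0)

        two = succ(succ(zero))
        three = succ(two)
        print(to_int(add(two)(three)))   # 5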

  9. User-Defined Data Distributions in High-Level Programming Languages

    NASA Technical Reports Server (NTRS)

    Diaconescu, Roxana E.; Zima, Hans P.

    2006-01-01

    One of the characteristic features of today's high performance computing systems is a physically distributed memory. Efficient management of locality is essential for meeting key performance requirements for these architectures. The standard technique for dealing with this issue has involved the extension of traditional sequential programming languages with explicit message passing, in the context of a processor-centric view of parallel computation. This has resulted in complex and error-prone assembly-style codes in which algorithms and communication are inextricably interwoven. This paper presents a high-level approach to the design and implementation of data distributions. Our work is motivated by the need to improve the current parallel programming methodology by introducing a paradigm supporting the development of efficient and reusable parallel code. This approach is currently being implemented in the context of a new programming language called Chapel, which is designed in the HPCS project Cascade.
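
    The essential question any data distribution answers is which process owns a given global index and where it lives locally. The Python sketch below shows the standard block-distribution mapping; it is a sketch of the concept only, not Chapel syntax or the paper's mechanism for user-defined distributions.

        # Block data distribution: map a global index to (owner, local offset)
        # for n elements spread over p processes. Concept sketch only.

        def block_owner(i, n, p):
            block = -(-n // p)          # ceil(n / p) elements per process
            return i // block, i % block

        n, p = 10, 3
        for i in range(n):
            print(i, "->", block_owner(i, n, p))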

  10. The computational structural mechanics testbed architecture. Volume 5: The Input-Output Manager DMGASP

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the fifth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 5 describes the low-level data management component of the NICE software. It is intended only for advanced programmers involved in maintenance of the software.

  11. The growth of language: Universal Grammar, experience, and principles of computation.

    PubMed

    Yang, Charles; Crain, Stephen; Berwick, Robert C; Chomsky, Noam; Bolhuis, Johan J

    2017-10-01

    Human infants develop language remarkably rapidly and without overt instruction. We argue that the distinctive ontogenesis of child language arises from the interplay of three factors: domain-specific principles of language (Universal Grammar), external experience, and properties of non-linguistic domains of cognition including general learning mechanisms and principles of efficient computation. We review developmental evidence that children make use of hierarchically composed structures ('Merge') from the earliest stages and at all levels of linguistic organization. At the same time, longitudinal trajectories of development show sensitivity to the quantity of specific patterns in the input, which suggests the use of probabilistic processes as well as inductive learning mechanisms that are suitable for the psychological constraints on language acquisition. By considering the place of language in human biology and evolution, we propose an approach that integrates principles from Universal Grammar and constraints from other domains of cognition. We outline some initial results of this approach as well as challenges for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. The Sizing and Optimization Language, (SOL): Computer language for design problems

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1988-01-01

    The Sizing and Optimization Language (SOL), a new high-level, special-purpose computer language, was developed to expedite application of numerical optimization to design problems and to make the process less error prone. SOL utilizes the ADS optimization software and provides a clear, concise syntax for describing an optimization problem, the OPTIMIZE description, which closely parallels the mathematical description of the problem. SOL offers language statements which can be used to model a design mathematically, with subroutines or code logic, and with existing FORTRAN routines. In addition, SOL provides error checking and clear output of the optimization results. Because of these language features, SOL is best suited to model and optimize a design concept when the model consists of mathematical expressions written in SOL. For such cases, SOL's unique syntax and error checking can be fully utilized. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler, runtime library routines, and a SOL reference manual.
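
    The kind of sizing problem an OPTIMIZE description expresses can be sketched with a deliberately crude grid search in Python; this is not SOL syntax and not the ADS optimizer SOL uses, and all numbers are invented: minimize the weight w = 5t of a member subject to the stress constraint 100/t <= 25.

        # Crude grid search for a toy sizing problem: minimize weight 5*t
        # subject to stress 100/t <= 25. Numbers invented for illustration.

        best = None
        for i in range(1, 1001):
            t = i * 0.01                      # candidate thickness
            weight, stress = 5.0 * t, 100.0 / t
            if stress <= 25.0 and (best is None or weight < best[1]):
                best = (t, weight)

        print("optimal thickness, weight:", best)   # expect (4.0, 20.0)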

  13. F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming

    NASA Technical Reports Server (NTRS)

    DiNucci, David C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called F-Nets, which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).

  14. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  15. Predictors of Language Gains Among School-Age Children With Language Impairment in the Public Schools.

    PubMed

    Justice, Laura M; Jiang, Hui; Logan, Jessica A; Schmitt, Mary Beth

    2017-06-10

    This study aimed to identify child-level characteristics that predict gains in language skills for children with language impairment who were receiving therapy within the public schools. The therapy provided represented business-as-usual speech/language treatment provided by speech-language pathologists in the public schools. The sample included 272 kindergartners and first-graders with language impairment who participated in a larger study titled "Speech-Therapy Experiences in the Public Schools." Multilevel regression analyses were applied to examine the extent to which select child-level characteristics, including age, nonverbal cognition, memory, phonological awareness, vocabulary, behavior problems, and self-regulation, predicted children's language gains over an academic year. Pratt indices were computed to establish the relative importance of the predictors of interest. Phonological awareness and vocabulary skill related to greater gains in language skills, and together they accounted for nearly 70% of the explained variance, or 10% of total variance at child level. Externalizing behavior, nonverbal cognition, and age were also potentially important predictors of language gains. This study significantly advances our understanding of the characteristics of children that may contribute to their language gains while receiving therapy in the public schools. Researchers can explore how these characteristics may serve to moderate treatment outcomes, whereas clinicians can assess how these characteristics may factor into understanding treatment responses.
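
    The relative-importance measure used here, the Pratt index, is simple to compute: for predictor j it is the product of the standardized regression coefficient and the zero-order correlation with the outcome, divided by R-squared. The Python sketch below applies the formula to simulated data (not the study's data).

        # Pratt indices on simulated data: importance_j = beta_j * r_j / R^2,
        # where beta_j is the standardized coefficient and r_j the zero-order
        # correlation with the outcome. The indices sum to one.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 200
        x1 = rng.normal(size=n)
        x2 = 0.5 * x1 + rng.normal(size=n)
        y = 0.6 * x1 + 0.3 * x2 + rng.normal(size=n)

        def standardize(v):
            return (v - v.mean()) / v.std()

        X = np.column_stack([standardize(x1), standardize(x2)])
        z = standardize(y)

        beta, *_ = np.linalg.lstsq(X, z, rcond=None)   # standardized coefficients
        r = np.array([np.corrcoef(X[:, j], z)[0, 1] for j in range(X.shape[1])])
        r2 = float(beta @ r)                           # R^2 of the standardized model

        pratt = beta * r / r2
        print("Pratt indices:", pratt, "sum:", pratt.sum())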

  16. Rich Language Analysis for Counterterrorism

    NASA Astrophysics Data System (ADS)

    Guidère, Mathieu; Howard, Newton; Argamon, Shlomo

    Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.

  17. Bilingual parallel programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Overbeek, R.

    1990-01-01

    Numerous experiments have demonstrated that computationally intensive algorithms support adequate parallelism to exploit the potential of large parallel machines. Yet successful parallel implementations of serious applications are rare. The limiting factor is clearly programming technology. None of the approaches to parallel programming that have been proposed to date -- whether parallelizing compilers, language extensions, or new concurrent languages -- seem to adequately address the central problems of portability, expressiveness, efficiency, and compatibility with existing software. In this paper, we advocate an alternative approach to parallel programming based on what we call bilingual programming. We present evidence that this approach provides an effective solution to parallel programming problems. The key idea in bilingual programming is to construct the upper levels of applications in a high-level language while coding selected low-level components in low-level languages. This approach permits the advantages of a high-level notation (expressiveness, elegance, conciseness) to be obtained without the cost in performance normally associated with high-level approaches. In addition, it provides a natural framework for reusing existing code.
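
    The bilingual idea can be seen in miniature by keeping the high-level driver in an expressive language while delegating the numerically intensive kernel to compiled code; the Python sketch below uses NumPy's compiled implementation as the "low-level" half. This illustrates the general argument only, not the specific language pair used by the authors.

        # High-level orchestration in Python, inner kernel in compiled code
        # (NumPy's C implementation). Timing comparison is illustrative.
        import time
        import numpy as np

        def dot_pure_python(a, b):
            return sum(x * y for x, y in zip(a, b))

        a = np.random.rand(1_000_000)
        b = np.random.rand(1_000_000)

        t0 = time.perf_counter()
        slow = dot_pure_python(a, b)
        t1 = time.perf_counter()
        fast = float(a @ b)
        t2 = time.perf_counter()

        print(f"pure Python: {t1 - t0:.3f}s   compiled kernel: {t2 - t1:.3f}s")
        print("results agree:", abs(slow - fast) < 1e-6 * fast)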

  18. Interacting domain-specific languages with biological problem solving environments

    NASA Astrophysics Data System (ADS)

    Cickovski, Trevor M.

    Iteratively developing a biological model and verifying results with lab observations has become standard practice in computational biology. This process is currently facilitated by biological Problem Solving Environments (PSEs), multi-tiered and modular software frameworks which traditionally consist of two layers: a computational layer written in a high level language using design patterns, and a user interface layer which hides its details. Although PSEs have proven effective, they still enforce some communication overhead between biologists refining their models through repeated comparison with experimental observations in vitro or in vivo, and programmers actually implementing model extensions and modifications within the computational layer. I illustrate the use of biological Domain-Specific Languages (DSLs) as a middle-level PSE tier to ameliorate this problem by providing experimentalists with the ability to iteratively test and develop their models using a higher degree of expressive power compared to a graphical interface, while saving the requirement of general purpose programming knowledge. I develop two radically different biological DSLs: XML-based BIOLOGO will model biological morphogenesis using a cell-centered stochastic cellular automaton and translate into C++ modules for an object-oriented PSE COMPUCELL3D, and MDLab will provide a set of high-level Python libraries for running molecular dynamics simulations, using wrapped functionality from the C++ PSE PROTOMOL. I describe each language in detail, including its roles within the larger PSE and its expressibility in terms of representable phenomena, and a discussion of observations from users of the languages. Moreover, I will use these studies to draw general conclusions about biological DSL development, including dependencies upon the goals of the corresponding PSE, strategies, and tradeoffs.
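
    As a hedged sketch of how an XML-based biological DSL can be lowered into a simulation configuration (the tags below are invented for illustration and are not actual BIOLOGO syntax), the Python snippet parses a small model description with the standard library's ElementTree.

        # Lowering a small XML model description into a configuration dict.
        # Tags and attributes are invented; this is not BIOLOGO.
        import xml.etree.ElementTree as ET

        source = """
        <model name="sorting">
          <cellType id="Light" adhesion="2.0"/>
          <cellType id="Dark"  adhesion="11.0"/>
          <lattice size="100" dimensions="2"/>
        </model>
        """

        root = ET.fromstring(source)
        config = {
            "name": root.get("name"),
            "cell_types": {ct.get("id"): float(ct.get("adhesion"))
                           for ct in root.findall("cellType")},
            "lattice": {k: int(v) for k, v in root.find("lattice").attrib.items()},
        }
        print(config)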

  19. A School-College Consultation Model for Integration of Technology and Whole Language in Elementary Science Instruction. Field Study Report No. 1991.A.BAL, Christopher Columbus Consortium Project.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    A study examined a new collaborative consultation process to enhance the classroom implementation of whole language science units that make use of computers and multimedia resources. The overall program was divided into three projects, two at the fifth-grade level and one at the third grade level. Each project was staffed by a team of one…

  20. GSFC Systems Test and Operation Language (STOL) functional requirements and language description

    NASA Technical Reports Server (NTRS)

    Desjardins, R.; Hall, G.; Mcguire, J.; Merwarth, P.; Mocarsky, W.; Truszkowski, W.; Villasenor, A.; Brosi, F.; Burch, P.; Carey, D.

    1978-01-01

    The Systems Tests and Operation Language (STOL) provides the means for user communication with payloads, applications programs, and other ground system elements. It is a systems operation language that enables an operator or user to communicate a command to a computer system. The system interprets each high level language directive from the user and performs the indicated action, such as executing a program, printing out a snapshot, or sending a payload command. This document presents the following: (1) required language features and implementation considerations; (2) basic capabilities; (3) telemetry, command, and input/output directives; (4) procedure definition and control; (5) listing, extension, and STOL nucleus capabilities.

  1. Intelligent Computer-Assisted Language Learning.

    ERIC Educational Resources Information Center

    Harrington, Michael

    1996-01-01

    Introduces the field of intelligent computer assisted language learning (ICALL) and relates it to current practice in computer assisted language learning (CALL) and second language learning. Points out that ICALL applies expertise from artificial intelligence and the computer and cognitive sciences to the development of language learning…

  2. Computer-Based Writing and Paper-Based Writing: A Study of Beginning-Level and Intermediate-Level Chinese Learners' Writing

    ERIC Educational Resources Information Center

    Kang, Hana

    2011-01-01

    Chinese writing is one of the most difficult challenges for Chinese learners whose first language writing system is alphabetic letters. Chinese teachers have incorporated computer-based writing into their teaching in the attempt to reduce the difficulties of writing in Chinese, with a particular emphasis on composing (as opposed to simply writing…

  3. Simulation, Design Abstraction, and SystemC

    ERIC Educational Resources Information Center

    Harcourt, Ed

    2007-01-01

    SystemC is a system-level design and simulation language based on C++. We've been using SystemC for computer organization and design projects for the past several years. Because SystemC is embedded in C++ it contains the powerful abstraction mechanisms of C++ not found in traditional hardware description languages, such as support for…

  4. Exploring the Past. "A Senior Literacy Model." Final Report.

    ERIC Educational Resources Information Center

    Greater Erie Community Action Committee, PA.

    A program of basic language/writing skills was designed to enhance the literacy levels of 24 multicultural seniors, aged 65 or older, who were recruited from senior centers throughout Erie County, Pennsylvania. Computer literacy and basic word processing skills were taught along with basic language/writing skills in a nonthreatening learning…

  5. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.

    2017-01-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564

  6. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  7. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  8. Cognitive biases, linguistic universals, and constraint-based grammar learning.

    PubMed

    Culbertson, Jennifer; Smolensky, Paul; Wilson, Colin

    2013-07-01

    According to classical arguments, language learning is both facilitated and constrained by cognitive biases. These biases are reflected in linguistic typology-the distribution of linguistic patterns across the world's languages-and can be probed with artificial grammar experiments on child and adult learners. Beginning with a widely successful approach to typology (Optimality Theory), and adapting techniques from computational approaches to statistical learning, we develop a Bayesian model of cognitive biases and show that it accounts for the detailed pattern of results of artificial grammar experiments on noun-phrase word order (Culbertson, Smolensky, & Legendre, 2012). Our proposal has several novel properties that distinguish it from prior work in the domains of linguistic theory, computational cognitive science, and machine learning. This study illustrates how ideas from these domains can be synthesized into a model of language learning in which biases range in strength from hard (absolute) to soft (statistical), and in which language-specific and domain-general biases combine to account for data from the macro-level scale of typological distribution to the micro-level scale of learning by individuals. Copyright © 2013 Cognitive Science Society, Inc.
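
    The interaction of a soft prior bias with observed data can be sketched with a much simpler Bayesian update than the model in the paper. In the hedged Python example below, a learner weighs two hypothetical word-order "grammars" against ten observed utterances; every number is invented.

        # Minimal Bayesian update over two hypothetical grammars, combining a
        # soft prior bias with observed data. All numbers are invented and the
        # model is far simpler than the paper's.

        prior = {"A": 0.6, "B": 0.4}               # slight bias towards grammar A
        p_consistent = {"A": 0.9, "B": 0.3}        # P(A-consistent utterance | grammar)
        observed = {"A-consistent": 7, "B-consistent": 3}

        posterior = {}
        for g in prior:
            p = prior[g]
            p *= p_consistent[g] ** observed["A-consistent"]
            p *= (1 - p_consistent[g]) ** observed["B-consistent"]
            posterior[g] = p

        total = sum(posterior.values())
        posterior = {g: p / total for g, p in posterior.items()}
        print(posterior)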

  9. A strategy for automatically generating programs in the lucid programming language

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1987-01-01

    A strategy for automatically generating and verifying simple computer programs is described. The programs are specified by a precondition and a postcondition in predicate calculus. The programs generated are in the Lucid programming language, a high-level, data-flow language known for its attractive mathematical properties and ease of program verification. The Lucid programming language is described, and the automatic program generation strategy is presented and applied to several example problems.

  10. Productive High Performance Parallel Programming with Auto-tuned Domain-Specific Embedded Languages

    DTIC Science & Technology

    2013-01-02

    ... multi-timestep computations by blocking in both time and space.

  11. A data management system for engineering and scientific computing

    NASA Technical Reports Server (NTRS)

    Elliot, L.; Kunii, H. S.; Browne, J. C.

    1978-01-01

    Data elements and relationship definition capabilities for this data management system are explicitly tailored to the needs of engineering and scientific computing. System design was based upon studies of data management problems currently being handled through explicit programming. The system-defined data element types include real scalar numbers, vectors, arrays and special classes of arrays such as sparse arrays and triangular arrays. The data model is hierarchical (tree structured). Multiple views of data are provided at two levels. Subschemas provide multiple structural views of the total data base and multiple mappings for individual record types are supported through the use of a REDEFINES capability. The data definition language and the data manipulation language are designed as extensions to FORTRAN. Examples of the coding of real problems taken from existing practice in the data definition language and the data manipulation language are given.

  12. HGML: a hypertext guideline markup language.

    PubMed Central

    Hagerty, C. G.; Pickens, D.; Kulikowski, C.; Sonnenberg, F.

    2000-01-01

    Existing text-based clinical practice guidelines can be difficult to put into practice. While a growing number of such documents have gained acceptance in the medical community and contain a wealth of valuable information, the time required to digest them is substantial. Yet the expressive power, subtlety and flexibility of natural language pose challenges when designing computer tools that will help in their application. At the same time, formal computer languages typically lack such expressiveness and the effort required to translate existing documents into these languages may be costly. We propose a method based on the mark-up concept for converting text-based clinical guidelines into a machine-operable form. This allows existing guidelines to be manipulated by machine, and viewed in different formats at various levels of detail according to the needs of the practitioner, while preserving their originally published form. PMID:11079898

  13. The Fifth Generation. An annotated bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, M.; Bramer, D.

    The Japanese Fifth Generation Computer System project constitutes a radical reappraisal of the functions which an advanced computer system should be able to perform, the programming languages needed to implement such functions, and the machine architectures suitable for supporting the chosen languages. The book guides the reader through the ever-growing literature on the project and the international responses to it, including the United Kingdom Government's Alvey Program and the MCC Program in the United States. Evaluative abstracts are given for books, journal articles, unpublished reports, and material at both overview and technical levels.

  14. An Overview of Computer-Based Natural Language Processing.

    ERIC Educational Resources Information Center

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  15. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
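
    The simulator's own OpenCL source is not shown in this record; the following is a minimal, hedged sketch of the host-side pattern the abstract describes (vendor-independent enumeration of CPUs and GPUs plus a trivially parallel kernel), assuming the pyopencl bindings and at least one OpenCL runtime are installed. The kernel is a placeholder and is not the photon-transport code.

      import numpy as np
      import pyopencl as cl  # assumes the pyopencl package and an OpenCL runtime are installed

      # Enumerate the heterogeneous devices (CPUs and GPUs) that a load balancer
      # could partition photon batches across.
      for platform in cl.get_platforms():
          for device in platform.get_devices():
              print(platform.name, "->", device.name)

      ctx = cl.create_some_context()
      queue = cl.CommandQueue(ctx)

      # Placeholder kernel: one work-item per element, standing in for the
      # one-thread-per-photon parallelism of Monte Carlo photon transport.
      program = cl.Program(ctx, """
      __kernel void scale(__global const float *x, __global float *y) {
          int gid = get_global_id(0);
          y[gid] = 2.0f * x[gid];
      }
      """).build()

      x = np.arange(1024, dtype=np.float32)
      mf = cl.mem_flags
      x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
      y_buf = cl.Buffer(ctx, mf.WRITE_ONLY, x.nbytes)

      program.scale(queue, x.shape, None, x_buf, y_buf)

      y = np.empty_like(x)
      cl.enqueue_copy(queue, y, y_buf)
      print(y[:4])  # expected: [0. 2. 4. 6.]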

  16. An approach to separating the levels of hierarchical structure building in language and mathematics.

    PubMed

    Makuuchi, Michiru; Bahlmann, Jörg; Friederici, Angela D

    2012-07-19

    We aimed to dissociate two levels of hierarchical structure building in language and mathematics, namely 'first-level' (the build-up of hierarchical structure with externally given elements) and 'second-level' (the build-up of hierarchical structure with internally represented elements produced by first-level processes). Using functional magnetic resonance imaging, we investigated these processes in three domains: sentence comprehension, arithmetic calculation (using Reverse Polish notation, which gives two operands followed by an operator) and a working memory control task. All tasks required the build-up of hierarchical structures at the first- and second-level, resulting in a similar computational hierarchy across language and mathematics, as well as in a working memory control task. Using a novel method that estimates the difference in the integration cost for conditions of different trial durations, we found an anterior-to-posterior functional organization in the prefrontal cortex, according to the level of hierarchy. Common to all domains, the ventral premotor cortex (PMv) supports first-level hierarchy building, while the dorsal pars opercularis (POd) subserves second-level hierarchy building, with lower activation for language compared with the other two tasks. These results suggest that the POd and the PMv support domain-general mechanisms for hierarchical structure building, with the POd being uniquely efficient for language.

  17. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    The technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or ... level graphics language. ... As user interfaces have become more sophisticated, they have become harder to develop. ...

  18. The preliminary SOL (Sizing and Optimization Language) reference manual

    NASA Technical Reports Server (NTRS)

    Lucas, Stephen H.; Scotti, Stephen J.

    1989-01-01

    The Sizing and Optimization Language, SOL, a high-level special-purpose computer language, has been developed to expedite application of numerical optimization to design problems and to make the process less error-prone. This document is a reference manual for those wishing to write SOL programs. SOL is presently available for DEC VAX/VMS systems. A SOL package is available which includes the SOL compiler and runtime library routines. An overview of SOL appears in NASA TM 100565.

  19. A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL)

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Owen, Jeffrey E.

    1988-01-01

    A direct-execution parallel architecture for the Advanced Continuous Simulation Language (ACSL) is presented which overcomes the traditional disadvantages of simulations executed on a digital computer. The incorporation of parallel processing allows the mapping of simulations into a digital computer to be done in the same inherently parallel manner as they are currently mapped onto an analog computer. The direct-execution format maximizes the efficiency of the executed code since the need for a high level language compiler is eliminated. Resolution is greatly increased over that which is available with an analog computer without the sacrifice in execution speed normally expected with digital computer simulations. Although this report covers all aspects of the new architecture, key emphasis is placed on the processing element configuration and the microprogramming of the ACSL constructs. The execution times for all ACSL constructs are computed using a model of a processing element based on the AMD 29000 CPU and the AMD 29027 FPU. The increase in execution speed provided by parallel processing is exemplified by comparing the derived execution times of two ACSL programs with the execution times for the same programs executed on a similar sequential architecture.

  20. The Integration of a Computer-Based Early Reading Program to Increase English Language Learners' Literacy Skills

    ERIC Educational Resources Information Center

    James, Laurie

    2014-01-01

    The intention of this study was to establish if the third grade English Language Learners improved reading fluency when using the computerized Waterford Early Reading Program. This quantitative study determined the effectiveness of the Waterford Early Reading Program at two Title I elementary schools. Students not meeting Grade Level Expectations…

  1. Languages, communication potential and generalized trust in Sub-Saharan Africa: evidence based on the Afrobarometer Survey.

    PubMed

    Buzasi, Katalin

    2015-01-01

    The goal of this study is to investigate whether speaking languages other than one's home language in Sub-Saharan Africa promotes generalized trust. Based on various psychological and economic theories, a simple model is provided to illustrate how languages might shape trust through various channels. Relying on data from the Afrobarometer Project, which provides information on home and additional languages, the Index of Communication Potential (ICP) is introduced to capture the linguistic situation in the 20 sample countries. The ICP, which can be computed at any desired level of aggregation, refers to the probability that an individual can communicate with a randomly selected person in the society based on common languages. The estimated two-level hierarchical models show that, although individual-level communication potential does not seem to affect trust formation, living in an area with higher average communication potential increases the chance of exhibiting higher trust toward unknown people. Copyright © 2014 Elsevier Inc. All rights reserved.
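
    As a toy illustration of the idea behind the Index of Communication Potential (the probability that an individual can communicate with a randomly selected person through at least one shared language), the Python sketch below computes individual- and area-level values from an invented set of language repertoires; the paper's exact estimator and survey weighting may differ.

      from itertools import combinations

      # Invented language repertoires: home language plus any additional languages spoken.
      repertoires = {
          "A": {"Hausa"},
          "B": {"Hausa", "English"},
          "C": {"Yoruba", "English"},
          "D": {"Yoruba"},
      }

      def individual_icp(person):
          """Share of *other* people with whom `person` shares at least one language."""
          others = [p for p in repertoires if p != person]
          shared = sum(1 for p in others if repertoires[person] & repertoires[p])
          return shared / len(others)

      for person in repertoires:
          print(person, round(individual_icp(person), 2))

      # Area-level value: probability that a randomly selected pair can communicate.
      pairs = list(combinations(repertoires, 2))
      area_icp = sum(1 for a, b in pairs if repertoires[a] & repertoires[b]) / len(pairs)
      print("area ICP:", round(area_icp, 2))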

  2. Language evolution and human-computer interaction

    NASA Technical Reports Server (NTRS)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  3. Using stochastic language models (SLM) to map lexical, syntactic, and phonological information processing in the brain.

    PubMed

    Lopopolo, Alessandro; Frank, Stefan L; van den Bosch, Antal; Willems, Roel M

    2017-01-01

    Language comprehension involves the simultaneous processing of information at the phonological, syntactic, and lexical level. We track these three distinct streams of information in the brain by using stochastic measures derived from computational language models to detect neural correlates of phoneme, part-of-speech, and word processing in an fMRI experiment. Probabilistic language models have proven to be useful tools for studying how language is processed as a sequence of symbols unfolding in time. Conditional probabilities between sequences of words are at the basis of probabilistic measures such as surprisal and perplexity which have been successfully used as predictors of several behavioural and neural correlates of sentence processing. Here we computed perplexity from sequences of words and their parts of speech, and their phonemic transcriptions. Brain activity time-locked to each word is regressed on the three model-derived measures. We observe that the brain keeps track of the statistical structure of lexical, syntactic and phonological information in distinct areas.
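
    As a toy illustration of the word-level measures derived from such models, the Python sketch below computes surprisal (the negative base-2 logarithm of a word's conditional probability) and perplexity (2 raised to the mean surprisal) from an invented bigram table; the study itself used trained stochastic models over words, parts of speech, and phonemes.

      import math

      # Invented bigram probabilities P(next word | previous word).
      BIGRAM = {
          ("the", "dog"): 0.20,
          ("dog", "barked"): 0.10,
          ("barked", "loudly"): 0.05,
      }

      def surprisal(prev, word):
          """Surprisal in bits: -log2 P(word | prev)."""
          return -math.log2(BIGRAM[(prev, word)])

      sentence = ["the", "dog", "barked", "loudly"]
      surprisals = [surprisal(p, w) for p, w in zip(sentence, sentence[1:])]
      for (p, w), s in zip(zip(sentence, sentence[1:]), surprisals):
          print(f"P({w} | {p}) -> surprisal {s:.2f} bits")

      # Perplexity: 2 raised to the average per-word surprisal.
      perplexity = 2 ** (sum(surprisals) / len(surprisals))
      print(f"perplexity = {perplexity:.2f}")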

  4. A Python Implementation of an Intermediate-Level Tropical Circulation Model and Implications for How Modeling Science is Done

    NASA Astrophysics Data System (ADS)

    Lin, J. W. B.

    2015-12-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran, to optimize model performance but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows the order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
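
    The qtcm package itself wraps compiled Fortran routines; as a hedged, pure-Python sketch of the design idea described above (the order and choice of per-timestep subroutines held in an ordinary list that can be changed at run time, with analysis hooks interleaved), the routine and variable names below are invented stand-ins for the model's real components.

      # Stand-ins for compiled model routines; in qtcm these would be wrapped Fortran.
      def advance_dynamics(state):
          state["u"] += 0.1       # invented "wind" update

      def apply_convection(state):
          state["q"] -= 0.05      # invented "moisture" update

      def diagnostics(state):
          print("u =", round(state["u"], 2), " q =", round(state["q"], 2))

      class ToyModel:
          """A model whose per-timestep call sequence is an ordinary Python list."""
          def __init__(self):
              self.state = {"u": 0.0, "q": 1.0}
              # The order and choice of routines can be changed at run time.
              self.steps = [advance_dynamics, apply_convection]

          def run(self, n_steps):
              for _ in range(n_steps):
                  for step in self.steps:
                      step(self.state)

      model = ToyModel()
      model.run(3)

      # Reconfigure at run time: interleave a diagnostic hook, drop convection.
      model.steps = [advance_dynamics, diagnostics]
      model.run(2)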

  5. Left inferior parietal lobe engagement in social cognition and language

    PubMed Central

    Bzdok, Danilo; Hartwigsen, Gesa; Reid, Andrew; Laird, Angela R.; Fox, Peter T.; Eickhoff, Simon B.

    2017-01-01

    Social cognition and language are two core features of the human species. Despite distributed recruitment of brain regions in each mental capacity, the left parietal lobe (LPL) represents a zone of topographical convergence. The present study quantitatively summarizes hundreds of neuroimaging studies on social cognition and language. Using connectivity-based parcellation on a meta-analytically defined volume of interest (VOI), regional coactivation patterns within this VOI allowed identifying distinct subregions. Across parcellation solutions, two clusters emerged consistently in rostro-ventral and caudo-ventral aspects of the parietal VOI. Both clusters were functionally significantly associated with social-cognitive and language processing. In particular, the rostro-ventral cluster was associated with lower-level processing facets, while the caudo-ventral cluster was associated with higher-level processing facets in both mental capacities. Contrarily, in the (less stable) dorsal parietal VOI, all clusters reflected computation of general-purpose processes, such as working memory and matching tasks, that are frequently co-recruited by social or language processes. Our results hence favour a rostro-caudal distinction of lower- versus higher-level processes underlying social cognition and language in the left inferior parietal lobe. PMID:27241201

  6. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    PubMed

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  7. Effect of Computer-Delivered Testing on Achievement in a Mastery Learning Course of Study with Partial Scoring and Variable Pacing.

    ERIC Educational Resources Information Center

    Evans, Richard M.; Surkan, Alvin J.

    The recent arrival of portable computer systems with high-level language interpreters now makes it practical to rapidly develop complex testing and scoring programs. These programs permit undergraduates access, at arbitrary times, to testing as an integral part of a mastery learning strategy. Effects of introducing the computer were studied by…

  8. The non-independence discussion about cycle structure in the computer language: the final simplification of computer language in the structural design

    NASA Astrophysics Data System (ADS)

    Yang, Peilu

    2013-03-01

    The article first discusses the theory, content, development, and open questions of structured programming design. Extending this foundation, it notes that structured programming provides the computer language with three control structures--the sequence structure, the branch structure, and the cycle (loop) structure--treated as mutually independent. Through further research, the author finds that the cycle structure is not in fact independent and reaches a final simplification of computer language structural design. The author first proposes the language structures of the linear structure (I structure) and the curvilinear structure (Y structure), which give the computer language high efficiency together with simplicity during program development. The research in this article corresponds with the dualistic structure widely used in the computer field and, moreover, greatly promotes the evolution of computer languages.

  9. Neurally and mathematically motivated architecture for language and thought.

    PubMed

    Perlovsky, L I; Ilin, R

    2010-01-01

    Neural structures of interaction between thinking and language are unknown. This paper suggests a possible architecture motivated by neural and mathematical considerations. A mathematical requirement of computability imposes significant constraints on possible architectures consistent with brain neural structure and with a wealth of psychological knowledge. How does language interact with cognition? Do we think with words, or is thinking independent from language with words being just labels for decisions? Why is language learned by the age of 5 or 7, while acquisition of the knowledge needed to use this language takes a lifetime? This paper discusses hierarchical aspects of language and thought and argues that high level abstract thinking is impossible without language. We discuss a mathematical technique that can model the joint language-thought architecture, while overcoming previously encountered difficulties of computability. This architecture explains a contradiction between human ability for rational thoughtful decisions and irrationality of human thinking revealed by Tversky and Kahneman; a crucial role in this contradiction might be played by language. The proposed model resolves long-standing issues: how the brain learns correct word-object associations; why animals do not talk and think like people. We propose the role played by language emotionality in its interaction with thought. We relate the mathematical model to Humboldt's "firmness" of languages; and discuss possible influence of language grammar on its emotionality. Psychological and brain imaging experiments related to the proposed model are discussed. Future theoretical and experimental research is outlined.

  10. Neurally and Mathematically Motivated Architecture for Language and Thought

    PubMed Central

    Perlovsky, L.I; Ilin, R

    2010-01-01

    Neural structures of interaction between thinking and language are unknown. This paper suggests a possible architecture motivated by neural and mathematical considerations. A mathematical requirement of computability imposes significant constraints on possible architectures consistent with brain neural structure and with a wealth of psychological knowledge. How does language interact with cognition? Do we think with words, or is thinking independent from language with words being just labels for decisions? Why is language learned by the age of 5 or 7, while acquisition of the knowledge needed to use this language takes a lifetime? This paper discusses hierarchical aspects of language and thought and argues that high level abstract thinking is impossible without language. We discuss a mathematical technique that can model the joint language-thought architecture, while overcoming previously encountered difficulties of computability. This architecture explains a contradiction between human ability for rational thoughtful decisions and irrationality of human thinking revealed by Tversky and Kahneman; a crucial role in this contradiction might be played by language. The proposed model resolves long-standing issues: how the brain learns correct word-object associations; why animals do not talk and think like people. We propose the role played by language emotionality in its interaction with thought. We relate the mathematical model to Humboldt’s “firmness” of languages; and discuss possible influence of language grammar on its emotionality. Psychological and brain imaging experiments related to the proposed model are discussed. Future theoretical and experimental research is outlined. PMID:21673788

  11. RPython high-level synthesis

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radoslaw; Linczuk, Maciej

    2016-09-01

    The development of FPGA technology and the increasing complexity of applications in recent decades have forced compilers to move to higher abstraction levels. A compiler interprets an algorithmic description of a desired behavior written in a High-Level Language (HLL) and translates it to a Hardware Description Language (HDL). This paper presents an RPython-based High-Level Synthesis (HLS) compiler. The compiler takes configuration parameters and maps an RPython program to VHDL. The VHDL code can then be used to program FPGA chips. Compared with other technologies, FPGAs have the potential to achieve far greater performance than software as a result of omitting the fetch-decode-execute operations of general-purpose processors and introducing more parallel computation. This can be exploited by utilizing many resources at the same time. Creating parallel algorithms computed with FPGAs in pure HDL is difficult and time consuming. Implementation time can be greatly reduced with a High-Level Synthesis compiler. This article describes design methodologies and tools, the implementation, and first results of the VHDL backend created for the RPython compiler.
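
    The RPython-to-VHDL compiler is not reproduced in this record; as a heavily simplified, hypothetical sketch of what high-level synthesis output can look like, the Python fragment below emits a VHDL entity/architecture pair for a one-line combinational expression from a small description. Real HLS tools work from the program's typed intermediate representation rather than string templates; the entity name, ports, and expression here are invented.

      # Hypothetical "high-level" description of a purely combinational operation.
      NAME = "add3"
      PORTS = ["a", "b", "c"]            # 32-bit signed inputs
      EXPRESSION = "a + b + c"           # result stays 32 bits wide

      VHDL_TEMPLATE = """library ieee;
      use ieee.numeric_std.all;

      entity {name} is
        port (
      {ports};
          y : out signed(31 downto 0)
        );
      end entity {name};

      architecture rtl of {name} is
      begin
        y <= {expr};
      end architecture rtl;
      """

      def to_vhdl(name, ports, expr):
          """Emit a VHDL entity/architecture pair for a combinational expression."""
          port_lines = ";\n".join(f"    {p} : in  signed(31 downto 0)" for p in ports)
          return VHDL_TEMPLATE.format(name=name, ports=port_lines, expr=expr)

      print(to_vhdl(NAME, PORTS, EXPRESSION))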

  12. Computational Linguistics in Military Operations

    DTIC Science & Technology

    2010-01-01

    ... information dominance at the operational and tactical level of war in future warfare. Discussion: Mastering culture and language in a foreign country is decisive for understanding the operational environment. In addition, the ability to understand and speak a foreign language is a prerequisite for achieving true comprehension of an unfamiliar culture. Ongoing operations in Afghanistan and Iraq and the need to bridge the language gap have led, over the past decade, to progress in the field of Machine Translation and to the development of technical solutions to close the gap. This paper ...

  13. ONR Far East Scientific Bulletin, Volume 7, Number 2, April-June 1982,

    DTIC Science & Technology

    1982-01-01

    ... contained source code. PAL (Program Automation Language) is a system design language that automatically generates an executable program from a ... tools exist at ECL in prototype forms. Like most major computer manufacturers, they have also extended high-level languages such as FORTRAN and COBOL.

  14. HAL/S - The programming language for Shuttle

    NASA Technical Reports Server (NTRS)

    Martin, F. H.

    1974-01-01

    HAL/S is a higher order language and system, now operational, adopted by NASA for programming Space Shuttle on-board software. Program reliability is enhanced through language clarity and readability, modularity through program structure, and protection of code and data. Salient features of HAL/S include output orientation, automatic checking (with strictly enforced compiler rules), the availability of linear algebra, real-time control, a statement-level simulator, and compiler transferability (for applying HAL/S to additional object and host computers). The compiler is described briefly.

  15. Computer-Assisted Language Learning: Current Programs and Projects. ERIC Digest.

    ERIC Educational Resources Information Center

    Higgins, Chris

    For many years, foreign language teachers have used the computer to provide supplemental exercises in the instruction of foreign languages. In recent years, advances in computer technology have motivated teachers to reassess the computer and consider it a valuable part of daily foreign language learning. Innovative software programs, authoring…

  16. The PLATO System and Language Study.

    ERIC Educational Resources Information Center

    Hart, Robert S., Ed.

    1981-01-01

    This issue presents an overview of research in computer-based language instruction using the PLATO IV computer system. The following articles are presented: (1) "Language Study and the PLATO system," by R. Hart; (2) "Reflections on the Use of Computers in Second-Language Acquisition," by F. Marty; (3) "Computer-Based…

  17. Application of Computer Assisted Colposcopy Education

    DTIC Science & Technology

    2001-05-29

    Language, age, and a literacy level of seventh grade also limited the study. The comfort level of the participant with computer utilization was another ... across the age continuum. Even patients with low literacy skills benefited from the self-paced instruction and non-threatening learning environment. ... Inclusion criteria were that women had to be 18 years of age or older and eligible for military medical care. Additionally, participants had to read ...

  18. Configurable software for satellite graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartzman, P D

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during realtime operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  19. The Advantages and Disadvantages of Computer Technology in Second Language Acquisition

    ERIC Educational Resources Information Center

    Lai, Cheng-Chieh; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to discuss the advantages and disadvantages of computer technology and Computer Assisted Language Learning (CALL) programs for current second language learning. According to the National Clearinghouse for English Language Acquisition & Language Instruction Educational Programs' report (2002), more than nine million…

  20. "Leveling the Playing Field:" The Effects of Online Second Language Instruction on Student Willingness to Communicate in French

    ERIC Educational Resources Information Center

    Kissau, Scott; McCullough, Heather; Pyke, J. Garvey

    2010-01-01

    Second language (L2) instruction in the United States has in recent history experienced significant change. Instead of emphasizing grammatical accuracy, L2 teachers are now asked to focus on developing student communication skills. Furthermore, L2 classrooms are being transformed via the growth of computer-mediated instruction. Traditional,…

  1. Blended Learning Experience in a Programming Language Course and the Effect of the Thinking Styles of the Students on Success and Motivation

    ERIC Educational Resources Information Center

    Yagci, Mustafa

    2016-01-01

    High-level thinking and problem-solving skills are requirements of computer programming that most students have difficulty with. Individual differences such as motivation, attitude towards programming, the student's thinking style, and the complexity of the programming language influence students' success in programming. Thus,…

  2. Language and Discourse Analysis with Coh-Metrix: Applications from Educational Material to Learning Environments at Scale

    ERIC Educational Resources Information Center

    Dowell, Nia M. M.; Graesser, Arthur C.; Cai, Zhiqiang

    2016-01-01

    The goal of this article is to preserve and distribute the information presented at the LASI (2014) workshop on Coh-Metrix, a theoretically grounded, computational linguistics facility that analyzes texts on multiple levels of language and discourse. The workshop focused on the utility of Coh-Metrix in discourse theory and educational practice. We…

  3. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2018-03-09

    Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.

  4. High-Level Data-Abstraction System

    NASA Technical Reports Server (NTRS)

    Fishwick, P. A.

    1986-01-01

    Communication with the data-base processor is flexible and efficient. The High Level Data Abstraction (HILDA) system is a three-layer system supporting the data-abstraction features of the Intel data-base processor (DBP). The purpose of HILDA is to establish a flexible method of efficiently communicating with the DBP. The power of HILDA lies in its extensibility with regard to syntax and semantic changes. HILDA's high-level query language is readily modified. It offers powerful potential to computer sites where a DBP is attached to a DEC VAX-series computer. The HILDA system is written in Pascal and FORTRAN 77 for interactive execution.

  5. The Nebula Standard Computer Architecture,

    DTIC Science & Technology

    ... a good target for high-level languages; the designers also adopted a visibility approach in architecture design that provides more freedom for the hardware implementor while still maintaining software portability. (Author)

  6. The emergence of mind and brain: an evolutionary, computational, and philosophical approach.

    PubMed

    Mainzer, Klaus

    2008-01-01

    Modern philosophy of mind cannot be understood without recent developments in computer science, artificial intelligence (AI), robotics, neuroscience, biology, linguistics, and psychology. Classical philosophy of formal languages as well as symbolic AI assume that all kinds of knowledge must explicitly be represented by formal or programming languages. This assumption is limited by recent insights into the biology of evolution and developmental psychology of the human organism. Most of our knowledge is implicit and unconscious. It is not formally represented, but embodied knowledge, which is learnt by doing and understood by bodily interacting with changing environments. That is true not only for low-level skills, but even for high-level domains of categorization, language, and abstract thinking. The embodied mind is considered an emergent capacity of the brain as a self-organizing complex system. Actually, self-organization has been a successful strategy of evolution to handle the increasing complexity of the world. Genetic programs are not sufficient and cannot prepare the organism for all kinds of complex situations in the future. Self-organization and emergence are fundamental concepts in the theory of complex dynamical systems. They are also applied in organic computing as a recent research field of computer science. Therefore, cognitive science, AI, and robotics try to model the embodied mind in an artificial evolution. The paper analyzes these approaches in the interdisciplinary framework of complex dynamical systems and discusses their philosophical impact.

  7. Computer Assisted Language Learning. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Pennington, Martha

    2011-01-01

    Computer-assisted language learning (CALL) is an approach to language teaching and learning in which computer technology is used as an aid to the presentation, reinforcement and assessment of material to be learned, usually including a substantial interactive element. This books provides an up-to date and comprehensive overview of…

  8. The computational structural mechanics testbed architecture. Volume 2: The interface

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the third of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 3 describes the CLIP-Processor interface and related topics. It is intended only for processor developers.

  9. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  10. Are languages really independent from genes? If not, what would a genetic bias affecting language diversity look like?

    PubMed

    Dediu, Dan

    2011-04-01

    It is generally accepted that the relationship between human genes and language is very complex and multifaceted. This has its roots in the “regular” complexity governing the interplay among genes and between genes and environment for most phenotypes, but with the added layer of supraontogenetic and supra-individual processes defining culture. At the coarsest level, focusing on the species, it is clear that human-specific--but not necessarily faculty-specific--genetic factors subtend our capacity for language and a currently very productive research program is aiming at uncovering them. At the other end of the spectrum, it is uncontroversial that individual-level variations in different aspects related to speech and language have an important genetic component and their discovery and detailed characterization have already started to revolutionize the way we think about human nature. However, at the intermediate, glossogenetic/population level, the relationship becomes controversial, partly due to deeply ingrained beliefs about language acquisition and universality and partly because of confusions with a different type of gene-languages correlation due to shared history. Nevertheless, conceptual, mathematical and computational models--and, recently, experimental evidence from artificial languages and songbirds--have repeatedly shown that genetic biases affecting the acquisition or processing of aspects of language and speech can be amplified by population-level intergenerational cultural processes and made manifest either as fixed “universal” properties of language or as structured linguistic diversity. Here, I review several such models as well as the recently proposed case of a causal relationship between the distribution of tone languages and two genes related to brain growth and development, ASPM and Microcephalin, and I discuss the relevance of such genetic biasing for language evolution, change, and diversity.

  11. Neural associative memories for the integration of language, vision and action in an autonomous agent.

    PubMed

    Markert, H; Kaufmann, U; Kara Kayikci, Z; Palm, G

    2009-03-01

    Language understanding is a long-standing problem in computer science. However, the human brain is capable of processing complex languages with seemingly no difficulties. This paper shows a model for language understanding using biologically plausible neural networks composed of associative memories. The model is able to deal with ambiguities at the single-word and grammatical levels. The language system is embedded into a robot in order to demonstrate the correct semantic understanding of the input sentences by letting the robot perform corresponding actions. For that purpose, a simple neural action planning system has been combined with neural networks for visual object recognition and visual attention control mechanisms.

  12. Student Computer Dialogs Without Special Purpose Languages.

    ERIC Educational Resources Information Center

    Bork, Alfred

    The phrase "student computer dialogs" refers to interactive sessions between the student and the computer. Rather than using programing languages specifically designed for computer assisted instruction (CAI), existing general purpose languages should be emphasized in the future development of student computer dialogs, as the power and…

  13. Communicating River Level Data and Information to Stakeholders with Different Interests

    NASA Astrophysics Data System (ADS)

    Macleod, K.; Sripada, S.; Ioris, A.; Arts, K.; van der Wal, R.

    2012-12-01

    There is a need to increase the effectiveness of how river level data are communicated to a range of stakeholders with an interest in river level information, and thereby to increase the use of data collected by regulatory agencies. Currently, river level data are provided to members of the public through a web site without any formal engagement with river users having taken place. In our research project called wikiRivers, we are working with the suppliers of river level data as well as the users of these data to explore and improve, from the user perspective, how river level data and information are made available online. We are focusing on the application of natural language generation technology to create textual summaries of river level data tailored for specific interest groups. These tailored textual summaries will be presented among other modes of information presentation (e.g. maps and visualizations) with the aim of increasing communication effectiveness. Natural language generation involves developing computational models that use non-linguistic input data to produce natural language as their output. Acquiring accurate system knowledge is a key step in developing an effective natural language generation system. In this paper we set out the needs of this project based on discussions with the stakeholder who supplies the river level data and the current cyberinfrastructure, and report on what we have learned from the individuals and groups who use river level data. The stages in the wikiRivers stakeholder identification, engagement, and cyberinfrastructure development are: S1, interviews with collectors and suppliers of river level data; S2, river level data stakeholder analysis, including analysis of stakeholders' interests in individual river networks in Scotland and of what they require from the cyberinfrastructure; and S3-S5, iterative development and testing of the cyberinfrastructure and modelling of river level data with domain and stakeholder knowledge.
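
    As a toy sketch of the data-to-text idea described above (non-linguistic river level readings in, a short summary tailored to an interest group out), the Python fragment below uses invented gauge readings and simple rules; the wikiRivers natural language generation system and its audience tailoring are considerably richer.

      # Invented hourly river level readings (metres) for one gauge.
      READINGS = [0.62, 0.64, 0.71, 0.85, 1.02, 1.18]

      def summarise(readings, audience="angler"):
          """Generate a short summary tailored to a (hypothetical) interest group."""
          start, end = readings[0], readings[-1]
          change = end - start
          if change > 0.05:
              trend = "rising"
          elif change < -0.05:
              trend = "falling"
          else:
              trend = "steady"
          text = (f"The river level is {trend}, moving from {start:.2f} m "
                  f"to {end:.2f} m over the last {len(readings)} hours.")
          if audience == "angler" and trend == "rising":
              text += " Conditions may become unsuitable for wading."
          elif audience == "paddler" and trend == "rising":
              text += " Higher flows may suit experienced paddlers only."
          return text

      print(summarise(READINGS, audience="angler"))
      print(summarise(READINGS, audience="paddler"))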

  14. Computationally intensive econometrics using a distributed matrix-programming language.

    PubMed

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.
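
    The Ox extensions themselves are not shown in this record; as a hedged Python analogue of level (i), parallelization made explicit in user code while preserving the deterministic computing the authors emphasize, the sketch below splits a Monte Carlo estimate into fixed tasks, seeds each task independently, and therefore returns the same result however many worker processes are used.

      import numpy as np
      from multiprocessing import Pool

      N_TASKS = 8                 # fixed work decomposition, independent of worker count
      DRAWS_PER_TASK = 100_000

      def partial_mean(task_id):
          """One chunk of a Monte Carlo estimate, seeded per task for reproducibility."""
          rng = np.random.default_rng(seed=task_id)   # seed tied to the task, not the worker
          return rng.standard_normal(DRAWS_PER_TASK).mean()

      def mc_mean(processes):
          with Pool(processes) as pool:
              parts = pool.map(partial_mean, range(N_TASKS))
          return sum(parts) / N_TASKS

      if __name__ == "__main__":
          # Identical answer with 1 or 4 workers: determinism comes from per-task seeds.
          print(mc_mean(1))
          print(mc_mean(4))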

  15. Common data buffer

    NASA Technical Reports Server (NTRS)

    Byrne, F.

    1981-01-01

    Time-shared interface speeds data processing in distributed computer network. Two-level high-speed scanning approach routes information to buffer, portion of which is reserved for series of "first-in, first-out" memory stacks. Buffer address structure and memory are protected from noise or failed components by error correcting code. System is applicable to any computer or processing language.

  16. Teaching Computer Languages and Elementary Theory for Mixed Audiences at University Level

    ERIC Educational Resources Information Center

    Christiansen, Henning

    2004-01-01

    Theoretical issues of computer science are traditionally taught in a way that presupposes a solid mathematical background and are usually considered more or less inaccessible for students without this. An effective methodology is described which has been developed for a target group of university students with different backgrounds such as natural…

  17. x-y-recording in transmission electron microscopy. A versatile and inexpensive interface to personal computers with application to stereology.

    PubMed

    Rickmann, M; Siklós, L; Joó, F; Wolff, J R

    1990-09-01

    An interface for IBM XT/AT-compatible computers is described which has been designed to read the actual specimen stage position of electron microscopes. The complete system consists of (i) optical incremental encoders attached to the x- and y-stage drivers of the microscope, (ii) two keypads for operator input, (iii) an interface card fitted to the bus of the personal computer, (iv) a standard configuration IBM XT (or compatible) personal computer optionally equipped with a (v) HP Graphic Language controllable colour plotter. The small size of the encoders and their connection to the stage drivers by simple ribbed belts allows an easy adaptation of the system to most electron microscopes. Operation of the interface card itself is supported by any high-level language available for personal computers. By the modular concept of these languages, the system can be customized to various applications, and no computer expertise is needed for actual operation. The present configuration offers an inexpensive attachment, which covers a wide range of applications from a simple notebook to high-resolution (200-nm) mapping of tissue. Since section coordinates can be processed in real-time, stereological estimations can be derived directly "on microscope". This is exemplified by an application in which particle numbers were determined by the disector method.

  18. On the Nature of Agreement in English-French Acquisition: A Processing Investigation in the Verbal and Nominal Domains

    ERIC Educational Resources Information Center

    Renaud, Claire

    2010-01-01

    Current second language (L2) research focuses on the level of features--that is, the core elements of languages in the Minimalist Program framework. These features, involved in computations, are further divided into two types: those that indicate to which category a word belongs (i.e., interpretable features) versus those that constrain the type…

  19. Left inferior parietal lobe engagement in social cognition and language.

    PubMed

    Bzdok, Danilo; Hartwigsen, Gesa; Reid, Andrew; Laird, Angela R; Fox, Peter T; Eickhoff, Simon B

    2016-09-01

    Social cognition and language are two core features of the human species. Despite distributed recruitment of brain regions in each mental capacity, the left parietal lobe (LPL) represents a zone of topographical convergence. The present study quantitatively summarizes hundreds of neuroimaging studies on social cognition and language. Using connectivity-based parcellation on a meta-analytically defined volume of interest (VOI), regional coactivation patterns within this VOI allowed identifying distinct subregions. Across parcellation solutions, two clusters emerged consistently in rostro-ventral and caudo-ventral aspects of the parietal VOI. Both clusters were functionally significantly associated with social-cognitive and language processing. In particular, the rostro-ventral cluster was associated with lower-level processing facets, while the caudo-ventral cluster was associated with higher-level processing facets in both mental capacities. Contrarily, in the (less stable) dorsal parietal VOI, all clusters reflected computation of general-purpose processes, such as working memory and matching tasks, that are frequently co-recruited by social or language processes. Our results hence favour a rostro-caudal distinction of lower- versus higher-level processes underlying social cognition and language in the left inferior parietal lobe. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Biocharts: a visual formalism for complex biological systems

    PubMed Central

    Kugler, Hillel; Larjo, Antti; Harel, David

    2010-01-01

    We address one of the central issues in devising languages, methods and tools for the modelling and analysis of complex biological systems, that of linking high-level (e.g. intercellular) information with lower-level (e.g. intracellular) information. Adequate ways of dealing with this issue are crucial for understanding biological networks and pathways, which typically contain huge amounts of data that continue to grow as our knowledge and understanding of a system increases. Trying to comprehend such data using the standard methods currently in use is often virtually impossible. We propose a two-tier compound visual language, which we call Biocharts, that is geared towards building fully executable models of biological systems. One of the main goals of our approach is to enable biologists to actively participate in the computational modelling effort, in a natural way. The high-level part of our language is a version of statecharts, which have been shown to be extremely successful in software and systems engineering. The statecharts can be combined with any appropriately well-defined language (preferably a diagrammatic one) for specifying the low-level dynamics of the pathways and networks. We illustrate the language and our general modelling approach using the well-studied process of bacterial chemotaxis. PMID:20022895

  1. The Julia programming language: the future of scientific computing

    NASA Astrophysics Data System (ADS)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.

  2. Exploiting current-generation graphics hardware for synthetic-scene generation

    NASA Astrophysics Data System (ADS)

    Tanner, Michael A.; Keen, Wayne A.

    2010-04-01

    Increasing seeker frame rate and pixel count, as well as the demand for higher levels of scene fidelity, have driven scene generation software for hardware-in-the-loop (HWIL) and software-in-the-loop (SWIL) testing to higher levels of parallelization. Because modern PC graphics cards provide multiple computational cores (240 shader cores for current NVIDIA Corporation GeForce and Quadro cards), implementation of phenomenology codes on graphics processing units (GPUs) offers significant potential for simultaneous enhancement of simulation frame rate and fidelity. To take advantage of this potential requires algorithm implementation that is structured to minimize data transfers between the central processing unit (CPU) and the GPU. In this paper, preliminary methodologies developed at the Kinetic Hardware In-The-Loop Simulator (KHILS) will be presented. Included in this paper will be various language tradeoffs between conventional shader programming, Compute Unified Device Architecture (CUDA) and Open Computing Language (OpenCL), including performance trades and possible pathways for future tool development.

  3. ASTEC and MODEL: Controls software development at Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.

    1993-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at the Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC has been under development for the last three years and is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN. An upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 90's and how it relates to ASTEC.

  4. Contracting for Computer Software in Standardized Computer Languages

    PubMed Central

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  5. QuantumOptics.jl: A Julia framework for simulating open quantum systems

    NASA Astrophysics Data System (ADS)

    Krämer, Sebastian; Plankensteiner, David; Ostermann, Laurin; Ritsch, Helmut

    2018-06-01

    We present an open source computational framework geared towards the efficient numerical investigation of open quantum systems written in the Julia programming language. Built exclusively in Julia and based on standard quantum optics notation, the toolbox offers speed comparable to low-level statically typed languages, without compromising on the accessibility and code readability found in dynamic languages. After introducing the framework, we highlight its features and showcase implementations of generic quantum models. Finally, we compare its usability and performance to two well-established and widely used numerical quantum libraries.

  6. Computer architecture evaluation for structural dynamics computations: Project summary

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1989-01-01

    The intent of the proposed effort is the examination of the impact of the elements of parallel architectures on the performance realized in a parallel computation. To this end, three major projects are developed: a language for the expression of high level parallelism, a statistical technique for the synthesis of multicomputer interconnection networks based upon performance prediction, and a queueing model for the analysis of shared memory hierarchies.

  7. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  8. Performance of the Heavy Flavor Tracker (HFT) detector in star experiment at RHIC

    NASA Astrophysics Data System (ADS)

    Alruwaili, Manal

    With advancing technology, processor counts are becoming massive, and current supercomputer-level processing will be available on desktops within the next decade. For mass-scale application software development on such massively parallel desktop hardware, existing popular languages with large libraries must be augmented with new constructs and paradigms that exploit massive parallelism and distributed memory models while retaining user-friendliness. Currently available object-oriented languages for massively parallel computing, such as Chapel, X10, and UPC++, exploit distributed computing, data-parallel computing, and process-level thread parallelism in the PGAS (Partitioned Global Address Space) memory model. However, they do not provide: 1) extensions for object distribution that exploit the PGAS model; 2) the flexibility to migrate or clone an object between places for load balancing; or 3) programming paradigms that result from integrating data- and thread-level parallelism with object distribution. In the proposed thesis, I compare different PGAS-model languages; propose new constructs that extend C++ with object distribution, object migration, and object cloning; and integrate PGAS-based process constructs with these extensions on distributed objects. A new paradigm, MIDD (Multiple Invocation Distributed Data), is also presented, in which different copies of the same class can be invoked and work concurrently on different elements of distributed data using remote method invocations. I present the new constructs, their grammar, and their behavior, and explain them using simple programs that utilize these constructs.
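
    As a rough, hedged illustration of the MIDD idea described above (invoking copies of the same class concurrently on different partitions of distributed data), the following Python sketch uses the standard multiprocessing module; it is not Chapel, X10, UPC++, or the proposed C++ extensions, and all names are invented.

        # Rough stand-in for MIDD: each worker process builds its own copy of the
        # class and invokes the same method on a different partition of the data.
        from multiprocessing import Pool

        class Smoother:
            def __init__(self, weight):
                self.weight = weight

            def run(self, chunk):
                return [self.weight * v for v in chunk]

        def invoke(args):
            chunk, weight = args
            return Smoother(weight).run(chunk)     # same method, different data

        if __name__ == "__main__":
            data = list(range(16))
            partitions = [data[i::4] for i in range(4)]   # distribute the data
            with Pool(4) as pool:
                results = pool.map(invoke, [(p, 0.5) for p in partitions])
            print(results)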

  9. A pattern-based analysis of clinical computer-interpretable guideline modeling languages.

    PubMed

    Mulyar, Nataliya; van der Aalst, Wil M P; Peleg, Mor

    2007-01-01

    Languages used to specify computer-interpretable guidelines (CIGs) differ in their approaches to addressing particular modeling challenges. The main goals of this article are: (1) to examine the expressive power of CIG modeling languages, and (2) to define the differences, from the control-flow perspective, between process languages in workflow management systems and modeling languages used to design clinical guidelines. The pattern-based analysis was applied to the guideline modeling languages Asbru, EON, GLIF, and PROforma. We focused on control-flow and left other perspectives out of consideration. We evaluated the selected CIG modeling languages and identified their degree of support for 43 control-flow patterns. We used a set of explicitly defined evaluation criteria to determine whether each pattern is supported directly, indirectly, or not at all. PROforma offers direct support for 22 of 43 patterns, Asbru 20, GLIF 17, and EON 11. All four directly support basic control-flow patterns, cancellation patterns, and some advanced branching and synchronization patterns. None support multiple-instance patterns. They offer varying levels of support for synchronizing merge patterns and state-based patterns. Some support a few scenarios not covered by the 43 control-flow patterns. CIG modeling languages are remarkably close to traditional workflow languages from the control-flow perspective, but cover many fewer workflow patterns. CIG languages offer some flexibility that supports modeling of complex decisions and provide ways for modeling some decisions not covered by workflow management systems. Workflow management systems may be suitable for clinical guideline applications.

  10. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.

  11. Equation-based languages – A new paradigm for building energy modeling, simulation and optimization

    DOE PAGES

    Wetter, Michael; Bonvini, Marco; Nouidui, Thierry S.

    2016-04-01

    Most of the state-of-the-art building simulation programs implement models in imperative programming languages. This complicates modeling and excludes the use of certain efficient methods for simulation and optimization. In contrast, equation-based modeling languages declare relations among variables, thereby allowing the use of computer algebra to enable much simpler schematic modeling and to generate efficient code for simulation and optimization. We contrast the two approaches in this paper. We explain how such manipulations support new use cases. In the first of two examples, we couple models of the electrical grid, multiple buildings, HVAC systems and controllers to test a controller that adjusts building room temperatures and PV inverter reactive power to maintain power quality. In the second example, we contrast the computing time for solving an optimal control problem for a room-level model predictive controller with and without symbolic manipulations. As a result, exploiting the equation-based language led to a 2,200 times faster solution.
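
    A small, hedged illustration of the declarative/imperative contrast follows; SymPy stands in for the computer algebra step (it is not one of the building-simulation tools discussed in the paper), and the heat-balance relation is an invented example.

        # Equation-based style: declare the relation once and let computer algebra
        # rearrange it for whichever unknown is needed.
        import sympy as sp

        Q, UA, T_room, T_out = sp.symbols("Q UA T_room T_out")
        heat_balance = sp.Eq(Q, UA * (T_room - T_out))
        print(sp.solve(heat_balance, T_room)[0])   # -> Q/UA + T_out

        # Imperative style: the direction of computation is fixed at coding time.
        def heat_loss(UA_val, T_room_val, T_out_val):
            return UA_val * (T_room_val - T_out_val)

        print(heat_loss(50.0, 21.0, 5.0))          # only computes Q from the inputs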

  12. The computational structural mechanics testbed architecture. Volume 4: The global-database manager GAL-DBM

    NASA Technical Reports Server (NTRS)

    Wright, Mary A.; Regelbrugge, Marc E.; Felippa, Carlos A.

    1989-01-01

    This is the fourth of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 4 describes the nominal-record data management component of the NICE software. It is intended for all users.

  13. Is CALL Obsolete? Language Acquisition and Language Learning Revisited in a Digital Age

    ERIC Educational Resources Information Center

    Jarvis, Huw; Krashen, Stephen

    2014-01-01

    In this article, Huw Jarvis and Stephen Krashen ask "Is CALL Obsolete?" When the term CALL (Computer-Assisted Language Learning) was introduced in the 1960s, the language education profession knew only about language learning, not language acquisition, and assumed the computer's primary contribution to second language acquisition…

  14. Application programs written by using customizing tools of a computer-aided design system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.; Huang, R.; Juricic, D.

    1995-12-31

    Customizing tools of Computer-Aided Design systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written by using AutoCAD's customizing tools are given in some detail to illustrate their power. One tool uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.

  15. Preface to MOST-ONISW 2009

    NASA Astrophysics Data System (ADS)

    Doerr, Martin; Freitas, Fred; Guizzardi, Giancarlo; Han, Hyoil

    Ontology is a cross-disciplinary field concerned with the study of concepts and theories that can be used for representing shared conceptualizations of specific domains. Ontological Engineering is a discipline in computer and information science concerned with the development of techniques, methods, languages and tools for the systematic construction of concrete artifacts capturing these representations, i.e., models (e.g., domain ontologies) and metamodels (e.g., upper-level ontologies). In recent years, there has been a growing interest in the application of formal ontology and ontological engineering to solve modeling problems in diverse areas in computer science such as software and data engineering, knowledge representation, natural language processing, information science, among many others.

  16. MATH77, Version 4.0

    NASA Technical Reports Server (NTRS)

    Lawson, Charles L.; Krogh, Fred; Van Snyder, W.; Oken, Carol A.; Mccreary, Faith A.; Lieske, Jay H.; Perrine, Jack; Coffin, Ralph S.; Wayne, Warren J.

    1994-01-01

    MATH77 is a high-quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for basic computational processes of science and engineering. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. The MATH77 release 4.0 subroutine library is designed to be usable on any computer system supporting the full ANSI standard FORTRAN 77 language.

  17. Computer Assisted Instruction: The Game "Le Choc des Multinationales."

    ERIC Educational Resources Information Center

    Cramer, Hazel

    "Le Choc de Multinationales" is a microcomputer game for students in an upper-level commercial French couse, to be played by two opponents, one of whom may be another student or the computer itself as a direct business competitor. The game's requirements for language use and knowledge of business and economics theory and principles are moderate,…

  18. Bilingual Academic Computer and Technology Oriented Program. Project COM-TECH, 1987-1988.

    ERIC Educational Resources Information Center

    Berney, Tomi D.; Plotkin, Donna

    The Bilingual Computer and Technology Oriented Program (COM-TECH) completed the final year of a 3-year funding cycle. The project's primary goal was to provide bilingual individualized instruction, using an enrichment approach, to Spanish- and Haitian Creole/French-speaking students of varying levels of native and English second-language (ESL)…

  19. A Digital Simulation Program for Health Science Students to Follow Drug Levels in the Body

    ERIC Educational Resources Information Center

    Stavchansky, Salomon; And Others

    1977-01-01

    The Raytheon Scientific Simulation Language (RSSL) program, an easily used simulation on the CDC/6600 computer at the University of Texas at Austin, offers a simple method of solving differential equations on a digital computer. It is used by undergraduate biopharmaceutics-pharmacokinetics students and graduate students in all areas. (Author/LBH)

  20. A Peer-Assisted Learning Experience in Computer Programming Language Learning and Developing Computer Programming Skills

    ERIC Educational Resources Information Center

    Altintas, Tugba; Gunes, Ali; Sayan, Hamiyet

    2016-01-01

    Peer learning or, as commonly expressed, peer-assisted learning (PAL) involves school students who actively assist others to learn and in turn benefit from an effective learning environment. This research was designed to support students in becoming more autonomous in their learning, help them enhance their confidence level in tackling computer…

  1. Pedagogy and Processes for a Computer Programming Outreach Workshop--The Bridge to College Model

    ERIC Educational Resources Information Center

    Tangney, Brendan; Oldham, Elizabeth; Conneely, Claire; Barrett, Stephen; Lawlor, John

    2010-01-01

    This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…

  2. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novére, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.

    2017-01-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
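
    Since the specification describes an XML encoding, a minimal, non-normative sketch of a skeletal SBML Level 2 document follows, built with Python's standard library; the element names mirror the specification's general layout, but the attribute values are illustrative only, and for real work a dedicated library such as libSBML and the specification itself should be consulted.

        # Non-normative sketch of a skeletal SBML Level 2 Version 5 document;
        # attribute values are illustrative placeholders.
        import xml.etree.ElementTree as ET

        sbml = ET.Element("sbml", {
            "xmlns": "http://www.sbml.org/sbml/level2/version5",
            "level": "2", "version": "5",
        })
        model = ET.SubElement(sbml, "model", {"id": "toy_model"})

        compartments = ET.SubElement(model, "listOfCompartments")
        ET.SubElement(compartments, "compartment", {"id": "cell", "size": "1"})

        species = ET.SubElement(model, "listOfSpecies")
        ET.SubElement(species, "species",
                      {"id": "S1", "compartment": "cell", "initialAmount": "10"})

        print(ET.tostring(sbml, encoding="unicode"))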

  3. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.

  4. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  5. The Computer Integration into the EFL Instruction in Indonesia: An Analysis of Two University Instructors in Integrating Computer Technology into EFL Instruction to Encourage Students' Language Learning Engagement

    ERIC Educational Resources Information Center

    Prihatin, Pius N.

    2012-01-01

    Computer technology has been popular for teaching English as a foreign language in non-English speaking countries. This case study explored the way language instructors designed and implemented computer-based instruction so that students are engaged in English language learning. This study explored the beliefs, practices and perceptions of…

  6. CDC to CRAY FORTRAN conversion manual

    NASA Technical Reports Server (NTRS)

    Mcgary, C.; Diebert, D.

    1983-01-01

    Documentation describing software differences between two general purpose computers for scientific applications is presented. Descriptions of the use of the FORTRAN and FORTRAN 77 high level programming language on a CDC 7600 under SCOPE and a CRAY XMP under COS are offered. Itemized differences of the FORTRAN language sets of the two machines are also included. The material is accompanied by numerous examples of preferred programming techniques for the two machines.

  7. Florida Assessments for Instruction in Reading, Aligned to the Language Arts Florida Standards, FAIR-FS, Grades 3 through 12. Technical Manual

    ERIC Educational Resources Information Center

    Foorman, Barbara R.; Petscher, Yaacov; Schatschneider, Chris

    2015-01-01

    The FAIR-FS consists of computer-adaptive reading comprehension and oral language screening tasks that provide measures to track growth over time, as well as a Probability of Literacy Success (PLS) linked to grade-level performance (i.e., the 40th percentile) on the reading comprehension subtest of the Stanford Achievement Test (SAT-10) in the…

  8. Introduction to Computer Aided Instruction in the Language Laboratory.

    ERIC Educational Resources Information Center

    Hughett, Harvey L.

    The first half of this book focuses on the rationale, ideas, and information for the use of technology, including microcomputers, to improve language teaching efficiency. Topics discussed include foreign language computer assisted instruction (CAI), hardware and software selection, computer literacy, educational computing organizations, ease of…

  9. An informatics approach to integrating genetic and neurological data in speech and language neuroscience.

    PubMed

    Bohland, Jason W; Myers, Emma M; Kim, Esther

    2014-01-01

    A number of heritable disorders impair the normal development of speech and language processes and occur in large numbers within the general population. While candidate genes and loci have been identified, the gap between genotype and phenotype is vast, limiting current understanding of the biology of normal and disordered processes. This gap exists not only in our scientific knowledge, but also in our research communities, where genetics researchers and speech, language, and cognitive scientists tend to operate independently. Here we describe a web-based, domain-specific, curated database that represents information about genotype-phenotype relations specific to speech and language disorders, as well as neuroimaging results demonstrating focal brain differences in relevant patients versus controls. Bringing these two distinct data types into a common database ( http://neurospeech.org/sldb ) is a first step toward bringing molecular level information into cognitive and computational theories of speech and language function. One bridge between these data types is provided by densely sampled profiles of gene expression in the brain, such as those provided by the Allen Brain Atlases. Here we present results from exploratory analyses of human brain gene expression profiles for genes implicated in speech and language disorders, which are annotated in our database. We then discuss how such datasets can be useful in the development of computational models that bridge levels of analysis, necessary to provide a mechanistic understanding of heritable language disorders. We further describe our general approach to information integration, discuss important caveats and considerations, and offer a specific but speculative example based on genes implicated in stuttering and basal ganglia function in speech motor control.

  10. Electronic Circuit Analysis Language (ECAL)

    NASA Astrophysics Data System (ADS)

    Chenghang, C.

    1983-03-01

    The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. The Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language that uses FORTRAN to carry out interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
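
    As a hedged sketch of what the dc-analysis step amounts to, the following NumPy fragment solves the nodal equations of a small resistive circuit; the circuit and its values are invented, and this is not ECAL itself.

        # dc analysis by nodal analysis: solve G @ v = I for the node voltages.
        # The two-node resistive circuit below is invented for illustration.
        import numpy as np

        G1, G2, G3 = 1 / 100.0, 1 / 200.0, 1 / 300.0   # branch conductances (S)
        I_src = 0.01                                    # 10 mA injected into node 1

        G = np.array([[G1 + G2, -G2],
                      [-G2,      G2 + G3]])             # nodal conductance matrix
        I = np.array([I_src, 0.0])                      # source vector

        v = np.linalg.solve(G, I)                       # dc node voltages
        print("node voltages (V):", v)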

  11. Computer-Mediated Glosses in Second Language Reading Comprehension and Vocabulary Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Abraham, Lee B.

    2008-01-01

    Language learners have unprecedented opportunities for developing second language literacy skills and intercultural understanding by reading authentic texts on the Internet and in multimedia computer-assisted language learning environments. This article presents findings from a meta-analysis of 11 studies of computer-mediated glosses in second…

  12. Pedagogy and Related Criteria: The Selection of Software for Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Samuels, Jeffrey D.

    2013-01-01

    Computer-Assisted Language Learning (CALL) is an established field of academic inquiry with distinct applications for second language teaching and learning. Many CALL professionals direct language labs or language resource centers (LRCs) in which CALL software applications and generic software applications support language learning programs and…

  13. Learning Strategies and Motivation among Procrastinators of Various English Proficiency Levels

    ERIC Educational Resources Information Center

    Goda, Yoshiko; Yamada, Masanori; Matsuda, Takeshi; Kato, Hiroshi; Saito, Yutaka; Miyagawa, Hiroyuki

    2014-01-01

    Our research project focuses on learning strategies and motivation among academic procrastinators in computer assisted language learning (CALL) settings. In this study, we aim to compare them according to students' levels of English proficiency. One hundred and fourteen university students participated in this research project. Sixty-four students…

  14. A visual programming environment for the Navier-Stokes computer

    NASA Technical Reports Server (NTRS)

    Tomboulian, Sherryl; Crockett, Thomas W.; Middleton, David

    1988-01-01

    The Navier-Stokes computer is a high-performance, reconfigurable, pipelined machine designed to solve large computational fluid dynamics problems. Due to the complexity of the architecture, development of effective, high-level language compilers for the system appears to be a very difficult task. Consequently, a visual programming methodology has been developed which allows users to program the system at an architectural level by constructing diagrams of the pipeline configuration. These schematic program representations can then be checked for validity and automatically translated into machine code. The visual environment is illustrated by using a prototype graphical editor to program an example problem.

  15. Computer Language Settings and Canadian Spellings

    ERIC Educational Resources Information Center

    Shuttleworth, Roger

    2011-01-01

    The language settings used on personal computers interact with the spell-checker in Microsoft Word, which directly affects the flagging of spellings that are deemed incorrect. This study examined the language settings of personal computers owned by a group of Canadian university students. Of 21 computers examined, only eight had their Windows…

  16. Thai Language Sentence Similarity Computation Based on Syntactic Structure and Semantic Vector

    NASA Astrophysics Data System (ADS)

    Wang, Hongbin; Feng, Yinhan; Cheng, Liang

    2018-03-01

    Sentence similarity computation plays an increasingly important role in text mining, Web page retrieval, machine translation, speech recognition and question answering systems. Thai is a resource-scarce language; unlike Chinese, it lacks resources such as HowNet and CiLin, so research on Thai sentence similarity faces particular challenges. To address this problem, this paper proposes a novel method to compute the similarity of Thai sentences based on syntactic structure and semantic vectors. The method first uses Part-of-Speech (POS) dependencies to calculate the syntactic structure similarity of two sentences, and then uses word vectors to calculate their semantic similarity. Finally, the two measures are combined to compute the overall similarity of two Thai sentences. The proposed method considers not only semantics but also sentence syntactic structure. The experimental results show that the method is feasible for Thai sentence similarity computation.
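
    A hedged Python sketch of the general recipe follows: a syntactic-structure score is combined with a semantic score from word vectors. The POS-pair overlap proxy, the random stand-in vectors, and the 0.4/0.6 weighting are assumptions for illustration, not the paper's actual formulas.

        # Combine syntactic-structure similarity with word-vector semantic similarity.
        import numpy as np

        def syntactic_similarity(deps_a, deps_b):
            # Crude proxy: Jaccard overlap of (head POS, dependent POS) pairs.
            a, b = set(deps_a), set(deps_b)
            return len(a & b) / len(a | b) if a | b else 0.0

        def semantic_similarity(tokens_a, tokens_b, vectors):
            va = np.mean([vectors[t] for t in tokens_a], axis=0)
            vb = np.mean([vectors[t] for t in tokens_b], axis=0)
            return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

        def sentence_similarity(sent_a, sent_b, vectors, alpha=0.4):
            syn = syntactic_similarity(sent_a["deps"], sent_b["deps"])
            sem = semantic_similarity(sent_a["tokens"], sent_b["tokens"], vectors)
            return alpha * syn + (1 - alpha) * sem       # weighted combination

        rng = np.random.default_rng(0)
        vectors = {w: rng.normal(size=50) for w in ["A", "B", "C"]}
        s1 = {"tokens": ["A", "B"], "deps": [("VERB", "NOUN")]}
        s2 = {"tokens": ["A", "C"], "deps": [("VERB", "NOUN"), ("NOUN", "ADJ")]}
        print(sentence_similarity(s1, s2, vectors))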

  17. Errors and Intelligence in Computer-Assisted Language Learning: Parsers and Pedagogues. Routledge Studies in Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Heift, Trude; Schulze, Mathias

    2012-01-01

    This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…

  18. First stage identification of syntactic elements in an extra-terrestrial signal

    NASA Astrophysics Data System (ADS)

    Elliott, John

    2011-02-01

    By investigating the generic attributes of a representative set of terrestrial languages at varying levels of abstraction, it is our endeavour to try and isolate elements of the signal universe, which are computationally tractable for its detection and structural decipherment. Ultimately, our aim is to contribute in some way to the understanding of what 'languageness' actually is. This paper describes algorithms and software developed to characterise and detect generic intelligent language-like features in an input signal, using natural language learning techniques: looking for characteristic statistical "language-signatures" in test corpora. As a first step towards such species-independent language-detection, we present a suite of programs to analyse digital representations of a range of data, and use the results to extrapolate whether or not there are language-like structures which distinguish this data from other sources, such as music, images, and white noise.

  19. An Evaluation Framework and Comparative Analysis of the Widely Used First Programming Languages

    PubMed Central

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is at the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such a language is referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework and computed their suitability scores. PMID:24586449
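
    A hedged sketch of the kind of customizable scoring function described above follows; the criteria, weights, and ratings are invented placeholders, not the framework or data from the article.

        # Customizable suitability score: a weighted average over rated criteria.
        def suitability_score(ratings, weights):
            total = sum(weights.values())
            return sum(weights[c] * ratings.get(c, 0.0) for c in weights) / total

        weights = {"readability": 3, "simple_io": 2, "error_messages": 2, "tooling": 1}
        candidates = {
            "LanguageA": {"readability": 0.9, "simple_io": 0.8,
                          "error_messages": 0.6, "tooling": 0.7},
            "LanguageB": {"readability": 0.6, "simple_io": 0.7,
                          "error_messages": 0.8, "tooling": 0.9},
        }
        for name, ratings in candidates.items():
            print(name, round(suitability_score(ratings, weights), 3))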

  20. An evaluation framework and comparative analysis of the widely used first programming languages.

    PubMed

    Farooq, Muhammad Shoaib; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed; Abid, Adnan

    2014-01-01

    Computer programming is at the core of the computer science curriculum. Several programming languages have been used to teach the first course in computer programming, and such a language is referred to as the first programming language (FPL). The pool of programming languages has been evolving with the development of new languages, and from this pool different languages have been used as the FPL at different times. Though the selection of an appropriate FPL is very important, it has been a controversial issue in the presence of many choices. Many efforts have been made to design a good FPL; however, there is no adequate way to evaluate and compare the existing languages so as to find the most suitable FPL. In this article, we have proposed a framework to evaluate existing imperative and object-oriented languages for their suitability as an appropriate FPL. Furthermore, based on the proposed framework, we have devised a customizable scoring function to compute a quantitative suitability score for a language, which reflects its conformance to the proposed framework. Lastly, we have also evaluated the conformance of the widely used FPLs to the proposed framework and computed their suitability scores.

  1. Linguistic transfer in bilingual children with specific language impairment.

    PubMed

    Verhoeven, Ludo; Steenge, Judit; van Balkom, Hans

    2012-01-01

    In the literature so far the limited research on specific language impairment (SLI) in bilingual children has concentrated on linguistic skills in the first language (L1) and/or the second language (L2) without paying attention to the relations between the two types of skills and to the issue of linguistic transfer. To examine the first and second language proficiency of 75 Turkish-Dutch bilingual children with SLI in the age range between 7 and 11 years living in the Netherlands. A multidimensional perspective on language proficiency was taken in order to assess children's Turkish and Dutch proficiency levels, whereas equivalent tests were used in order to determine language dominance. A second aim was to find out to what extent the children's proficiency in L2 can be predicted from their L1 proficiency, while taking into account their general cognitive abilities. The children's performance on a battery of equivalent language ability tests in Turkish and Dutch was compared at three age levels. By means of analyses of variance, it was explored to what extent the factors of language and grade level as well as their interactions were significant. Bivariate correlations and partial correlations with age level partialled out were computed to examine the relationships between L1 and L2 proficiency levels. Moreover, regression analysis was conducted to find out to what extent the variance in general L2 proficiency levels could be explained by children's L1 proficiency, short-term memory and non-verbal intelligence. Repeated measures analyses showed that the children had generally higher scores on L1 as compared with L2 and that with progression of age the children's scores in L1 and L2 improved. Medium to high correlations were found between phonological memory, phonological awareness, grammatical skills and story comprehension in the two languages. Regression analysis revealed that children's L2 proficiency levels could be explained by their proficiency levels in L1, even after controlling for children's non-verbal intelligence and working memory. It is concluded that children's formal linguistic skills in L1 and L2 tend to be related and that their level of L1 proficiency may help to develop linguistic skills in L2. © 2011 Royal College of Speech & Language Therapists.

  2. The effect of topiramate plasma concentration on linguistic behavior, verbal recall and working memory.

    PubMed

    Marino, S E; Pakhomov, S V S; Han, S; Anderson, K L; Ding, M; Eberly, L E; Loring, D W; Hawkins-Taylor, C; Rarick, J O; Leppik, I E; Cibula, J E; Birnbaum, A K

    2012-07-01

    This is the first study of the effect of topiramate on linguistic behavior and verbal recall using a computational linguistics system for automated language and speech analysis to detect and quantify drug-induced changes in speech recorded during discourse-level tasks. Healthy volunteers were administered a single, 100-mg oral dose of topiramate in two double-blind, randomized, placebo-controlled, crossover studies. Subjects' topiramate plasma levels ranged from 0.23 to 2.81 μg/mL. We found a significant association between topiramate levels and impairment on measures of verbal fluency elicited during a picture description task, correct number of words recalled on a paragraph recall test, and reaction time recorded during a working memory task. Using the tools of clinical pharmacology and computational linguistics, we elucidated the relationship between the determinants of a drug's disposition as reflected in plasma concentrations and their impact on cognitive functioning as reflected in spoken language discourse. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Linguistic Analysis of Natural Language Communication with Computers.

    ERIC Educational Resources Information Center

    Thompson, Bozena Henisz

    Interaction with computers in natural language requires a language that is flexible and suited to the task. This study of natural dialogue was undertaken to reveal those characteristics which can make computer English more natural. Experiments were made in three modes of communication: face-to-face, terminal-to-terminal, and human-to-computer,…

  4. Automatic generation of the index of productive syntax for child language transcripts.

    PubMed

    Hassanali, Khairun-nisa; Liu, Yang; Iglesias, Aquiles; Solorio, Thamar; Dollaghan, Christine

    2014-03-01

    The index of productive syntax (IPSyn; Scarborough, Applied Psycholinguistics 11:1-22, 1990) is a measure of syntactic development in child language that has been used in research and clinical settings to investigate the grammatical development of various groups of children. However, IPSyn is mostly calculated manually, which is an extremely laborious process. In this article, we describe the AC-IPSyn system, which automatically calculates the IPSyn score for child language transcripts using natural language processing techniques. Our results show that the AC-IPSyn system performs at levels comparable to scores computed manually. The AC-IPSyn system can be downloaded from www.hlt.utdallas.edu/~nisa/ipsyn.html.

  5. A primer in macromolecular linguistics.

    PubMed

    Searls, David B

    2013-03-01

    Polymeric macromolecules, when viewed abstractly as strings of symbols, can be treated in terms of formal language theory, providing a mathematical foundation for characterizing such strings both as collections and in terms of their individual structures. In addition this approach offers a framework for analysis of macromolecules by tools and conventions widely used in computational linguistics. This article introduces the ways that linguistics can be and has been applied to molecular biology, covering the relevant formal language theory at a relatively nontechnical level. Analogies between macromolecules and human natural language are used to provide intuitive insights into the relevance of grammars, parsing, and analysis of language complexity to biology. Copyright © 2012 Wiley Periodicals, Inc.

  6. Frequency of Educational Computer Use as a Longitudinal Predictor of Educational Outcome in Young People with Specific Language Impairment

    PubMed Central

    Durkin, Kevin; Conti-Ramsden, Gina

    2012-01-01

    Computer use draws on linguistic abilities. Using this medium thus presents challenges for young people with Specific Language Impairment (SLI) and raises questions of whether computer-based tasks are appropriate for them. We consider theoretical arguments predicting impaired performance and negative outcomes relative to peers without SLI versus the possibility of positive gains. We examine the relationship between frequency of computer use (for leisure and educational purposes) and educational achievement; in particular examination performance at the end of compulsory education and level of educational progress two years later. Participants were 49 young people with SLI and 56 typically developing (TD) young people. At around age 17, the two groups did not differ in frequency of educational computer use or leisure computer use. There were no associations between computer use and educational outcomes in the TD group. In the SLI group, after PIQ was controlled for, educational computer use at around 17 years of age contributed substantially to the prediction of educational progress at 19 years. The findings suggest that educational uses of computers are conducive to educational progress in young people with SLI. PMID:23300610

  7. High-performance analysis of filtered semantic graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buluc, Aydin; Fox, Armando; Gilbert, John R.

    2012-01-01

    High performance is a crucial consideration when executing a complex analytic query on a massive semantic graph. In a semantic graph, vertices and edges carry "attributes" of various types. Analytic queries on semantic graphs typically depend on the values of these attributes; thus, the computation must either view the graph through a filter that passes only those individual vertices and edges of interest, or else must first materialize a subgraph or subgraphs consisting of only the vertices and edges of interest. The filtered approach is superior due to its generality, ease of use, and memory efficiency, but may carry a performance cost. In the Knowledge Discovery Toolbox (KDT), a Python library for parallel graph computations, the user writes filters in a high-level language, but those filters result in relatively low performance due to the bottleneck of having to call into the Python interpreter for each edge. In this work, we use the Selective Embedded JIT Specialization (SEJITS) approach to automatically translate filters defined by programmers into a lower-level efficiency language, bypassing the upcall into Python. We evaluate our approach by comparing it with the high-performance C++/MPI Combinatorial BLAS engine, and show that the productivity gained by using a high-level filtering language comes without sacrificing performance.
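
    The following plain-Python sketch (not the KDT API) illustrates the filtered-traversal idea: a user-written predicate over edge attributes is consulted for every edge during a breadth-first search, which is exactly the per-edge upcall that the SEJITS approach moves out of the interpreter. The graph and attribute names are invented.

        # Edge-filtered BFS over a tiny attributed graph; the per-edge call to
        # edge_filter is the interpreter bottleneck discussed above.
        from collections import deque

        graph = {  # node -> list of (neighbor, attributes)
            0: [(1, {"type": "cites", "weight": 0.9}), (2, {"type": "spam", "weight": 0.1})],
            1: [(3, {"type": "cites", "weight": 0.7})],
            2: [], 3: [],
        }

        def edge_filter(attrs):
            return attrs["type"] == "cites" and attrs["weight"] > 0.5

        def filtered_bfs(graph, source, keep_edge):
            seen, order, queue = {source}, [source], deque([source])
            while queue:
                u = queue.popleft()
                for v, attrs in graph[u]:
                    if keep_edge(attrs) and v not in seen:   # per-edge predicate call
                        seen.add(v)
                        order.append(v)
                        queue.append(v)
            return order

        print(filtered_bfs(graph, 0, edge_filter))   # -> [0, 1, 3]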

  8. User's Guide to "MULE"; McGill University Language for Education. A Computer-Assisted Instruction Author Language.

    ERIC Educational Resources Information Center

    Roid, Gale H.

    A computer-assisted instruction (CAI) author language and operating system is available for use by McGill instructors on the university's IBM 360/65 RAX Time-Sharing System. Instructors can use this system to prepare lessons which allow the computer and a student to "converse" in natural language. The instructor prepares a lesson by…

  9. Author Languages, Authoring Systems, and Their Relation to the Changing Focus of Computer-Aided Language Learning.

    ERIC Educational Resources Information Center

    Sussex, Roland

    1991-01-01

    Considers how the effectiveness of computer-assisted language learning (CALL) has been hampered by language teachers who lack programing and software engineering expertise, and explores the limitations and potential contributions of author languages, programs, and environments in increasing the range of options for language teachers who are not…

  10. qtcm 0.1.2: A Python Implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2008-10-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated in interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.
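
    A hedged sketch of the run-time flexibility described above follows (this is not qtcm's actual interface): holding the model's subroutines in an ordinary Python list lets the order and choice of steps be changed between calls without recompiling anything.

        # Toy model whose step sequence can be reordered or swapped at run time.
        class ToyModel:
            def __init__(self, steps):
                self.steps = list(steps)            # editable at run time
                self.state = {"t": 0.0, "T": 300.0}

            def advection(self):
                self.state["T"] -= 0.1

            def radiation(self):
                self.state["T"] += 0.05

            def advance(self, nsteps, dt=1.0):
                for _ in range(nsteps):
                    for name in self.steps:         # user-chosen sequence each step
                        getattr(self, name)()
                    self.state["t"] += dt

        model = ToyModel(steps=["advection", "radiation"])
        model.advance(10)
        model.steps.reverse()                        # change execution order mid-run
        model.advance(10)
        print(model.state)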

  11. qtcm 0.1.2: a Python implementation of the Neelin-Zeng Quasi-Equilibrium Tropical Circulation Model

    NASA Astrophysics Data System (ADS)

    Lin, J. W.-B.

    2009-02-01

    Historically, climate models have been developed incrementally and in compiled languages like Fortran. While the use of legacy compiled languages results in fast, time-tested code, the resulting model is limited in its modularity and cannot take advantage of functionality available with modern computer languages. Here we describe an effort at using the open-source, object-oriented language Python to create more flexible climate models: the package qtcm, a Python implementation of the intermediate-level Neelin-Zeng Quasi-Equilibrium Tropical Circulation model (QTCM1) of the atmosphere. The qtcm package retains the core numerics of QTCM1, written in Fortran to optimize model performance, but uses Python structures and utilities to wrap the QTCM1 Fortran routines and manage model execution. The resulting "mixed language" modeling package allows order and choice of subroutine execution to be altered at run time, and model analysis and visualization to be integrated in interactively with model execution at run time. This flexibility facilitates more complex scientific analysis using less complex code than would be possible using traditional languages alone, and provides tools to transform the traditional "formulate hypothesis → write and test code → run model → analyze results" sequence into a feedback loop that can be executed automatically by the computer.

  12. Computerized Writing and Reading Instruction for Students in Grades 4 to 9 With Specific Learning Disabilities Affecting Written Language

    PubMed Central

    Tanimoto, Steven; Thompson, Rob; Berninger, Virginia W.; Nagy, William; Abbott, Robert D.

    2015-01-01

    Computer scientists and educational researchers evaluated the effectiveness of computerized instruction tailored to evidence-based impairments in specific learning disabilities (SLDs) in students in grades 4 to 9 with persisting SLDs despite prior extra help. Following comprehensive, evidence-based differential diagnosis for dysgraphia (impaired handwriting), dyslexia (impaired word reading and spelling), and oral and written language learning disability (OWL LD), students completed 18 sessions of computerized instruction over about 3 months. The 11 students taught letter formation with sequential, numbered, colored arrow cues with full contours, who wrote letters on lines added to the iPad screen, showed more and stronger treatment effects than the 21 students taught using only visual motion cues for letter formation, who wrote on an unlined computer monitor. Teaching to all levels of language in multiple functional language systems (by ear, eye, mouth, and hand) close in time resulted in significant gains in reading and writing skills for the group and in diagnosed SLD hallmark impairments for individuals; also, performance on computerized learning activities correlated with treatment gains. Results are discussed in reference to the need for both accommodations and explicit instruction for persisting SLDs and the potential for computers to teach handwriting, morphophonemic orthographies, comprehension, and composition. PMID:26858470

  13. Report on the formal specification and partial verification of the VIPER microprocessor

    NASA Technical Reports Server (NTRS)

    Brock, Bishop; Hunt, Warren A., Jr.

    1991-01-01

    The VIPER microprocessor chip is partitioned into four levels of abstraction. At the upper levels, VIPER is described by decreasingly abstract sets of functions in LCF-LSM. At the lowest level are the gate-level models in proprietary CAD languages. The block-level and gate-level specifications are also given in the ELLA simulation language. Among VIPER's deficiencies are the facts that there is no notion of external events in the top-level specification, and that it is impossible to use the top-level specifications to prove abstract properties of programs running on VIPER computers. There is no complete proof that the gate-level specifications implement the top-level specifications. Cohn's proof that the major-state machine correctly implements the top-level specifications has no formal connection with any of the other proof attempts. None of the latter address resetting the machine, memory timeout, forced error, or single-step modes.

  14. Airport Noise Prediction Model -- MOD 7

    DOT National Transportation Integrated Search

    1978-07-01

    The MOD 7 Airport Noise Prediction Model is fully operational. The language used is Fortran, and it has been run on several different computer systems. Its capabilities include prediction of noise levels for single parameter changes, for multiple cha...

  15. Python for Ecology

    EPA Science Inventory

    Python is a high-level scripting language that is becoming increasingly popular for scientific computing. This all-day workshop is designed to introduce the basics of Python programming to ecologists. Some scripting/programming experience is recommended (e.g. familiarity with R)....

  16. Wikipedia Writing as Praxis: Computer-Mediated Socialization of Second-Language Writers

    ERIC Educational Resources Information Center

    King, Brian W.

    2015-01-01

    This study explores the writing of Wikipedia articles as a form of authentic writing for learners of English in Hong Kong. Adopting "Second Language Socialization and Language Learning & Identity" approaches to language learning inquiry, it responds to an identified shortage of research on computer-mediated language socialization.…

  17. Computer-Assisted Second Language Vocabulary Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Chiu, Yi-Hui

    2013-01-01

    There is growing attention to incorporating computer-mediated instruction for language learning and teaching. Specifically, vocabulary is arguably the foundation of mastering a language, as the mastery of vocabulary is the fundamental step of learning a language. Second language (L2) vocabulary is important in the development of cognitive systems…

  18. Computer-based auditory training (CBAT): benefits for children with language- and reading-related learning difficulties.

    PubMed

    Loo, Jenny Hooi Yin; Bamiou, Doris-Eva; Campbell, Nicci; Luxon, Linda M

    2010-08-01

    This article reviews the evidence for computer-based auditory training (CBAT) in children with language, reading, and related learning difficulties, and evaluates the extent it can benefit children with auditory processing disorder (APD). Searches were confined to studies published between 2000 and 2008, and they are rated according to the level of evidence hierarchy proposed by the American Speech-Language Hearing Association (ASHA) in 2004. We identified 16 studies of two commercially available CBAT programs (13 studies of Fast ForWord (FFW) and three studies of Earobics) and five further outcome studies of other non-speech and simple speech sounds training, available for children with language, learning, and reading difficulties. The results suggest that, apart from the phonological awareness skills, the FFW and Earobics programs seem to have little effect on the language, spelling, and reading skills of children. Non-speech and simple speech sounds training may be effective in improving children's reading skills, but only if it is delivered by an audio-visual method. There is some initial evidence to suggest that CBAT may be of benefit for children with APD. Further research is necessary, however, to substantiate these preliminary findings.

  19. Predicting Development of Mathematical Word Problem Solving Across the Intermediate Grades

    PubMed Central

    Tolar, Tammy D.; Fuchs, Lynn; Cirino, Paul T.; Fuchs, Douglas; Hamlett, Carol L.; Fletcher, Jack M.

    2012-01-01

    This study addressed predictors of the development of word problem solving (WPS) across the intermediate grades. At beginning of 3rd grade, 4 cohorts of students (N = 261) were measured on computation, language, nonverbal reasoning skills, and attentive behavior and were assessed 4 times from beginning of 3rd through end of 5th grade on 2 measures of WPS at low and high levels of complexity. Language skills were related to initial performance at both levels of complexity and did not predict growth at either level. Computational skills had an effect on initial performance in low- but not high-complexity problems and did not predict growth at either level of complexity. Attentive behavior did not predict initial performance but did predict growth in low-complexity, whereas it predicted initial performance but not growth for high-complexity problems. Nonverbal reasoning predicted initial performance and growth for low-complexity WPS, but only growth for high-complexity WPS. This evidence suggests that although mathematical structure is fixed, different cognitive resources may act as limiting factors in WPS development when the WPS context is varied. PMID:23325985

  20. Turkish and English Language Teacher Candidates' Perceived Computer Self-Efficacy and Attitudes toward Computer

    ERIC Educational Resources Information Center

    Adalier, Ahmet

    2012-01-01

    The aim of this study is to reveal the relation between the Turkish and English language teacher candidates' social demographic characteristics and their perceived computer self-efficacy and attitudes toward computer. The population of the study consists of the teacher candidates in the Turkish and English language departments at the universities…

  1. A SCILAB Program for Computing Rotating Magnetic Compact Objects

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We apply the so-called "complex-plane iterative technique" (CIT) to the computation of classical differentially rotating magnetic white dwarf and neutron star models. The program has been written in SCILAB (© INRIA-ENPC), a matrix-oriented high-level programming language that can be downloaded free of charge from http://www-rocq.inria.fr/scilab. Owing to the advanced capabilities of this language, the code is short and understandable. Highlights of the program are: (a) its time-saving character, (b) ease of use due to the built-in graphical user interface, and (c) easy interfacing with Fortran via online dynamic links. We interpret our numerical results in various ways by extensively using the graphics environment of SCILAB.
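
    The abstract does not spell out the CIT itself; as a hedged flavor of the damped fixed-point iterations that such model solvers typically rely on (the update function and parameters below are hypothetical stand-ins, not the CIT), consider:

    ```python
    # Hedged, generic sketch of a damped fixed-point iteration of the kind iterative
    # model solvers use; the update function is a hypothetical stand-in, not the CIT.
    import math

    def fixed_point(update, x0, damping=0.5, tol=1e-10, max_iter=1000):
        """Iterate x <- (1-d)*x + d*update(x) until successive changes fall below tol."""
        x = x0
        for _ in range(max_iter):
            x_new = (1 - damping) * x + damping * update(x)
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        raise RuntimeError("iteration did not converge")

    # Toy example: solve x = cos(x) as a stand-in for a structure equation.
    print(fixed_point(math.cos, 1.0))   # ~0.739085
    ```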

  2. Neurolinguistics and psycholinguistics as a basis for computer acquisition of natural language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Powers, D.M.W.

    1983-04-01

    Research into natural language understanding systems for computers has concentrated on implementing particular grammars and grammatical models of the language concerned. This paper presents a rationale for research into natural language understanding systems based on neurological and psychological principles. Important features of the approach are that it seeks to place the onus of learning the language on the computer, and that it seeks to make use of the vast wealth of relevant psycholinguistic and neurolinguistic theory. 22 references.

  3. An overview of computer-based natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, or German, in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants, and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  4. Origine et developpement des industries de la langue (Origin and Development of Language Utilities). Publication K-8.

    ERIC Educational Resources Information Center

    L'Homme, Marie-Claude

    The evolution of "language utilities," a concept confined largely to the francophone world and relating to the uses of language in computer science and the use of computer science for languages, is chronicled. The language utilities are of three types: (1) tools for language development, primarily dictionary databases and related tools;…

  5. Multiprocessor architecture: Synthesis and evaluation

    NASA Technical Reports Server (NTRS)

    Standley, Hilda M.

    1990-01-01

    Multiprocessor computer architecture evaluation for structural computations is the focus of the research effort described. Results obtained are expected to lead to more efficient use of existing architectures and to suggest designs for new, application-specific architectures. The brief descriptions given outline a number of related efforts directed toward this purpose. The difficulty in analyzing an existing architecture or in designing a new computer architecture lies in the fact that the performance of a particular architecture, within the context of a given application, is determined by a number of factors. These include, but are not limited to, the efficiency of the computation algorithm, the programming language and support environment, the quality of the program written in the programming language, the multiplicity of the processing elements, the characteristics of the individual processing elements, the interconnection network connecting processors and non-local memories, and the shared memory organization, covering the spectrum from no shared memory (all local memory) to one global access memory. These performance determiners may be loosely classified as being software or hardware related. This distinction is not clear or even appropriate in many cases. The effect of the choice of algorithm is ignored by assuming that the algorithm is specified as given. Effort directed toward removing the effect of the programming language and program resulted in the design of a high-level parallel programming language. Two characteristics of the fundamental structure of the architecture (memory organization and interconnection network) are examined.

  6. Language and infant mortality in a large Canadian province.

    PubMed

    Auger, N; Bilodeau-Bertrand, M; Costopoulos, A

    2016-10-01

    Infant mortality in minority populations of Canada is poorly understood, despite evidence of ethnic inequality in other countries. We studied infant mortality in different linguistic groups of Quebec, and assessed how language and deprivation affected rates over time. We conducted a population-level study of vital statistics data for 1,985,287 live births and 10,283 infant deaths reported in Quebec from 1989 through 2012. We computed infant mortality rates for French, English, and foreign languages according to level of material deprivation. Using Kitagawa's method, we evaluated the impact of changes in mortality rates, and in the population distribution of language groups, on infant mortality in the province. Infant mortality declined from 6.05 to 4.61 per 1000 between 1989-1994 and 2007-2012. Most of the decline was driven by Francophones, who contributed 1.39 fewer deaths per 1000 births over time, and Anglophones of wealthy and middle socio-economic status, who contributed 0.13 fewer deaths per 1000 births. The foreign language population and poor Anglophones contributed more births over time, including 0.08 and 0.02 more deaths per 1000 births, respectively. Mortality decreased for Francophones and Anglophones in each level of deprivation. Rates were lower for foreign languages, but increased over time, especially for the poor. Infant mortality rates decreased for Francophones and Anglophones in Quebec, but increased for foreign languages. Poor Anglophones and individuals of foreign languages contributed more births over time, and slowed the decrease in infant mortality. Language may be useful for identifying inequality in infant mortality in multicultural nations. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
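
    For readers unfamiliar with Kitagawa's method, the sketch below decomposes a change in the crude infant mortality rate into a rate component (changes in group-specific rates) and a composition component (changes in group shares of births); the shares and rates used are hypothetical illustrations, not the study's data.

    ```python
    # Minimal sketch of Kitagawa decomposition with illustrative numbers.
    def kitagawa(shares_a, rates_a, shares_b, rates_b):
        """Split the change in a crude rate from period A to period B into a rate
        component and a composition component (they sum to the total change)."""
        rate_part = sum((rb - ra) * (sa + sb) / 2
                        for sa, ra, sb, rb in zip(shares_a, rates_a, shares_b, rates_b))
        comp_part = sum((sb - sa) * (ra + rb) / 2
                        for sa, ra, sb, rb in zip(shares_a, rates_a, shares_b, rates_b))
        return rate_part, comp_part

    # Hypothetical shares of births and infant mortality rates (per 1000) for three
    # language groups (Francophone, Anglophone, foreign language) in two periods.
    shares_1989, rates_1989 = [0.80, 0.12, 0.08], [6.4, 5.6, 4.0]
    shares_2007, rates_2007 = [0.72, 0.10, 0.18], [4.8, 4.3, 4.5]

    rate_part, comp_part = kitagawa(shares_1989, rates_1989, shares_2007, rates_2007)
    print(f"rate component: {rate_part:+.2f}, composition component: {comp_part:+.2f}")
    ```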

  7. Tensor Arithmetic, Geometric and Mathematic Principles of Fluid Mechanics in Implementation of Direct Computational Experiments

    NASA Astrophysics Data System (ADS)

    Bogdanov, Alexander; Khramushin, Vasily

    2016-02-01

    The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.

  8. Teachers' Support in Using Computers for Developing Students' Listening and Speaking Skills in Pre-Sessional English Courses

    ERIC Educational Resources Information Center

    Zou, Bin

    2013-01-01

    Many computer-assisted language learning (CALL) studies have found that teacher direction can help learners develop language skills at their own pace on computers. However, many teachers still do not know how to provide support for students to use computers to reinforce the development of their language skills. Hence, more examples of CALL…

  9. Turned on to Language Arts: Computer Literacy in the Primary Grades.

    ERIC Educational Resources Information Center

    Guthrie, Larry F.; Richardson, Susan

    1995-01-01

    Describes Apple Computer's Early Language Connections (ELC) program. Designed for K-2 grades, ELC integrates Macintosh computers, children's literature, instructional software, and other curriculum materials, including sample lessons constructed around thematic units. The literature-based product uses a whole-language approach (with phonics…

  10. User-Centered Computer Aided Language Learning

    ERIC Educational Resources Information Center

    Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.

    2006-01-01

    In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…

  11. Sketchcode: A Documentation Technique for Computer Hobbyists and Programmers

    ERIC Educational Resources Information Center

    Voros, Todd L.

    1978-01-01

    Sketchcode is a metaprogramming pseudo-language documentation technique intended to simplify the process of program writing and debugging for both high- and low-level users. Helpful hints and examples for the use of the technique are included. (CMV)

  12. A Graduate Professional Program in Translation.

    ERIC Educational Resources Information Center

    Waldinger, Renee

    1987-01-01

    The City University of New York Graduate School's professional program in translation combines high-level, specialized language learning in French, German, and Spanish with related graduate work in such disciplines as international affairs, finance, banking, jurisprudence, literature, and computer science. (CB)

  13. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
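
    The "lazy processing" idea, re-running a step only when its outputs are missing or stale, can be illustrated with a small Python analogue; this is not BigDataScript syntax, and the commands and file names below are hypothetical.

    ```python
    # Illustrative Python analogue of lazy pipeline steps: a step runs only when its
    # output is missing or older than its inputs. Commands and file names are hypothetical.
    import os
    import subprocess

    def needs_update(output, inputs):
        """True if `output` is missing or older than any of its `inputs`."""
        if not os.path.exists(output):
            return True
        out_mtime = os.path.getmtime(output)
        return any(os.path.getmtime(i) > out_mtime for i in inputs)

    def task(cmd, output, inputs):
        """Run `cmd` only when the output is stale, mimicking lazy processing."""
        if needs_update(output, inputs):
            subprocess.run(cmd, shell=True, check=True)
        else:
            print(f"skip: {output} is up to date")

    # Hypothetical two-step pipeline: align reads, then post-process the alignment.
    task("./align.sh ref.fa reads.fq > sample.sam", "sample.sam", ["ref.fa", "reads.fq"])
    task("./postprocess.sh sample.sam > sample.out", "sample.out", ["sample.sam"])
    ```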

  14. BigDataScript: a scripting language for data pipelines

    PubMed Central

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778

  15. Population Health in Pediatric Speech and Language Disorders: Available Data Sources and a Research Agenda for the Field.

    PubMed

    Raghavan, Ramesh; Camarata, Stephen; White, Karl; Barbaresi, William; Parish, Susan; Krahn, Gloria

    2018-05-17

    The aim of the study was to provide an overview of population science as applied to speech and language disorders, illustrate data sources, and advance a research agenda on the epidemiology of these conditions. Computer-aided database searches were performed to identify key national surveys and other sources of data necessary to establish the incidence, prevalence, and course and outcome of speech and language disorders. This article also summarizes a research agenda that could enhance our understanding of the epidemiology of these disorders. Although the data yielded estimates of prevalence and incidence for speech and language disorders, existing sources of data are inadequate to establish reliable rates of incidence, prevalence, and outcomes for speech and language disorders at the population level. Greater support for inclusion of speech and language disorder-relevant questions is necessary in national health surveys to build the population science in the field.

  16. Seeking Synthesis: The Integrative Problem in Understanding Language and Its Evolution.

    PubMed

    Dale, Rick; Kello, Christopher T; Schoenemann, P Thomas

    2016-04-01

    We discuss two problems for a general scientific understanding of language, sequences and synergies: how language is an intricately sequenced behavior and how language is manifested as a multidimensionally structured behavior. Though both are central in our understanding, we observe that the former tends to be studied more than the latter. We consider very general conditions that hold in human brain evolution and its computational implications, and identify multimodal and multiscale organization as two key characteristics of emerging cognitive function in our species. This suggests that human brains, and cognitive function specifically, became more adept at integrating diverse information sources and operating at multiple levels for linguistic performance. We argue that framing language evolution, learning, and use in terms of synergies suggests new research questions, and it may be a fruitful direction for new developments in theory and modeling of language as an integrated system. Copyright © 2016 Cognitive Science Society, Inc.

  17. Reduction of Flow Diagrams to Unfolded Form Modulo Snarls.

    DTIC Science & Technology

    1987-04-14

    the English name of the Greek letter zeta.) 1.) An unintelligent canonical method called the "3-level crossbar/pole" representation (3cp). This... Second, it will make these pictorial representations (all of which go by the name fC. Even though this is an abuse of language, it is in the spirit...received an M.S. degree in computer and communications sciences from the University of Michigan. He is currently teaching a course on assembly language

  18. Tensoral for post-processing users and simulation authors

    NASA Technical Reports Server (NTRS)

    Dresselhaus, Eliot

    1993-01-01

    The CTR post-processing effort aims to make turbulence simulations and data more readily and usefully available to the research and industrial communities. The Tensoral language, which provides the foundation for this effort, is introduced here in the form of a user's guide. The Tensoral user's guide is presented in two main sections. Section one acts as a general introduction and guides database users who wish to post-process simulation databases. Section two gives a brief description of how database authors and other advanced users can make simulation codes and/or the databases they generate available to the user community via Tensoral database back ends. The two-part structure of this document conforms to the two-level design structure of the Tensoral language. Tensoral has been designed to be a general computer language for performing tensor calculus and statistics on numerical data. Tensoral's generality allows it to be used for stand-alone native coding of high-level post-processing tasks (as described in section one of this guide). At the same time, Tensoral's specialization to a minute task (namely, to numerical tensor calculus and statistics) allows it to be easily embedded into applications written partly in Tensoral and partly in other computer languages (here, C and Vectoral). Embedded Tensoral, aimed at advanced users for more general coding (e.g. of efficient simulations, for interfacing with pre-existing software, for visualization, etc.), is described in section two of this guide.
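
    As a hedged illustration of the kind of tensor-calculus post-processing task Tensoral targets (this is plain numpy, not Tensoral syntax, and the velocity-gradient field is synthetic), consider:

    ```python
    # Minimal numpy sketch: given a hypothetical velocity-gradient field du_i/dx_j
    # sampled on a grid, form the strain-rate tensor and a volume-averaged statistic.
    import numpy as np

    rng = np.random.default_rng(0)
    grad_u = rng.standard_normal((32, 32, 32, 3, 3))            # du_i/dx_j at each grid point

    strain = 0.5 * (grad_u + np.swapaxes(grad_u, -1, -2))       # S_ij = (du_i/dx_j + du_j/dx_i)/2
    dissipation_like = np.einsum('...ij,...ij->...', strain, strain)  # pointwise S_ij S_ij

    print("mean S_ij S_ij over the domain:", dissipation_like.mean())
    ```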

  19. The Design and Implementation of a Graphical VHDL (VHSIC Hardware Description Language) User Interface

    DTIC Science & Technology

    1988-12-01

    VHSIC Program Office appropriately summarized the motivation behind VHDL as follows: "Computer-aided engineering is a nightmare of incompatible formats and..." Computer Science Branch. Interactive VHDL Workstation: Program Status Review Report, 8 October 1987. Air Force Contract F33615-85-C-1862. Information Systems... [List-of-figures fragments: Typical Program Structure; GVUI Top-Level SADT Diagram.]

  20. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  1. Using Problem Solving to Teach a Programming Language.

    ERIC Educational Resources Information Center

    Milbrandt, George

    1995-01-01

    Computer studies courses should incorporate as many computer concepts and programming language experiences as possible. A gradual increase in problem difficulty will help the student to understand various computer concepts, and the programming language's syntax and structure. A sidebar provides two examples of how to establish a learning…

  2. Integrating Computer-Assisted Language Learning in Saudi Schools: A Change Model

    ERIC Educational Resources Information Center

    Alresheed, Saleh; Leask, Marilyn; Raiker, Andrea

    2015-01-01

    Computer-assisted language learning (CALL) technology and pedagogy have gained recognition globally for their success in supporting second language acquisition (SLA). In Saudi Arabia, the government aims to provide most educational institutions with computers and networking for integrating CALL into classrooms. However, the recognition of CALL's…

  3. Attitude Towards Computers and Classroom Management of Language School Teachers

    ERIC Educational Resources Information Center

    Jalali, Sara; Panahzade, Vahid; Firouzmand, Ali

    2014-01-01

    Computer-assisted language learning (CALL) is the realization of computers in schools and universities which has potentially enhanced the language learning experience inside the classrooms. The integration of the technologies into the classroom demands that the teachers adopt a number of classroom management procedures to maintain a more…

  4. Using Primary Language Support via Computer to Improve Reading Comprehension Skills of First-Grade English Language Learners

    ERIC Educational Resources Information Center

    Rodriguez, Cathi Draper; Filler, John; Higgins, Kyle

    2012-01-01

    Through this exploratory study the authors investigated the effects of primary language support delivered via computer on the English reading comprehension skills of English language learners. Participants were 28 First-grade students identified as Limited English Proficient. The primary language of all participants was Spanish. Students were…

  5. Can Computers Be Used for Whole Language Approaches to Reading and Language Arts?

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    Holistic approaches to the teaching of reading and writing, most notably the Whole Language movement, reject the philosophy that language skills can be taught. Instead, holistic teachers emphasize process, and they structure the students' classroom activities to be rich in language experience. Computers can be used as tools for whole language…

  6. Implementing Mobile-Assisted Language Learning (MALL) in an EFL Context: Iranian EFL Teachers' Perspectives on Challenges and Affordances

    ERIC Educational Resources Information Center

    Dashtestani, Reza

    2013-01-01

    The implementation of computer-assisted language learning (CALL) has provided tremendous opportunities for language teachers to promote their computer literacy and adopt a learner-centered approach to teaching. Accordingly, with the rising advent of language learning technologies, language teachers would occupy a fundamental role in preparing and…

  7. La linguistica, la glottodidattica e l'elaboratore elettronico: Note sull'introduzione dell'informatica nell'insegnamento delle lingue (Linguistics, Language Pedagogy, and Computers: Notes on the Introduction of Computer Science in the Teaching of Languages).

    ERIC Educational Resources Information Center

    Colmayer, Ciro

    1991-01-01

    Attempts to show that the use of computers in the classroom should not be limited to the teaching of math but that the language classroom is an even more appropriate place for the introduction and use of computers. (CFM)

  8. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
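
    As a hedged illustration of the kind of simulation these projects describe (the article does not prescribe a language; Python with numpy is assumed here), the sketch below runs one Grover iteration on two qubits and finds the marked state |11> with certainty:

    ```python
    # Two-qubit Grover search for the marked state |11>: equal superposition,
    # an oracle, and the diffusion (inversion-about-the-mean) operator.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)          # Hadamard gate
    H2 = np.kron(H, H)                                    # Hadamard on both qubits

    state = H2 @ np.array([1, 0, 0, 0], dtype=complex)    # |00> -> equal superposition

    oracle = np.diag([1, 1, 1, -1])                       # flips the sign of |11>
    diffusion = 2 * np.full((4, 4), 1/4) - np.eye(4)      # reflection about the mean

    state = diffusion @ (oracle @ state)                  # one Grover iteration

    probs = np.abs(state) ** 2
    print("P(|11>) after one iteration:", probs[3])       # equals 1 for N = 4
    ```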

  9. A Diagrammatic Language for Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Maimon, Ron

    2002-03-01

    I present a diagrammatic language for representing the structure of biochemical networks. The language is designed to represent modular structure in a computational fashion, with composition of reactions replacing functional composition. This notation is used to represent arbitrarily large networks efficiently. The notation finds its most natural use in representing biological interaction networks, but it is a general computing language appropriate to any naturally occurring computation. Unlike lambda-calculus or text-derived languages, it does not impose a tree structure on the diagrams, and so is more effective at representing biological function than competing notations.
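
    As a hedged illustration of composing reaction modules (this is a generic data-structure sketch, not the paper's diagrammatic notation; the species and reaction names are hypothetical), consider:

    ```python
    # Toy reaction-network structure: each reaction maps to (inputs, outputs);
    # sub-networks are composed by merging, with shared species linking the modules.
    network_a = {
        "kinase + substrate -> phospho_substrate":
            (["kinase", "substrate"], ["phospho_substrate"]),
    }
    network_b = {
        "phospho_substrate + phosphatase -> substrate":
            (["phospho_substrate", "phosphatase"], ["substrate"]),
    }

    def compose(*networks):
        """Merge sub-networks; shared species names connect the modules."""
        merged = {}
        for net in networks:
            merged.update(net)
        return merged

    cycle = compose(network_a, network_b)
    species = {s for ins, outs in cycle.values() for s in ins + outs}
    print(len(cycle), "reactions over", len(species), "species")
    ```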

  10. What do we mean by prediction in language comprehension?

    PubMed Central

    Kuperberg, Gina R.; Jaeger, T. Florian

    2016-01-01

    We consider several key aspects of prediction in language comprehension: its computational nature, the representational level(s) at which we predict, whether we use higher level representations to predictively pre-activate lower level representations, and whether we ‘commit’ in any way to our predictions, beyond pre-activation. We argue that the bulk of behavioral and neural evidence suggests that we predict probabilistically and at multiple levels and grains of representation. We also argue that we can, in principle, use higher level inferences to predictively pre-activate information at multiple lower representational levels. We also suggest that the degree and level of predictive pre-activation might be a function of the expected utility of prediction, which, in turn, may depend on comprehenders’ goals and their estimates of the relative reliability of their prior knowledge and the bottom-up input. Finally, we argue that all these properties of language understanding can be naturally explained and productively explored within a multi-representational hierarchical actively generative architecture whose goal is to infer the message intended by the producer, and in which predictions play a crucial role in explaining the bottom-up input. PMID:27135040

  11. Computer-Based Internet-Hosted Assessment of L2 Literacy: Computerizing and Administering of the Oxford Quick Placement Test in ExamView and Moodle

    NASA Astrophysics Data System (ADS)

    Meurant, Robert C.

    Sorting of Korean English-as-a-Foreign-Language (EFL) university students by Second Language (L2) aptitude allocates students to classes of compatible ability level, and was here used to screen candidates for interview. Paper-and-pen versions of the Oxford Quick Placement Test were adapted to computer-based testing via online hosting using FSCreations ExamView. Problems with their online hosting site led to conversion to the popular computer-based learning management system Moodle, hosted on www.ninehub.com. 317 sophomores were tested online to encourage L2 digital literacy. Strategies for effective hybrid implementation of Learning Management Systems in L2 tertiary education include computer-based Internet-hosted L2 aptitude tests. These potentially provide a convenient measure of student progress in developing L2 fluency, and offer a more objective and relevant means of teacher- and course-assessment than student evaluations, which tend to confuse entertainment value and teacher popularity with academic credibility and pedagogical effectiveness.

  12. Clinical nursing informatics. Developing tools for knowledge workers.

    PubMed

    Ozbolt, J G; Graves, J R

    1993-06-01

    Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.

  13. Using network science in the language sciences and clinic.

    PubMed

    Vitevitch, Michael S; Castro, Nichol

    2015-02-01

    A number of variables—word frequency, word length—have long been known to influence language processing. This study briefly reviews the effects in speech perception and production of two more recently examined variables: phonotactic probability and neighbourhood density. It then describes a new approach to study language, network science, which is an interdisciplinary field drawing from mathematics, computer science, physics and other disciplines. In this approach, nodes represent individual entities in a system (i.e. phonological word-forms in the lexicon), links between nodes represent relationships between nodes (i.e. phonological neighbours) and various measures enable researchers to assess the micro-level (i.e. the individual word), the macro-level (i.e. characteristics about the whole system) and the meso-level (i.e. how an individual fits into smaller sub-groups in the larger system). Although research on individual lexical characteristics such as word-frequency has increased understanding of language processing, these measures only assess the "micro-level". Using network science, researchers can examine words at various levels in the system and how each word relates to the many other words stored in the lexicon. Several new findings using the network science approach are summarized to illustrate how this approach can be used to advance basic research as well as clinical practice.
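
    As a hedged illustration of the micro-, macro-, and meso-level measures described above (networkx and the toy word list are assumptions, not the authors' materials), consider:

    ```python
    # Tiny hypothetical phonological-neighbour network: nodes are word-forms,
    # links connect words differing by one phoneme.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("cat", "bat"), ("cat", "hat"), ("cat", "cut"),
        ("bat", "hat"), ("cut", "cup"), ("dog", "dot"),
    ])

    print("micro: degree of 'cat' =", G.degree("cat"))              # one word's neighbourhood
    print("macro: average clustering =", nx.average_clustering(G))  # whole-lexicon structure
    communities = nx.algorithms.community.greedy_modularity_communities(G)
    print("meso: communities =", [sorted(c) for c in communities])  # sub-groups in the lexicon
    ```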

  14. Using network science in the language sciences and clinic

    PubMed Central

    Vitevitch, Michael S.; Castro, Nichol

    2017-01-01

    A number of variables—word frequency, word length—have long been known to influence language processing. We briefly review the effects in speech perception and production of two more recently examined variables: phonotactic probability and neighborhood density. We then describe a new approach to study language, network science, which is an interdisciplinary field drawing from mathematics, computer science, physics, and other disciplines. In this approach, nodes represent individual entities in a system (i.e., phonological word-forms in the lexicon), links between nodes represent relationships between nodes (i.e., phonological neighbors), and various measures enable researchers to assess the micro-level (i.e., the individual word), the macro-level (i.e., characteristics about the whole system), and the meso-level (i.e., how an individual fits into smaller sub-groups in the larger system). Although research on individual lexical characteristics such as word-frequency has increased our understanding of language processing, these measures only assess the “micro-level.” Using network science, researchers can examine words at various levels in the system, and how each word relates to the many other words stored in the lexicon. Several new findings using the network science approach are summarized to illustrate how this approach can be used to advance basic research as well as clinical practice. PMID:25539473

  15. Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition

    NASA Astrophysics Data System (ADS)

    Fitch, W. Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology.
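
    As a hedged illustration of the supra-regular capacity discussed above (not the paper's model), the sketch below uses an explicit stack, the programmatic analogue of a push-down store, to recognize the context-free pattern a^n b^n, something no finite-state recognizer can do:

    ```python
    # Recognize strings of the form a^n b^n (n >= 0) with an explicit stack.
    def accepts_anbn(s: str) -> bool:
        stack = []
        seen_b = False
        for ch in s:
            if ch == "a":
                if seen_b:            # an 'a' after any 'b' breaks the pattern
                    return False
                stack.append(ch)      # push for each 'a'
            elif ch == "b":
                seen_b = True
                if not stack:         # more b's than a's
                    return False
                stack.pop()           # pop for each 'b'
            else:
                return False
        return not stack              # accept only if the stack is empty

    print([accepts_anbn(s) for s in ["", "ab", "aabb", "aab", "abab"]])
    # [True, True, True, False, False]
    ```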

  16. Toward a computational framework for cognitive biology: unifying approaches from cognitive neuroscience and comparative cognition.

    PubMed

    Fitch, W Tecumseh

    2014-09-01

    Progress in understanding cognition requires a quantitative, theoretical framework, grounded in the other natural sciences and able to bridge between implementational, algorithmic and computational levels of explanation. I review recent results in neuroscience and cognitive biology that, when combined, provide key components of such an improved conceptual framework for contemporary cognitive science. Starting at the neuronal level, I first discuss the contemporary realization that single neurons are powerful tree-shaped computers, which implies a reorientation of computational models of learning and plasticity to a lower, cellular, level. I then turn to predictive systems theory (predictive coding and prediction-based learning) which provides a powerful formal framework for understanding brain function at a more global level. Although most formal models concerning predictive coding are framed in associationist terms, I argue that modern data necessitate a reinterpretation of such models in cognitive terms: as model-based predictive systems. Finally, I review the role of the theory of computation and formal language theory in the recent explosion of comparative biological research attempting to isolate and explore how different species differ in their cognitive capacities. Experiments to date strongly suggest that there is an important difference between humans and most other species, best characterized cognitively as a propensity by our species to infer tree structures from sequential data. Computationally, this capacity entails generative capacities above the regular (finite-state) level; implementationally, it requires some neural equivalent of a push-down stack. I dub this unusual human propensity "dendrophilia", and make a number of concrete suggestions about how such a system may be implemented in the human brain, about how and why it evolved, and what this implies for models of language acquisition. I conclude that, although much remains to be done, a neurally-grounded framework for theoretical cognitive science is within reach that can move beyond polarized debates and provide a more adequate theoretical future for cognitive biology. Copyright © 2014. Published by Elsevier B.V.

  17. Absorption of language concepts in the machine mind

    NASA Astrophysics Data System (ADS)

    Kollár, Ján

    2016-06-01

    In our approach, the machine mind is an applicative dynamic system represented by its algorithmically evolvable internal language; in other words, the mind and the language of mind are synonyms. Building on Shaumyan's semiotic theory of languages, we present the representation of language concepts in the machine mind as a result of our experiment, showing the non-redundancy of the language of mind. To provide a useful restriction for further research, we also introduce the hypothesis of semantic saturation in Computer-Computer communication, which indicates that a set of machines is not self-evolvable. The goal of our research is to increase the abstraction of Human-Computer and Computer-Computer communication. If we want humans and machines to communicate as a parent does with a child, using different symbols and media, we must find a language of mind commonly usable by both machines and humans. In our opinion, there exists a kind of calm language of thinking, which we propose for machines in this paper. We separate the layers of a machine mind, present the structure of the evolved mind, and discuss selected properties. We concentrate on the representation of symbolized concepts in the mind, which are languages, not just grammars, since they have meaning.

  18. Transferring data objects: A focused Ada investigation

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1988-01-01

    The use of the Ada language does not guarantee that data objects will be in the same form or have the same value after they have been stored or transferred to another system. There are too many possible variables in such things as the formats used and other protocol conditions. Differences may occur at many different levels of support, including the program level, object level, application level, and system level. A standard language is only one aspect of making a complex system completely homogeneous. Many components must be standardized, and the various standards must be integrated. The principal issues in providing for interaction between systems concern exchanging files and data objects between systems that may not be compatible in terms of their host computer, operating system, or other factors. A typical resolution of the problem of invalidating data involves at least a common external form for data objects and for representing the relationships and attributes of data collections. Some of the issues dealing with the transfer of data are listed, and consideration is given to how these issues may be handled in the Ada language.

  19. Anglicisms in the Romanian business and technology vocabulary

    NASA Astrophysics Data System (ADS)

    Todea, L.; Demarcsek, R.

    2016-08-01

    Multinational companies in Romania have imposed the use of a predominant language, in most cases English, in professional communication. In contexts related to workplace communication, the main motivation for foreign borrowings is the need to denote concepts and activities. The article focuses on the English language as a rich source of innovations, at both the lexical and the morphological level, in the Romanian vocabulary related to business and technology. The aim of the paper is to demonstrate that the Romanian language displays a natural disposition towards adopting and adapting foreign words, especially borrowed English terms, in the fields of computer science and business, without endangering its identity.

  20. A Meta-Analysis of Effectiveness Studies on Computer Technology-Supported Language Learning

    ERIC Educational Resources Information Center

    Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.

    2013-01-01

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…

  1. Computer-Assisted Language Learning: Diversity in Research and Practice

    ERIC Educational Resources Information Center

    Stockwell, Glenn, Ed.

    2012-01-01

    Computer-assisted language learning (CALL) is an approach to teaching and learning languages that uses computers and other technologies to present, reinforce, and assess material to be learned, or to create environments where teachers and learners can interact with one another and the outside world. This book provides a much-needed overview of the…

  2. SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws

    NASA Technical Reports Server (NTRS)

    Cooke, Daniel; Rushton, Nelson

    2013-01-01

    With the introduction of new parallel architectures like the cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language that is closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever. In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single-process) C or C++, and an order of magnitude less costly than development of comparable parallel code. Moreover, SequenceL not only automatically parallelizes the code, but, since it is based on CSP-NT, it is provably race free, thus eliminating the largest quality challenge the parallelized software developer faces.
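
    As a hedged, toy illustration of the Normalize-Transpose idea (this is ordinary Python, not SequenceL semantics): when an operation written for scalars receives sequences, it is applied element-wise, so the data shape, rather than explicit annotations, drives the implicit parallelism.

    ```python
    # Toy analogue: apply a scalar operation over sequences by mapping and
    # broadcasting, roughly in the spirit of Normalize-Transpose.
    def nt_apply(op, *args):
        """Apply `op`; if any argument is a list, map over it, broadcasting scalars."""
        if not any(isinstance(a, list) for a in args):
            return op(*args)
        n = max(len(a) for a in args if isinstance(a, list))
        expand = lambda a: a if isinstance(a, list) else [a] * n
        return [nt_apply(op, *row) for row in zip(*map(expand, args))]

    add = lambda x, y: x + y
    print(nt_apply(add, [1, 2, 3], 10))          # [11, 12, 13]
    print(nt_apply(add, [1, 2, 3], [4, 5, 6]))   # [5, 7, 9]
    print(nt_apply(add, [[1, 2], [3, 4]], 1))    # [[2, 3], [4, 5]]
    ```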

  3. Ideas on Learning a New Language Intertwined with the Current State of Natural Language Processing and Computational Linguistics

    ERIC Educational Resources Information Center

    Snyder, Robin M.

    2015-01-01

    In 2014, in conjunction with doing research in natural language processing and attending a global conference on computational linguistics, the author decided to learn a new foreign language, Greek, that uses a non-English character set. This paper/session will present/discuss an overview of the current state of natural language processing and…

  4. Learning Styles and Individual Differences in Learning English Idioms via Computer Assisted Language Learning in English as a Second Language.

    ERIC Educational Resources Information Center

    Viteli, Jarmo

    The purpose of this study was to determine the learning styles of English-as-a-Second-Language (ESL) students and individual differences in learning English idioms via computer assisted language learning (CALL). Thirty-six Hispanic students, 26 Japanese students, and 6 students with various language backgrounds from the Nova University Intensive…

  5. A Programming Language Environment for the Unassisted Learner.

    ERIC Educational Resources Information Center

    Thomas, P. G.; Ince, D. C.

    1982-01-01

    Describes the computing environment and command language for a new programing language called OUSBASIC which is designed to enable naive users to interact usefully, with little assistance, with a computer system. (Author/CHC)

  6. The neural circuits for arithmetic principles.

    PubMed

    Liu, Jie; Zhang, Han; Chen, Chuansheng; Chen, Hui; Cui, Jiaxin; Zhou, Xinlin

    2017-02-15

    Arithmetic principles are the regularities underlying arithmetic computation. Little is known about how the brain supports the processing of arithmetic principles. The current fMRI study examined neural activation and functional connectivity during the processing of verbalized arithmetic principles, as compared to numerical computation and general language processing. As expected, arithmetic principles elicited stronger activation in bilateral horizontal intraparietal sulcus and right supramarginal gyrus than did language processing, and stronger activation in left middle temporal lobe and left orbital part of inferior frontal gyrus than did computation. In contrast, computation elicited greater activation in bilateral horizontal intraparietal sulcus (extending to posterior superior parietal lobule) than did either arithmetic principles or language processing. Functional connectivity analysis with the psychophysiological interaction approach (PPI) showed that left temporal-parietal (MTG-HIPS) connectivity was stronger during the processing of arithmetic principle and language than during computation, whereas parietal-occipital connectivities were stronger during computation than during the processing of arithmetic principles and language. Additionally, the left fronto-parietal (orbital IFG-HIPS) connectivity was stronger during the processing of arithmetic principles than during computation. The results suggest that verbalized arithmetic principles engage a neural network that overlaps but is distinct from the networks for computation and language processing. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. A distributed computing environment with support for constraint-based task scheduling and scientific experimentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahrens, J.P.; Shapiro, L.G.; Tanimoto, S.L.

    1997-04-01

    This paper describes a computing environment which supports computer-based scientific research work. Key features include support for automatic distributed scheduling and execution and computer-based scientific experimentation. A new flexible and extensible scheduling technique that is responsive to a user's scheduling constraints, such as the ordering of program results and the specification of task assignments and processor utilization levels, is presented. An easy-to-use constraint language for specifying scheduling constraints, based on the relational database query language SQL, is described along with a search-based algorithm for fulfilling these constraints. A set of performance studies shows that the environment can schedule and execute program graphs on a network of workstations as the user requests. A method for automatically generating computer-based scientific experiments is described. Experiments provide a concise method of specifying a large collection of parameterized program executions. The environment achieved significant speedups when executing experiments; for a large collection of scientific experiments an average speedup of 3.4 on an average of 5.5 scheduled processors was obtained.

  8. Legacy model integration for enhancing hydrologic interdisciplinary research

    NASA Astrophysics Data System (ADS)

    Dozier, A.; Arabi, M.; David, O.

    2013-12-01

    Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
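
    A minimal sketch of the two-way feedback idea, assuming mpi4py as the MPI binding; the model roles, variable names, and update rules are hypothetical placeholders, not the actual SWAT coupling described above.

    ```python
    # Two model processes (a "hydrologic" and an "economic" stand-in) exchange a
    # variable each timestep over MPI. Run with: mpiexec -n 2 python feedback_demo.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    streamflow, water_price = 100.0, 1.0
    for step in range(3):
        if rank == 0:                                    # "hydrologic" process
            comm.send(streamflow, dest=1, tag=step)      # publish streamflow
            water_price = comm.recv(source=1, tag=step)  # receive price feedback
            streamflow *= 0.95 if water_price > 1.0 else 1.0
        else:                                            # "economic" process
            streamflow = comm.recv(source=0, tag=step)
            water_price = 2.0 if streamflow < 98.0 else 1.0
            comm.send(water_price, dest=0, tag=step)

    print(f"rank {rank}: streamflow={streamflow:.1f}, price={water_price:.1f}")
    ```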

  9. The BASIC Instructional Program: Conversion into MAINSAIL Language.

    ERIC Educational Resources Information Center

    Dageforde, Mary L.

    This report summarizes the rewriting of the BASIC Instructional Program (BIP) (a "hands-on laboratory" that teaches elementary programming in the BASIC language) from SAIL (a programming language available only on PDP-10 computers) into MAINSAIL (a language designed for portability on a broad class of computers). Four sections contain…

  10. Quantitative Model for Choosing Programming Language for Online Instruction

    ERIC Educational Resources Information Center

    Sherman, Steven J.; Shehane, Ronald F.; Todd, Dewey W.

    2018-01-01

    Colleges are increasingly offering online courses, including computer programming courses for business school students. Programming languages that are most useful to students are those that are widely used in the job market. However, the most popular computer languages change at least every three years. Therefore, the language used for instruction…

  11. Students' Motivation towards Computer Use in EFL Learning

    ERIC Educational Resources Information Center

    Genc, Gulten; Aydin, Selami

    2010-01-01

    It has been widely recognized that language instruction that integrates technology has become popular, and has had a tremendous impact on language learning process whereas learners are expected to be more motivated in a web-based Computer assisted language learning program, and improve their comprehensive language ability. Thus, the present paper…

  12. Floating-point function generation routines for 16-bit microcomputers

    NASA Technical Reports Server (NTRS)

    Mackin, M. A.; Soeder, J. F.

    1984-01-01

    Several computer subroutines have been developed that interpolate three types of nonanalytic functions: univariate, bivariate, and map. The routines use data in floating-point form. However, because they are written for use on a 16-bit Intel 8086 system with an 8087 mathematical coprocessor, they execute as fast as routines using data in scaled integer form. Although all of the routines are written in assembly language, they have been implemented in a modular fashion so as to facilitate their use with high-level languages.
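
    A minimal floating-point analogue of the univariate and bivariate table interpolation these routines perform, sketched here with numpy on hypothetical breakpoint tables (the original routines are 8086/8087 assembly):

    ```python
    # Univariate and bivariate table lookup with linear interpolation.
    import numpy as np

    # Univariate: y(x) from a breakpoint table.
    x_table = np.array([0.0, 1.0, 2.0, 3.0])
    y_table = np.array([0.0, 10.0, 15.0, 17.5])
    print(np.interp(1.5, x_table, y_table))          # 12.5, linear between breakpoints

    # Bivariate (map-style): z(x, y) by interpolating along rows, then between rows.
    y_rows = np.array([0.0, 1.0])
    z_table = np.array([[0.0, 10.0, 15.0, 17.5],     # z at y = 0.0
                        [5.0, 12.0, 18.0, 20.0]])    # z at y = 1.0

    def bilinear(x, y):
        z_at_rows = [np.interp(x, x_table, row) for row in z_table]
        return np.interp(y, y_rows, z_at_rows)

    print(bilinear(1.5, 0.5))                        # 13.75
    ```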

  13. The possibility of coexistence and co-development in language competition: ecology-society computational model and simulation.

    PubMed

    Yun, Jian; Shang, Song-Chao; Wei, Xiao-Dan; Liu, Shuang; Li, Zhi-Jie

    2016-01-01

    Language is characterized by both ecological properties and social properties, and competition is the basic form of language evolution. The rise and decline of one language is a result of competition between languages. Moreover, this rise and decline directly influences the diversity of human culture. Mathematical and computer modeling of language competition has been a popular topic in the fields of linguistics, mathematics, computer science, ecology, and other disciplines. Currently, there are several problems in the research on language competition modeling. First, comprehensive mathematical analysis is absent in most studies of language competition models. Next, most language competition models are based on the assumption that one language in the model is stronger than the other; these studies tend to ignore cases where there is a balance of power in the competition. The competition between two well-matched languages is more practical, because it can facilitate the co-development of two languages. A third issue with current studies is that many have an evolution result in which the weaker language inevitably goes extinct. From the integrated point of view of ecology and sociology, this paper improves the Lotka-Volterra model and basic reaction-diffusion model to propose an "ecology-society" computational model for describing language competition. Furthermore, a strict and comprehensive mathematical analysis was made for the stability of the equilibria. Two languages in competition may be either well-matched or greatly different in strength, which was reflected in the experimental design. The results revealed that language coexistence, and even co-development, are likely to occur during language competition.
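
    A minimal sketch of a two-language Lotka-Volterra competition model of the kind the paper builds on; the parameter values are illustrative assumptions, not the paper's calibrated model, and are chosen so that both languages coexist.

    ```python
    # Two speaker populations x and y competing under Lotka-Volterra dynamics,
    # integrated with simple forward Euler steps.
    r1, r2 = 0.05, 0.05          # growth rates
    K1, K2 = 1.0, 1.0            # carrying capacities (normalized)
    a12, a21 = 0.6, 0.6          # inter-language competition coefficients

    x, y = 0.2, 0.8              # initial speaker fractions
    dt, steps = 0.1, 20000
    for _ in range(steps):
        dx = r1 * x * (1 - (x + a12 * y) / K1)
        dy = r2 * y * (1 - (y + a21 * x) / K2)
        x, y = x + dt * dx, y + dt * dy

    print(f"long-run speaker fractions: x={x:.3f}, y={y:.3f}")   # both persist (~0.625 each)
    ```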

  14. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis, and is intended to be used in conjunction with the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  15. A Generalized-Compliant-Motion Primitive

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.

    1993-01-01

    This computer program, developed and installed in the control system of a telerobot, bridges the gap between planning and execution of compliant robotic motions. Called the "generalized-compliant-motion primitive," it is one of several task-execution-primitive computer programs that receive commands from higher-level task-planning programs and execute them by generating the required trajectories and applying appropriate control laws. The program comprises four parts corresponding to nominal motion, compliant motion, ending motion, and monitoring. It is written in the C language.
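
    A schematic sketch of the four-part structure described above, written in Python purely for illustration; the actual primitive is written in C, and the class, method, and command-field names below are hypothetical, not the program's real interfaces.

      # Hypothetical skeleton of a generalized-compliant-motion task primitive.
      class GeneralizedCompliantMotion:
          def __init__(self, command):
              self.command = command          # parameters from a task-planning layer

          def nominal_motion(self):
              print("generate nominal trajectory toward", self.command["goal"])

          def compliant_motion(self):
              print("apply force/position control law:", self.command["control_law"])

          def ending_motion(self):
              print("terminate when", self.command["end_condition"], "is met")

          def monitor(self):
              print("watch sensors, abort on", self.command["abort_events"])

          def execute(self):
              # The four parts run as one coordinated primitive.
              self.nominal_motion()
              self.compliant_motion()
              self.monitor()
              self.ending_motion()

      GeneralizedCompliantMotion({"goal": "peg-in-hole", "control_law": "damping",
                                  "end_condition": "contact force threshold",
                                  "abort_events": ["excessive force"]}).execute()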

  16. Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0

    NASA Technical Reports Server (NTRS)

    Perry, John; Stroud, C. W.

    1986-01-01

    A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer gives a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from Cosmic. Included in this group of procedures is an Exerciser which allows the user to exercise his HP2250 interactively. The exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP series 200 desktop computers.

  17. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  18. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  19. 25 CFR 36.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... education emphasizing literacy in language arts, mathematics, natural and physical sciences, history, and related social sciences. Bureau means the Bureau of Indian Affairs of the Department of the Interior... specified level of mastery. Computer literacy used here means the general range of skills and understanding...

  20. An IBM 370 assembly language program verifier

    NASA Technical Reports Server (NTRS)

    Maurer, W. D.

    1977-01-01

    The paper describes a program written in SNOBOL which verifies the correctness of programs written in assembly language for the IBM 360 and 370 series of computers. The motivation for using assembly language as a source language for a program verifier was the realization that many errors in programs are caused by misunderstanding or ignorance of the characteristics of specific computers. The proof of correctness of a program written in assembly language must take these characteristics into account. The program has been compiled and is currently running at the Center for Academic and Administrative Computing of The George Washington University.

  1. A phenomenographic study of the ways of understanding conditional and repetition structures in computer programming languages

    NASA Astrophysics Data System (ADS)

    Bucks, Gregory Warren

    Computers have become an integral part of how engineers complete their work, allowing them to collect and analyze data, model potential solutions, and aid in production through automation and robotics. In addition, computers are essential elements of the products themselves, from tennis shoes to construction materials. An understanding of how computers function, both at the hardware and software level, is essential for the next generation of engineers. Despite the need for engineers to develop a strong background in computing, little opportunity is given for engineering students to develop these skills. Learning to program is widely seen as a difficult task, requiring students to develop not only an understanding of specific concepts, but also a way of thinking. In addition, students are forced to learn a new tool, in the form of the programming environment employed, along with these concepts and thought processes. Because of this, many students will not develop sufficient proficiency in programming, even after progressing through the traditional introductory programming sequence. This is a significant problem, especially in the engineering disciplines, where very few students receive more than one or two semesters' worth of instruction in an already crowded engineering curriculum. To address these issues, new pedagogical techniques must be investigated in an effort to enhance the ability of engineering students to develop strong computing skills. However, these efforts are hindered by the lack of published assessment instruments available for probing an individual's understanding of programming concepts across programming languages. Traditionally, programming knowledge has been assessed by producing written code in a specific language. This can be an effective method, but it does not lend itself well to comparing the pedagogical impact of different programming environments, languages, or paradigms. This dissertation presents a phenomenographic research study exploring the different ways in which individuals understand two programming concepts: conditional structures and repetition structures. This work lays the foundation for the development of language-independent assessment instruments, which can ultimately be used to assess the pedagogical implications of various programming environments.
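
    For readers outside computing, the two concepts under study can be illustrated in a few lines of Python (the example is ours, not drawn from the dissertation): a conditional structure selects between actions, while a repetition structure performs an action over and over.

      numbers = [3, -1, 4, -1, 5]
      total = 0
      for n in numbers:          # repetition structure: visit every element
          if n >= 0:             # conditional structure: act only on some elements
              total += n
      print(total)               # prints 12, the sum of the non-negative values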

  2. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    PubMed

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front-ranging from issues of generativity to the replication of human experimental findings-by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach. Copyright © 2014 Cognitive Science Society, Inc.

  3. Second-Language Composition Instruction, Computers and First-Language Pedagogy: A Descriptive Survey.

    ERIC Educational Resources Information Center

    Harvey, T. Edward

    1987-01-01

    A national survey of full-time instructional faculty (N=208) at universities, 2-year colleges, and high schools regarding attitudes toward using computers in second-language composition instruction revealed a predomination of Apple and IBM-PC computers used, a major frustration in lack of foreign character support, and mixed opinions about real…

  4. A Proposal on the Validation Model of Equivalence between PBLT and CBLT

    ERIC Educational Resources Information Center

    Chen, Huilin

    2014-01-01

    The validity of the computer-based language test is possibly affected by three factors: computer familiarity, audio-visual cognitive competence, and other discrepancies in construct. Therefore, validating the equivalence between the paper-and-pencil language test and the computer-based language test is a key step in the procedure of designing a…

  5. New Ways of Using Computers in Language Teaching. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Boswood, Tim, Ed.

    A collection of classroom approaches and activities using computers for language learning is presented. Some require sophisticated installations, but most do not, and most use software readily available on most workplace computer systems. The activities were chosen because they use sound language learning strategies. The book is divided into five…

  6. The Role of Computer-Assisted Language Learning (CALL) in Promoting Learner Autonomy

    ERIC Educational Resources Information Center

    Mutlu, Arzu; Eroz-Tuga, Betil

    2013-01-01

    Problem Statement: Teaching a language with the help of computers and the Internet has attracted the attention of many practitioners and researchers in the last 20 years, so the number of studies that investigate whether computers and the Internet promote language learning continues to increase. These studies have focused on exploring the beliefs…

  7. Pre-Service Teachers' Uses of and Barriers from Adopting Computer-Assisted Language Learning (CALL) Programs

    ERIC Educational Resources Information Center

    Samani, Ebrahim; Baki, Roselan; Razali, Abu Bakar

    2014-01-01

    Success in implementation of computer-assisted language learning (CALL) programs depends on the teachers' understanding of the roles of CALL programs in education. Consequently, it is also important to understand the barriers teachers face in the use of computer-assisted language learning (CALL) programs. The current study was conducted on 14…

  8. Mind and Material: The Interplay between Computer-Related and Second Language Factors in Online Communication Dialogues

    ERIC Educational Resources Information Center

    Wu, Pin-hsiang Natalie; Kawamura, Michelle

    2014-01-01

    With a growing demand for learning English and a trend of utilizing computers in education, methods that can achieve the effectiveness of computer-mediated communication (CMC) to support language learning in higher education have been examined. However, second language factors manipulate both the process and production of CMC and, therefore,…

  9. Possible Pedagogical Applications of a Talking Computer Terminal for the French-Speaking Blind to Foreign Language Teaching.

    ERIC Educational Resources Information Center

    Trescases, Pierre

    A computer system developed as a database access facilitator for the blind is found to have application to foreign language instruction, specifically in teaching French to speakers of English. The computer is programmed to translate symbols from the International Phonetic Alphabet (IPA) into appropriate phonemes for whatever language is being…

  10. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    ERIC Educational Resources Information Center

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  11. Pre-Service English Language Teachers' Perceptions of Computer Self-Efficacy and General Self-Efficacy

    ERIC Educational Resources Information Center

    Topkaya, Ece Zehir

    2010-01-01

    The primary aim of this study is to investigate pre-service English language teachers' perceptions of computer self-efficacy in relation to different variables. Secondarily, the study also explores the relationship between pre-service English language teachers' perceptions of computer self-efficacy and their perceptions of general self-efficacy.…

  12. Design and Delivery of Multiple Server-Side Computer Languages Course

    ERIC Educational Resources Information Center

    Wang, Shouhong; Wang, Hai

    2011-01-01

    Given the emergence of service-oriented architecture, IS students need to be knowledgeable of multiple server-side computer programming languages to be able to meet the needs of the job market. This paper outlines the pedagogy of an innovative course of multiple server-side computer languages for the undergraduate IS majors. The paper discusses…

  13. Computational Workbench for Multibody Dynamics

    NASA Technical Reports Server (NTRS)

    Edmonds, Karina

    2007-01-01

    PyCraft is a computer program that provides an interactive, workbench-like computing environment for developing and testing algorithms for multibody dynamics. Examples of multibody dynamic systems amenable to analysis with the help of PyCraft include land vehicles, spacecraft, robots, and molecular models. PyCraft is based on the Spatial-Operator-Algebra (SOA) formulation for multibody dynamics. The SOA operators enable construction of simple and compact representations of complex multibody dynamical equations. Within the PyCraft computational workbench, users can, essentially, use the high-level SOA operator notation to represent a variety of dynamical quantities and algorithms and to perform computations interactively. PyCraft provides a Python-language interface to underlying C++ code. Working with SOA concepts, a user can create and manipulate Python-level operator classes in order to implement and evaluate new dynamical quantities and algorithms. During use of PyCraft, virtually all SOA-based algorithms are available for computational experiments.

  14. At the Crossroads of Learning and Culture: Identifying a Construct for Effective Computer-Assisted Language Learning for English Language Learners

    ERIC Educational Resources Information Center

    Shaw, Yun

    2010-01-01

    Many of the commercial Computer-Assisted Language Learning (CALL) programs available today typically take a generic approach. This approach standardizes the program so that it can be used to teach any language merely by translating the content from one language to another. These CALL programs rarely consider the cultural background or preferred…

  15. Highlights of X-Stack ExM Deliverable Swift/T

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wozniak, Justin M.

    Swift/T is a key success from the ExM: System support for extreme-scale, many-task applications X-Stack project, which proposed to use concurrent dataflow as an innovative programming model to exploit extreme parallelism in exascale computers. The Swift/T component of the project reimplemented the Swift language from scratch to allow applications that compose scientific modules together to be built and run on available petascale computers (Blue Gene, Cray). Swift/T does this via a new compiler and runtime that generate and execute the application as an MPI program. We assume that mission-critical emerging exascale applications will be composed as scalable applications using existing software components, connected by data dependencies. Developers wrap native code fragments using a higher-level language, then build composite applications to form a computational experiment. This exemplifies hierarchical concurrency: lower-level messaging libraries are used for fine-grained parallelism; high-level control is used for inter-task coordination. These patterns are best expressed with dataflow, but static DAGs (i.e., other workflow languages) limit the applications that can be built; they do not provide the expressiveness of Swift, such as conditional execution, iteration, and recursive functions.
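
    A small Python sketch of the hierarchical-concurrency idea described above, using standard-library futures to express data dependencies between coarse-grained tasks. It is an analogy to the dataflow style, not Swift/T syntax, and the task functions are invented for illustration.

      from concurrent.futures import ThreadPoolExecutor

      def simulate(params):            # stand-in for a wrapped native module
          return sum(params)

      def analyze(result):             # runs only after its input is available
          return result * 2

      with ThreadPoolExecutor(max_workers=4) as pool:
          # Independent simulations may run concurrently (fine-grained parallelism
          # would live inside simulate(), e.g. via MPI in the real system).
          sims = [pool.submit(simulate, [i, i + 1]) for i in range(4)]
          # Each analysis depends only on its own simulation's output (dataflow).
          analyses = [pool.submit(analyze, s.result()) for s in sims]
          print([a.result() for a in analyses])   # [2, 6, 10, 14]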

  16. Computer Programming Languages and Expertise Needed by Practicing Engineers.

    ERIC Educational Resources Information Center

    Doelling, Irvin

    1980-01-01

    Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…

  17. Tools and Trends in Self-Paced Language Instruction

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2007-01-01

    Ever since the PLATO system of the 1960's, CALL (computer assisted language learning) has had a major focus on providing self-paced, auto-correcting exercises for language learners to practice their skills and improve their knowledge of discrete areas of language learning. The computer has been recognized from the beginning as a patient and…

  18. Comparability of a Paper-Based Language Test and a Computer-Based Language Test.

    ERIC Educational Resources Information Center

    Choi, Inn-Chull; Kim, Kyoung Sung; Boo, Jaeyool

    2003-01-01

    Utilizing the Test of English Proficiency, developed by Seoul National University (TEPS), examined comparability between the paper-based language test and the computer-based language test based on content and construct validation employing content analyses based on corpus linguistic techniques in addition to such statistical analyses as…

  19. Computational Nonlinear Morphology with Emphasis on Semitic Languages. Studies in Natural Language Processing.

    ERIC Educational Resources Information Center

    Kiraz, George Anton

    This book presents a tractable computational model that can cope with complex morphological operations, especially in Semitic languages, and less complex morphological systems present in Western languages. It outlines a new generalized regular rewrite rule system that uses multiple finite-state automata to cater to root-and-pattern morphology,…

  20. Authenticity and Authorship in the Computer-Mediated Acquisition of L2 Literacy.

    ERIC Educational Resources Information Center

    Kramsch, Claire

    2000-01-01

    Examines two tenets of communicative language teaching--authenticity of the input and authorship of the language user--in an electronic environment. Reviews research in textually-mediated second language acquisition and analyzes two cases of computer-mediated language learning: the construction of a multimedia CD-ROM by American college learners…

  1. BASIC, Logo, and Pilot: A Comparison of Three Computer Languages.

    ERIC Educational Resources Information Center

    Maddux, Cleborne D.; Cummings, Rhoda E.

    1985-01-01

    Following a brief history of Logo, BASIC, and Pilot programing languages, common educational programing tasks (input from keyboard, evaluation of keyboard input, and computation) are presented in each language to illustrate how each can be used to perform the same tasks and to demonstrate each language's strengths and weaknesses. (MBR)

  2. An Intelligent Computer Assisted Language Learning System for Arabic Learners

    ERIC Educational Resources Information Center

    Shaalan, Khaled F.

    2005-01-01

    This paper describes the development of an intelligent computer-assisted language learning (ICALL) system for learning Arabic. This system could be used for learning Arabic by students at primary schools or by learners of Arabic as a second or foreign language. It explores the use of Natural Language Processing (NLP) techniques for learning…

  3. Computational Natural Language Inference: Robust and Interpretable Question Answering

    ERIC Educational Resources Information Center

    Sharp, Rebecca Reynolds

    2017-01-01

    We address the challenging task of "computational natural language inference," by which we mean bridging two or more natural language texts while also providing an explanation of how they are connected. In the context of question answering (i.e., finding short answers to natural language questions), this inference connects the question…

  4. Learner Use of Holistic Language Units in Multimodal, Task-Based Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Collentine, Karina

    2009-01-01

    Second language acquisition (SLA) researchers strive to understand the language and exchanges that learners generate in synchronous computer-mediated communication (SCMC). Doughty and Long (2003) advocate replacing open-ended SCMC with task-based language teaching (TBLT) design principles. Since most task-based SCMC (TB-SCMC) research addresses an…

  5. PyMercury: Interactive Python for the Mercury Monte Carlo Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iandola, F N; O'Brien, M J; Procassini, R J

    2010-11-29

    Monte Carlo particle transport applications are often written in low-level languages (C/C++) for optimal performance on clusters and supercomputers. However, this development approach often sacrifices straightforward usability and testing in the interest of fast application performance. To improve usability, some high-performance computing applications employ mixed-language programming with high-level and low-level languages. In this study, we consider the benefits of incorporating an interactive Python interface into a Monte Carlo application. With PyMercury, a new Python extension to the Mercury general-purpose Monte Carlo particle transport code, we improve application usability without diminishing performance. In two case studies, we illustrate how PyMercury improves usability and simplifies testing and validation in a Monte Carlo application. In short, PyMercury demonstrates the value of interactive Python for Monte Carlo particle transport applications. In the future, we expect interactive Python to play an increasingly significant role in Monte Carlo usage and testing.
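
    A toy illustration, in pure Python, of why an interactive layer helps when testing a Monte Carlo transport kernel. This is not the PyMercury API (which is not reproduced here), just a generic absorption/scatter random walk whose parameters can be probed interactively and checked against an analytic expectation.

      import random

      def transport(n_particles, absorb_prob=0.3, seed=0):
          """Toy model: count collisions until each particle is absorbed."""
          rng = random.Random(seed)
          collisions = []
          for _ in range(n_particles):
              n = 0
              while rng.random() > absorb_prob:   # survive this collision, scatter again
                  n += 1
              collisions.append(n)
          return sum(collisions) / n_particles    # mean collisions before absorption

      # From an interactive prompt one can quickly vary parameters and sanity-check
      # results against the analytic expectation (1 - p) / p.
      print(transport(100000, absorb_prob=0.3))   # approximately 2.33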

  6. Communication in science.

    PubMed

    Deda, H; Yakupoglu, H

    2002-01-01

    Science must have a common language. For centuries, Latin carried out this job, but progress in computer technology and the Internet world over the last 20 years has begun to produce a new language for the new century: the computer language. The information domains that need data-language standardization are the following: digital libraries and medical education systems, consumer health informatics, World Wide Web applications, database systems, medical language processing, automatic indexing systems, image processing units, telemedicine, and the New Generation Internet (NGI).

  7. Evaluating the Validity of Accommodations for English Learners through Evidence Based on Response Processes

    ERIC Educational Resources Information Center

    Crotts, Katrina M.

    2013-01-01

    English learners (ELs) represent one of the fastest growing student populations in the United States. Given that language can serve as a barrier in EL performance, test accommodations are provided to help level the playing field and allow ELs to better demonstrate their true performance level. Test accommodations on the computer offer the ability…

  8. What's so Simple about Simplified Texts? A Computational and Psycholinguistic Investigation of Text Comprehension and Text Processing

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Yang, Hae Sung; McNamara, Danielle S.

    2014-01-01

    This study uses a moving windows self-paced reading task to assess both text comprehension and processing time of authentic texts and these same texts simplified to beginning and intermediate levels. Forty-eight second language learners each read 9 texts (3 different authentic, beginning, and intermediate level texts). Repeated measures ANOVAs…

  9. Processing subject-verb agreement in a second language depends on proficiency

    PubMed Central

    Hoshino, Noriko; Dussias, Paola E.; Kroll, Judith F.

    2010-01-01

    Subject-verb agreement is a computation that is often difficult to execute perfectly in the first language (L1) and even more difficult to produce skillfully in a second language (L2). In this study, we examined the way in which bilingual speakers complete sentence fragments in a manner that reflects access to both grammatical and conceptual number. In two experiments, we show that bilingual speakers are sensitive to both grammatical and conceptual number in the L1 and grammatical number agreement in the L2. However, only highly proficient bilinguals are also sensitive to conceptual number in the L2. The results suggest that the extent to which speakers are able to exploit conceptual information during speech planning depends on the level of language proficiency. PMID:20640178

  10. Task Description Language

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Apfelbaum, David

    2005-01-01

    Task Description Language (TDL) is an extension of the C++ programming language that enables programmers to quickly and easily write complex, concurrent computer programs for controlling real-time autonomous systems, including robots and spacecraft. TDL is based on earlier work (circa 1984 through 1989) on the Task Control Architecture (TCA). TDL provides syntactic support for hierarchical task-level control functions, including task decomposition, synchronization, execution monitoring, and exception handling. A Java-language-based compiler transforms TDL programs into pure C++ code that includes calls to a platform-independent task-control-management (TCM) library. TDL has been used to control and coordinate multiple heterogeneous robots in projects sponsored by NASA and the Defense Advanced Research Projects Agency (DARPA). It has also been used in Brazil to control an autonomous airship and in Canada to control a robotic manipulator.

  11. ICCE/ICCAI 2000 Full & Short Papers (Computer-Assisted Language Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the following full and short papers on computer-assisted language learning (CALL) from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computer-Assisted English Abstract Words Learning Environment on the Web" (Wenli Tsou and…

  12. A comparative study of programming languages for next-generation astrodynamics systems

    NASA Astrophysics Data System (ADS)

    Eichhorn, Helge; Cano, Juan Luis; McLean, Frazer; Anderl, Reiner

    2018-03-01

    Due to the computationally intensive nature of astrodynamics tasks, astrodynamicists have relied on compiled programming languages such as Fortran for the development of astrodynamics software. Interpreted languages such as Python, on the other hand, offer higher flexibility and development speed thereby increasing the productivity of the programmer. While interpreted languages are generally slower than compiled languages, recent developments such as just-in-time (JIT) compilers or transpilers have been able to close this speed gap significantly. Another important factor for the usefulness of a programming language is its wider ecosystem which consists of the available open-source packages and development tools such as integrated development environments or debuggers. This study compares three compiled languages and three interpreted languages, which were selected based on their popularity within the scientific programming community and technical merit. The three compiled candidate languages are Fortran, C++, and Java. Python, Matlab, and Julia were selected as the interpreted candidate languages. All six languages are assessed and compared to each other based on their features, performance, and ease-of-use through the implementation of idiomatic solutions to classical astrodynamics problems. We show that compiled languages still provide the best performance for astrodynamics applications, but JIT-compiled dynamic languages have reached a competitive level of speed and offer an attractive compromise between numerical performance and programmer productivity.
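
    As a concrete flavor of the kind of kernel such a comparison exercises, here is an idiomatic Python/NumPy sketch of a two-body acceleration evaluation and one fixed-step Runge-Kutta integration step. It is our own illustrative example, not code from the study, and the initial state is an arbitrary near-circular low orbit.

      import numpy as np

      MU_EARTH = 398600.4418  # km^3/s^2, standard gravitational parameter

      def two_body_acceleration(r):
          """Point-mass gravitational acceleration at position vector r (km)."""
          return -MU_EARTH * r / np.linalg.norm(r) ** 3

      def rk4_step(r, v, dt):
          """One fourth-order Runge-Kutta step of the two-body problem."""
          def deriv(state):
              pos, vel = state[:3], state[3:]
              return np.concatenate([vel, two_body_acceleration(pos)])
          y = np.concatenate([r, v])
          k1 = deriv(y)
          k2 = deriv(y + 0.5 * dt * k1)
          k3 = deriv(y + 0.5 * dt * k2)
          k4 = deriv(y + dt * k3)
          y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
          return y[:3], y[3:]

      r, v = np.array([7000.0, 0.0, 0.0]), np.array([0.0, 7.546, 0.0])  # ~circular LEO
      r, v = rk4_step(r, v, 10.0)   # propagate 10 seconds
      print(r, v)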

  13. Communicative Language Testing: Implications for Computer Based Language Testing in French for Specific Purposes

    ERIC Educational Resources Information Center

    García Laborda, Jesús; López Santiago, Mercedes; Otero de Juan, Nuria; Álvarez Álvarez, Alfredo

    2014-01-01

    Current evolutions of language testing have led to integrating computers in FSP assessments both in oral and written communicative tasks. This paper deals with two main issues: learners' expectations about the types of questions in FSP computer based assessments and the relation with their own experience. This paper describes the experience of 23…

  14. Algerian EFL University Teachers' Attitudes towards Computer Assisted Language Learning: The Case of Djilali Liabes University

    ERIC Educational Resources Information Center

    Bouchefra, Miloud; Baghoussi, Meriem

    2017-01-01

    Computer Assisted Language Learning (CALL) is still groping its way into Algerian English as a Foreign Language (EFL) classroom, where Information Communications Technologies (ICTs) are defined in terms of occasional use of computers and data projectors for material presentation in the classroom. Though major issues in the image of the lack of…

  15. Effectiveness of Various Computer-Based Instructional Strategies in Language Teaching. Final Report, November 1, 1969-August 31, 1970.

    ERIC Educational Resources Information Center

    Van Campen, Joseph A.

    Computer software for programed language instruction, developed in the second quarter of 1970 at Stanford's Institute for Mathematical Studies in the Social Sciences is described in this report. The software includes: (1) a PDP-10 computer assembly language for generating drill sentences; (2) a coding system allowing a large number of sentences to…

  16. Vectorial Representations of Meaning for a Computational Model of Language Comprehension

    ERIC Educational Resources Information Center

    Wu, Stephen Tze-Inn

    2010-01-01

    This thesis aims to define and extend a line of computational models for text comprehension that are humanly plausible. Since natural language is human by nature, computational models of human language will always be just that--models. To the degree that they miss out on information that humans would tap into, they may be improved by considering…

  17. Large Scale Analysis of Geospatial Data with Dask and XArray

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Hamman, J.; Abernathey, R.; Evans, K. J.; Rocklin, M.; Zender, C. S.; Rocklin, M.

    2017-12-01

    The analysis of geospatial data with high-level languages has accelerated innovation and the impact of existing data resources. However, as datasets grow beyond single-machine memory, data structures within these high-level languages can become a bottleneck. New libraries like Dask and XArray resolve some of these scalability issues, providing interactive workflows that are familiar to high-level-language researchers while also scaling out to much larger datasets. This broadens the access of researchers to larger datasets on high-performance computers and, through interactive development, reduces time-to-insight when compared to traditional parallel programming techniques (MPI). This talk describes Dask, a distributed dynamic task scheduler; Dask.array, a multi-dimensional array that copies the popular NumPy interface; and XArray, a library that wraps NumPy/Dask.array with labeled and indexed axes, implementing the CF conventions. We discuss both the basic design of these libraries and how they change interactive analysis of geospatial data, as well as recent benefits and challenges of distributed computing on clusters of machines.
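
    A brief example of the workflow described above; the dataset is synthetic and the dimension names are illustrative assumptions, but the dask.array and xarray calls shown are the libraries' standard public interfaces.

      import dask.array as da
      import xarray as xr

      # A synthetic "global field" larger than we would want to hold as one block:
      # dask.array splits it into chunks and builds a lazy task graph.
      data = da.random.random((365, 720, 1440), chunks=(30, 720, 1440))

      # xarray wraps the chunked array with labeled dimensions (CF-style names).
      field = xr.DataArray(data, dims=("time", "lat", "lon"), name="t2m")

      # Reductions are expressed on labels and only evaluated when asked for;
      # Dask's scheduler then runs the chunked tasks, on a laptop or a cluster.
      zonal_mean = field.mean(dim=("time", "lon"))
      print(zonal_mean.compute().shape)   # (720,)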

  18. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  19. Linguistics, Computers, and the Language Teacher. A Communicative Approach.

    ERIC Educational Resources Information Center

    Underwood, John H.

    This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…

  20. Talk Across the Oceans: Language and Culture of the Global Internet Community.

    ERIC Educational Resources Information Center

    Takahashi, Shinji

    1996-01-01

    Discusses some of the technological difficulties associated with the use of English or other European languages on the Internet, and uses Japanese computing as an example. Examines the linguistic culture of the language with attention to English, how technology limits/expands communication, and the role of languages in the computer domain.…

  1. On Using Intelligent Computer-Assisted Language Learning in Real-Life Foreign Language Teaching and Learning

    ERIC Educational Resources Information Center

    Amaral, Luiz A.; Meurers, Detmar

    2011-01-01

    This paper explores the motivation and prerequisites for successful integration of Intelligent Computer-Assisted Language Learning (ICALL) tools into current foreign language teaching and learning (FLTL) practice. We focus on two aspects, which we argue to be important for effective ICALL system development and use: (i) the relationship between…

  2. Computer-Assisted Language Learning (CALL) in Support of (Re)-Learning Native Languages: The Case of Runyakitara

    ERIC Educational Resources Information Center

    Katushemererwe, Fridah; Nerbonne, John

    2015-01-01

    This study presents the results from a computer-assisted language learning (CALL) system of Runyakitara (RU_CALL). The major objective was to provide an electronic language learning environment that can enable learners with mother tongue deficiencies to enhance their knowledge of grammar and acquire writing skills in Runyakitara. The system…

  3. Graphical qualities of educational technology: Using drag-and-drop and text-based programs for introductory computer science.

    PubMed

    DiSalvo, Betsy

    2014-01-01

    To determine appropriate computer science curricula, educators sought to better understand the different affordances of teaching with a visual programming language (Alice) or a text-based language (Jython). Although students often preferred one language, that language wasn't necessarily the one from which they learned the most.

  4. HAL/SM language specification. [programming languages and computer programming for space shuttles

    NASA Technical Reports Server (NTRS)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  5. A SCILAB Program for Computing General-Relativistic Models of Rotating Neutron Stars by Implementing Hartle's Perturbation Method

    NASA Astrophysics Data System (ADS)

    Papasotiriou, P. J.; Geroyannis, V. S.

    We apply Hartle's perturbation method to the computation of relativistic rigidly rotating neutron star models. The program has been written in SCILAB (© INRIA ENPC), a matrix-oriented high-level programming language. The numerical method is described in great detail and is applied to many models in slow or fast rotation. We show that, although the method is perturbative, it gives accurate results for all practical purposes and should prove an efficient tool for computing rapidly rotating pulsars.

  6. Genetics and language: a neurobiological perspective on the missing link (-ing hypotheses).

    PubMed

    Poeppel, David

    2011-12-01

    The paper argues that both evolutionary and genetic approaches to studying the biological foundations of speech and language could benefit from fractionating the problem at a finer grain, aiming not to map genetics to "language"-or even subdomains of language such as "phonology" or "syntax"-but rather to link genetic results to component formal operations that underlie processing the comprehension and production of linguistic representations. Neuroanatomic and neurophysiological research suggests that language processing is broken down in space (distributed functional anatomy along concurrent pathways) and time (concurrent processing on multiple time scales). These parallel neuronal pathways and their local circuits form the infrastructure of speech and language and are the actual targets of evolution/genetics. Therefore, investigating the mapping from gene to brain circuit to linguistic phenotype at the level of generic computational operations (subroutines actually executable in these circuits) stands to provide a new perspective on the biological foundations in the healthy and challenged brain.

  7. Data flow language and interpreter for a reconfigurable distributed data processor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, A.D.; Heath, J.R.

    1982-01-01

    An analytic language and an interpreter whereby an applications data flow graph may serve as an input to a reconfigurable distributed data processor are proposed. The architecture considered consists of a number of loosely coupled computing elements (CEs) which may be linked to data and file memories through fully nonblocking interconnect networks. The real-time performance of such an architecture depends upon its ability to alter its topology in response to changes in application, asynchronous data rates, and faults. Such a data flow language enhances the versatility of a reconfigurable architecture by allowing the user to specify the machine's topology at a very high level. 11 references.

  8. Semiotics, Information Science, Documents and Computers.

    ERIC Educational Resources Information Center

    Warner, Julian

    1990-01-01

    Discusses the relationship and value of semiotics to the established domains of information science. Highlights include documentation; computer operations; the language of computing; automata theory; linguistics; speech and writing; and the written language as a unifying principle for the document and the computer. (93 references) (LRW)

  9. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    PubMed

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  10. Prediction During Natural Language Comprehension.

    PubMed

    Willems, Roel M; Frank, Stefan L; Nijhof, Annabel D; Hagoort, Peter; van den Bosch, Antal

    2016-06-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as well as surprisal. A computational model determined entropy and surprisal for each word in 3 literary stories. Twenty-four healthy participants listened to the same 3 stories while their brain activation was measured using fMRI. Reversed speech fragments were presented as a control condition. Brain areas sensitive to entropy were left ventral premotor cortex, left middle frontal gyrus, right inferior frontal gyrus, left inferior parietal lobule, and left supplementary motor area. Areas sensitive to surprisal were left inferior temporal sulcus ("visual word form area"), bilateral superior temporal gyrus, right amygdala, bilateral anterior temporal poles, and right inferior frontal sulcus. We conclude that prediction during language comprehension can occur at several levels of processing, including at the level of word form. Our study exemplifies the power of combining computational linguistics with cognitive neuroscience, and additionally underlines the feasibility of studying continuous spoken language materials with fMRI. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
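
    The two information-theoretic quantities used here are straightforward to compute from a next-word probability distribution; a minimal Python sketch follows, with a made-up three-word distribution standing in for the model's output.

      import math

      # Hypothetical model output: probabilities of possible next words given context.
      next_word_probs = {"dog": 0.5, "cat": 0.3, "piano": 0.2}

      # Entropy of the distribution: uncertainty about the upcoming word (in bits).
      entropy = -sum(p * math.log2(p) for p in next_word_probs.values())

      # Surprisal of the word that actually occurs: how unexpected it is (in bits).
      actual_word = "piano"
      surprisal = -math.log2(next_word_probs[actual_word])

      print(round(entropy, 3), round(surprisal, 3))   # 1.485 2.322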

  11. Artificial intelligence, expert systems, computer vision, and natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1984-01-01

    An overview of artificial intelligence (AI), its core ingredients, and its applications is presented. The knowledge representation, logic, problem solving approaches, languages, and computers pertaining to AI are examined, and the state of the art in AI is reviewed. The use of AI in expert systems, computer vision, natural language processing, speech recognition and understanding, speech synthesis, problem solving, and planning is examined. Basic AI topics, including automation, search-oriented problem solving, knowledge representation, and computational logic, are discussed.

  12. Logic as Marr's Computational Level: Four Case Studies.

    PubMed

    Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter

    2015-04-01

    We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.

  13. Exploring Taiwanese College Students' Perceptions of Text-Based, Computer-Mediated Communication Technology in Learning Japanese as a Foreign Language

    ERIC Educational Resources Information Center

    Tanaka, Makiko

    2015-01-01

    The use of computers as an educational tool has become very popular in the context of language teaching and learning. Research into computer mediated communication (CMC) in a Japanese as a foreign language (JFL) learning and teaching context can take advantage of various pedagogical possibilities, just as in the English classroom. This study…

  14. A Survey of Quantum Programming Languages: History, Methods, and Tools

    DTIC Science & Technology

    2008-01-01

    ...and entanglement, to achieve computational solutions to certain problems in less time (fewer computational cycles) than is possible using classical... superposition of quantum bits, entanglement, destructive measurement, and the no-cloning theorem. These differences must be thoroughly understood and even... computers using well-known languages such as C, C++, Java, and rapid prototyping languages such as Maple, Mathematica, and Matlab. A good on-line...

  15. The Advantages of Using Technology in Second Language Education: Technology Integration in Foreign Language Teaching Demonstrates the Shift from a Behavioral to a Constructivist Learning Approach

    ERIC Educational Resources Information Center

    Wang, Li

    2005-01-01

    With the advent of networked computers and Internet technology, computer-based instruction has been widely used in language classrooms throughout the United States. Computer technologies have dramatically changed the way people gather information, conduct research and communicate with others worldwide. Considering the tremendous startup expenses,…

  16. A language comparison for scientific computing on MIMD architectures

    NASA Technical Reports Server (NTRS)

    Jones, Mark T.; Patrick, Merrell L.; Voigt, Robert G.

    1989-01-01

    Choleski's method for solving banded symmetric, positive definite systems is implemented on a multiprocessor computer using three FORTRAN-based parallel programming languages: the Force, PISCES, and Concurrent FORTRAN. The capabilities of the languages for expressing parallelism and their user-friendliness are discussed, including readability of the code, debugging assistance offered, and expressiveness of the languages. The performance of the different implementations is compared. It is argued that PISCES, using the Force for medium-grained parallelism, is the appropriate choice for programming Choleski's method on the multiprocessor computer Flex/32.
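
    For reference, the computational kernel being parallelized is the standard Cholesky factorization A = L L^T. A compact serial sketch in Python (our own example, ignoring the banded storage scheme and the three parallel dialects compared in the paper) looks like this:

      import numpy as np

      def cholesky(A):
          """Return lower-triangular L with A = L @ L.T for symmetric positive definite A."""
          n = A.shape[0]
          L = np.zeros_like(A, dtype=float)
          for j in range(n):                       # columns are pipelined in parallel codes
              L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
              for i in range(j + 1, n):
                  L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
          return L

      A = np.array([[4.0, 2.0, 0.0],
                    [2.0, 5.0, 1.0],
                    [0.0, 1.0, 3.0]])
      L = cholesky(A)
      print(np.allclose(L @ L.T, A))   # True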

  17. Readiness for Solving Story Problems.

    ERIC Educational Resources Information Center

    Dunlap, William F.

    1982-01-01

    Readiness activities are described which are designed to help learning disabled (LD) students learn to perform computations in story problems. Activities proceed from concrete objects to numbers and involve the students in devising story problems. The language experience approach is incorporated with the enactive, iconic, and symbolic levels of…

  18. Modeling Coevolution between Language and Memory Capacity during Language Origin

    PubMed Central

    Gong, Tao; Shuai, Lan

    2015-01-01

    Memory is essential to many cognitive tasks including language. Apart from empirical studies of memory effects on language acquisition and use, there lack sufficient evolutionary explorations on whether a high level of memory capacity is prerequisite for language and whether language origin could influence memory capacity. In line with evolutionary theories that natural selection refined language-related cognitive abilities, we advocated a coevolution scenario between language and memory capacity, which incorporated the genetic transmission of individual memory capacity, cultural transmission of idiolects, and natural and cultural selections on individual reproduction and language teaching. To illustrate the coevolution dynamics, we adopted a multi-agent computational model simulating the emergence of lexical items and simple syntax through iterated communications. Simulations showed that: along with the origin of a communal language, an initially-low memory capacity for acquired linguistic knowledge was boosted; and such coherent increase in linguistic understandability and memory capacities reflected a language-memory coevolution; and such coevolution stopped till memory capacities became sufficient for language communications. Statistical analyses revealed that the coevolution was realized mainly by natural selection based on individual communicative success in cultural transmissions. This work elaborated the biology-culture parallelism of language evolution, demonstrated the driving force of culturally-constituted factors for natural selection of individual cognitive abilities, and suggested that the degree difference in language-related cognitive abilities between humans and nonhuman animals could result from a coevolution with language. PMID:26544876

  19. Modeling Coevolution between Language and Memory Capacity during Language Origin.

    PubMed

    Gong, Tao; Shuai, Lan

    2015-01-01

    Memory is essential to many cognitive tasks including language. Apart from empirical studies of memory effects on language acquisition and use, there lack sufficient evolutionary explorations on whether a high level of memory capacity is prerequisite for language and whether language origin could influence memory capacity. In line with evolutionary theories that natural selection refined language-related cognitive abilities, we advocated a coevolution scenario between language and memory capacity, which incorporated the genetic transmission of individual memory capacity, cultural transmission of idiolects, and natural and cultural selections on individual reproduction and language teaching. To illustrate the coevolution dynamics, we adopted a multi-agent computational model simulating the emergence of lexical items and simple syntax through iterated communications. Simulations showed that: along with the origin of a communal language, an initially-low memory capacity for acquired linguistic knowledge was boosted; and such coherent increase in linguistic understandability and memory capacities reflected a language-memory coevolution; and such coevolution stopped till memory capacities became sufficient for language communications. Statistical analyses revealed that the coevolution was realized mainly by natural selection based on individual communicative success in cultural transmissions. This work elaborated the biology-culture parallelism of language evolution, demonstrated the driving force of culturally-constituted factors for natural selection of individual cognitive abilities, and suggested that the degree difference in language-related cognitive abilities between humans and nonhuman animals could result from a coevolution with language.

  20. Semantic computing and language knowledge bases

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    Since the proposal of the next-generation Web, the Semantic Web, semantic computing has been drawing more and more attention in both academia and industry. A great deal of research has been conducted on the theory and methodology of the subject, and potential applications have been investigated and proposed in many fields. The progress made so far in semantic computing cannot be detached from its supporting pivot: language resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current status of the construction of language knowledge bases, as well as the related research and applications carried out on the basis of these resources, via a case study at the Institute of Computational Linguistics at Peking University.

  1. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  2. Konnen Computer das Sprachproblem losen (Can Computers Solve the Language Problem)?

    ERIC Educational Resources Information Center

    Zeilinger, Michael

    1972-01-01

    Various computer applications in linguistics, primarily speech synthesis and machine translation, are reviewed. Although the computer proves useful for statistics, dictionary building and programmed instruction, the promulgation of a world auxiliary language is considered a more human and practical solution to the international communication…

  3. The Roles of "Second Life" in a College Computer-Assisted Language Learning (CALL) Course in Taiwan, ROC

    ERIC Educational Resources Information Center

    Liou, Hsien-Chin

    2012-01-01

    Various language learning projects using "Second Life" (SL) have been documented; still, their specific learning potentials, particularly in English as a foreign language (EFL) context, remain to be explored. The current study aims to add one piece of empirical evidence on how SL can be infused into a computer-assisted language learning…

  4. Educational and Interpersonal Uses of Home Computers by Adolescents with and without Specific Language Impairment

    ERIC Educational Resources Information Center

    Durkin, Kevin; Conti-Ramsden, Gina; Walker, Allan; Simkin, Zoe

    2009-01-01

    Many uses of new media entail processing language content, yet little is known about the relationship between language ability and media use in young people. This study compares educational versus interpersonal uses of home computers in adolescents with and without a history of specific language impairment (SLI). Participants were 55 17-year-olds…

  5. Sign language recognition and translation: a multidisciplined approach from the field of artificial intelligence.

    PubMed

    Parton, Becky Sue

    2006-01-01

    In recent years, research has progressed steadily in regard to the use of computers to recognize and render sign language. This paper reviews significant projects in the field beginning with finger-spelling hands such as "Ralph" (robotics), CyberGloves (virtual reality sensors to capture isolated and continuous signs), camera-based projects such as the CopyCat interactive American Sign Language game (computer vision), and sign recognition software (Hidden Markov Modeling and neural network systems). Avatars such as "Tessa" (Text and Sign Support Assistant; three-dimensional imaging) and spoken language to sign language translation systems such as Poland's project entitled "THETOS" (Text into Sign Language Automatic Translator, which operates in Polish; natural language processing) are addressed. The application of this research to education is also explored. The "ICICLE" (Interactive Computer Identification and Correction of Language Errors) project, for example, uses intelligent computer-aided instruction to build a tutorial system for deaf or hard-of-hearing children that analyzes their English writing and makes tailored lessons and recommendations. Finally, the article considers synthesized sign, which is being added to educational material and has the potential to be developed by students themselves.

  6. SOL - SIZING AND OPTIMIZATION LANGUAGE COMPILER

    NASA Technical Reports Server (NTRS)

    Scotti, S. J.

    1994-01-01

    SOL is a computer language geared to solving design problems. SOL includes the mathematical modeling and logical capabilities of a computer language like FORTRAN, but also includes the additional power of non-linear mathematical programming methods (i.e., numerical optimization) at the language level (as opposed to the subroutine level). The language-level use of optimization has several advantages over the traditional, subroutine-calling method of using an optimizer: first, the optimization problem is described in a concise and clear manner which closely parallels the mathematical description of optimization; second, a seamless interface is automatically established between the optimizer subroutines and the mathematical model of the system being optimized; third, the results of an optimization (objective, design variables, constraints, termination criteria, and some or all of the optimization history) are output in a form directly related to the optimization description; and finally, automatic error checking and recovery from an ill-defined system model or optimization description is facilitated by the language-level specification of the optimization problem. Thus, SOL enables rapid generation of models and solutions for optimum design problems with greater confidence that the problem is posed correctly. The SOL compiler takes SOL-language statements and generates the equivalent FORTRAN code and system calls. Because of this approach, the modeling capabilities of SOL are extended by the ability to incorporate existing FORTRAN code into a SOL program. In addition, SOL has a powerful MACRO capability. The MACRO capability of the SOL compiler effectively gives the user the ability to extend the SOL language and can be used to develop easy-to-use shorthand methods of generating complex models and solution strategies. The SOL compiler provides syntactic and semantic error-checking, error recovery, and detailed reports containing cross-references to show where each variable was used. The listings summarize all optimizations, listing the objective functions, design variables, and constraints. The compiler offers error-checking specific to optimization problems, so that simple mistakes will not cost hours of debugging time. The optimization engine used by and included with the SOL compiler is a version of Vanderplaats' ADS system (Version 1.1) modified specifically to work with the SOL compiler. SOL allows the use of over 100 ADS optimization choices, such as Sequential Quadratic Programming, Modified Feasible Directions, interior and exterior penalty function, and variable metric methods. Default choices of the many control parameters of ADS are made for the user; however, the user can override any of the ADS control parameters for each individual optimization. The SOL language and compiler were developed with an advanced compiler-generation system to ensure correctness and simplify program maintenance. Thus, SOL's syntax was defined precisely by an LALR(1) grammar and the SOL compiler's parser was generated automatically from the LALR(1) grammar with a parser-generator. Hence, unlike ad hoc, manually coded interfaces, the SOL compiler's lexical analysis ensures that the compiler recognizes all legal SOL programs, can recover from and correct for many errors, and reports the location of errors to the user. This version of the SOL compiler has been implemented on VAX/VMS computer systems and requires 204 KB of virtual memory to execute. Since the SOL compiler produces FORTRAN code, it requires the VAX FORTRAN compiler to produce an executable program. The SOL compiler consists of 13,000 lines of Pascal code. It was developed in 1986 and last updated in 1988. The ADS and other utility subroutines amount to 14,000 lines of FORTRAN code and were also updated in 1988.
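
    For contrast with SOL's language-level approach, the sketch below shows the traditional subroutine-calling style that the abstract describes, using SciPy's optimizer in place of an ADS-style library. The design problem (minimizing the material volume of a thin tube under a stress limit) and all numbers are hypothetical.

```python
# Subroutine-calling style of optimization that SOL's language-level
# constructs are designed to replace.  Problem and numbers are hypothetical;
# scipy.optimize stands in for an ADS-style optimizer library.
import numpy as np
from scipy.optimize import minimize

def weight(x):
    radius, thickness = x
    return 2 * np.pi * radius * thickness * 100.0        # objective: material volume

def stress_margin(x):
    radius, thickness = x
    stress = 1000.0 / (2 * np.pi * radius * thickness)   # load / cross-section area
    return 250.0 - stress                                 # must stay >= 0

result = minimize(
    weight,
    x0=[1.0, 0.05],
    bounds=[(0.1, 5.0), (0.01, 0.5)],
    constraints=[{"type": "ineq", "fun": stress_margin}],
)
print(result.x, result.fun)
```

    In a SOL-style language, the objective, design variables, and constraints would instead be declared directly in optimization statements, and the compiler would generate this glue code and the optimizer calls.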

  7. User interfaces for computational science: A domain specific language for OOMMF embedded in Python

    NASA Astrophysics Data System (ADS)

    Beg, Marijan; Pepper, Ryan A.; Fangohr, Hans

    2017-05-01

    Computer simulations are used widely across the engineering and science disciplines, including in the research and development of magnetic devices using computational micromagnetics. In this work, we identify and review different approaches to configuring simulation runs: (i) the re-compilation of source code, (ii) the use of configuration files, (iii) the graphical user interface, and (iv) embedding the simulation specification in an existing programming language to express the computational problem. We identify the advantages and disadvantages of different approaches and discuss their implications for the effectiveness and reproducibility of computational studies and results. Following on from this, we design and describe a domain specific language for micromagnetics that is embedded in the Python language, and allows users to define the micromagnetic simulations they want to carry out in a flexible way. We have implemented this micromagnetic simulation description language together with a computational backend that executes the simulation task using the Object Oriented MicroMagnetic Framework (OOMMF). We illustrate the use of this Python interface for OOMMF by solving the micromagnetic standard problem 4. All the code is publicly available and is open source.
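
    To make approach (iv) concrete, the sketch below shows the general style of a simulation specification embedded in Python. The class and attribute names are hypothetical and do not reproduce the interface described by the authors; a real backend would translate such a specification into an OOMMF input file and run OOMMF.

```python
# Illustrative sketch of embedding a simulation specification in Python.
# Names are hypothetical and NOT the actual OOMMF/Python interface.
from dataclasses import dataclass, field

@dataclass
class Mesh:
    cell: tuple          # discretisation cell size (m)
    region: tuple        # (p1, p2) corners of the simulation box (m)

@dataclass
class Simulation:
    name: str
    mesh: Mesh
    Ms: float                              # saturation magnetisation (A/m)
    energy_terms: list = field(default_factory=list)

    def add(self, term):
        self.energy_terms.append(term)
        return self                        # allow chaining

    def run(self, time, steps):
        # A real backend would write an OOMMF input (MIF) file here,
        # call the OOMMF executable, and read the results back.
        print(f"Would drive '{self.name}' for {time} s in {steps} steps "
              f"with terms {self.energy_terms}")

sim = Simulation(
    name="stdprob4",
    mesh=Mesh(cell=(5e-9, 5e-9, 3e-9), region=((0, 0, 0), (500e-9, 125e-9, 3e-9))),
    Ms=8e5,
).add("exchange(A=1.3e-11)").add("demag").add("zeeman(H=(-24.6e-3, 4.3e-3, 0))")
sim.run(time=1e-9, steps=200)
```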

  8. 32 CFR 68.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Academic skills. Competencies in English, reading, writing, speaking, mathematics, and computer skills that..., degree competencies (e.g., foreign language, computer literacy), and elective course options that... course requirements, degree competencies (e.g., foreign language, computer literacy), and elective course...

  9. 32 CFR 68.3 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Academic skills. Competencies in English, reading, writing, speaking, mathematics, and computer skills that..., degree competencies (e.g., foreign language, computer literacy), and elective course options that... course requirements, degree competencies (e.g., foreign language, computer literacy), and elective course...

  10. Study for application of a sounding rocket experiment to spacelab/shuttle mission

    NASA Technical Reports Server (NTRS)

    Code, A. D.

    1975-01-01

    An inexpensive adaptation of rocket-size packages to Spacelab/Shuttle use was studied. A two-flight project extending over two years was baselined, requiring 80 man-months of effort. It was concluded that testing should be held to a minimum since rocket packages seem to be able to tolerate shuttle vibration and noise levels. A standard, flexible control and data collection language such as FORTH should be used rather than a computation language such as FORTRAN in order to hold programming costs to a minimum.

  11. A database system to support image algorithm evaluation

    NASA Technical Reports Server (NTRS)

    Lien, Y. E.

    1977-01-01

    The design is given of an interactive image database system IMDB, which allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The query language IQ permits the user to define false color functions, pixel value transformations, overlay functions, zoom functions, and windows. The user manipulates the images through generic functions. The user can direct images to display devices for visual and qualitative analysis. Image histograms and pixel value distributions can also be computed to obtain a quantitative analysis of images.
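
    The abstract names generic image-query operations (windows, pixel-value transformations, histograms) without showing IQ syntax. The sketch below illustrates those operations in plain Python/NumPy; it is not the IQ language, and the image data are synthetic.

```python
# Python/NumPy analogues of the generic operations named in the abstract.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)   # stand-in image

# Window: extract a rectangular sub-image.
window = image[100:200, 150:300]

# Pixel-value transformation: simple contrast stretch to the full 0..255 range.
lo, hi = window.min(), window.max()
stretched = ((window.astype(np.float32) - lo) / max(hi - lo, 1) * 255).astype(np.uint8)

# Histogram of pixel values for quantitative analysis.
hist, _ = np.histogram(stretched, bins=256, range=(0, 256))
print("most frequent grey level:", int(hist.argmax()))
```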

  12. Effects of computer-based immediate feedback on foreign language listening comprehension and test-associated anxiety.

    PubMed

    Lee, Shu-Ping; Su, Hui-Kai; Lee, Shin-Da

    2012-06-01

    This study investigated the effects of immediate feedback on computer-based foreign language listening comprehension tests and on intrapersonal test-associated anxiety in 72 English-major college students at a Taiwanese university. Computer-based listening comprehension tests built in MOODLE, a dynamic e-learning environment, with or without immediate feedback, were administered together with the State-Trait Anxiety Inventory (STAI) and repeated after one week. The analysis indicated that immediate feedback during testing caused significantly higher anxiety and resulted in significantly higher listening scores than in the control group, which received no feedback. However, repeated feedback did not affect test anxiety or listening scores. Computer-based immediate feedback did not lower the debilitating effects of anxiety but enhanced students' intrapersonal eustress-like anxiety and probably improved their attention during the listening tests. Computer-based tests with immediate feedback might help foreign language learners increase their attention in foreign language listening comprehension.

  13. Discussion Forum Interactions: Text and Context

    ERIC Educational Resources Information Center

    Montero, Begona; Watts, Frances; Garcia-Carbonell, Amparo

    2007-01-01

    Computer-mediated communication (CMC) is currently used in language teaching as a bridge for the development of written and spoken skills [Kern, R., 1995. "Restructuring classroom interaction with networked computers: effects on quantity and characteristics of language production." "The Modern Language Journal" 79, 457-476]. Within CMC…

  14. Why Virtual, Why Environments? Implementing Virtual Reality Concepts in Computer-Assisted Language Learning.

    ERIC Educational Resources Information Center

    Schwienhorst, Klaus

    2002-01-01

    Discussion of computer-assisted language learning focuses on the benefits of virtual reality environments, particularly for foreign language contexts. Topics include three approaches to learner autonomy; supporting reflection, including self-awareness; supporting interaction, including collaboration; and supporting experimentation, including…

  15. Neural Network Computing and Natural Language Processing.

    ERIC Educational Resources Information Center

    Borchardt, Frank

    1988-01-01

    Considers the application of neural network concepts to traditional natural language processing and demonstrates that neural network computing architecture can: (1) learn from actual spoken language; (2) observe rules of pronunciation; and (3) reproduce sounds from the patterns derived by its own processes. (Author/CB)

  16. Problems and Prospects in Foreign Language Computing.

    ERIC Educational Resources Information Center

    Pusack, James P.

    The problems and prospects of the field of foreign language computing are profiled through a survey of typical implementation, development, and research projects that language teachers may undertake. Basic concepts in instructional design, hardware, and software are first clarified. Implementation projects involving courseware evaluation, textbook…

  17. Interfacing the Experimenter to the Computer: Languages for Psychologists

    ERIC Educational Resources Information Center

    Wood, Ronald W.; And Others

    1975-01-01

    An examination and comparison of the computer languages which behavioral scientists are most likely to use: SCAT, INTERACT, SKED, OS/8 Fortran IV, RT11/Fortran, RSX-11M, Data General's Real-Time Disk Operating System and its Fortran, and interpretive languages. (EH)

  18. The Temporal Dimension of Linguistic Prediction

    ERIC Educational Resources Information Center

    Chow, Wing Yee

    2013-01-01

    This thesis explores how predictions about upcoming language inputs are computed during real-time language comprehension. Previous research has demonstrated humans' ability to use rich contextual information to compute linguistic prediction during real-time language comprehension, and it has been widely assumed that contextual information can…

  19. SuperPILOT: A Comprehensive Computer-Assisted Instruction Programming Language for the Apple II Computer.

    ERIC Educational Resources Information Center

    Falleur, David M.

    This presentation describes SuperPILOT, an extended version of Apple PILOT, a programming language for developing computer-assisted instruction (CAI) with the Apple II computer that includes the features of its early PILOT (Programmed Inquiry, Learning or Teaching) ancestors together with new features that make use of the Apple computer's advanced…

  20. L3 Interactive Data Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohn, Michael; Adams, Paul

    2006-09-05

    The L3 system is a computational steering environment for image processing and scientific computing. It consists of an interactive graphical language and interface. Its purpose is to help advanced users control their computational software and assist in the management of data accumulated during numerical experiments. L3 provides a combination of features not found in other environments: textual and graphical construction of programs; persistence of programs and associated data; direct mapping between the scripts, the parameters, and the produced data; implicit hierarchical data organization; full programmability, including conditionals and functions; and incremental execution of programs. The software includes the l3 language and the graphical environment. The language is a single-assignment functional language; the implementation consists of a lexer, parser, interpreter, storage handler, and editing support. The graphical environment is an event-driven nested list viewer/editor providing graphical elements corresponding to the language. These elements are both the representation of a user's program and active interfaces to the values computed by that program.

  1. What language is the language-ready brain ready for?. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    NASA Astrophysics Data System (ADS)

    Croft, William

    2016-03-01

    Arbib's computational comparative neuroprimatology [1] is a welcome model for cognitive linguists, that is, linguists who ground their models of language in human cognition and language use in social interaction. Arbib argues that language emerged via biological and cultural coevolution [1]; linguistic knowledge is represented by constructions, and semantic representations of linguistic constructions are grounded in embodied perceptual-motor schemas (the mirror system hypothesis). My comments offer some refinements from a linguistic point of view.

  2. The Formal Semantics of PVS

    NASA Technical Reports Server (NTRS)

    Owre, Sam; Shankar, Natarajan

    1999-01-01

    A specification language is a medium for expressing what is computed rather than how it is computed. Specification languages share some features with programming languages but are also different in several important ways. For our purpose, a specification language is a logic within which the behavior of computational systems can be formalized. Although a specification can be used to simulate the behavior of such systems, we mainly use specifications to state and prove system properties with mechanical assistance. We present the formal semantics of the specification language of SRI's Prototype Verification System (PVS). This specification language is based on the simply typed lambda calculus. The novelty in PVS is that it contains very expressive language features whose static analysis (e.g., typechecking) requires the assistance of a theorem prover. The formal semantics illuminates several of the design considerations underlying PVS, the interaction between theorem proving and typechecking.

  3. Investigating the Effectiveness of Computer-Assisted Language Learning (CALL) Using Google Documents in Enhancing Writing--A Study on Senior 1 Students in a Chinese Independent High School

    ERIC Educational Resources Information Center

    Ambrose, Regina Maria; Palpanathan, Shanthini

    2017-01-01

    Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…

  4. Programming in HAL/S

    NASA Technical Reports Server (NTRS)

    Ryer, M. J.

    1978-01-01

    HAL/S is a computer programming language; it is a representation for algorithms which can be interpreted by either a person or a computer. HAL/S compilers transform blocks of HAL/S code into machine language which can then be directly executed by a computer. When the machine language is executed, the algorithm specified by the HAL/S code (source) is performed. This document describes how to read and write HAL/S source.

  5. Evaluation of the OpenCL AES Kernel using the Intel FPGA SDK for OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C-based programming language for developing portable code on different platforms such as CPUs, graphics processing units (GPUs), digital signal processors (DSPs), and field-programmable gate arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow in favor of a high-level software development flow. Users can focus on the design of hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. This approach makes FPGA-based development more accessible to software users as the need for hybrid computing using CPUs and FPGAs increases. It can also significantly reduce hardware development time, as users can evaluate different ideas in a high-level language without deep FPGA domain knowledge. In this report, we evaluate the performance of the OpenCL AES kernel using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board. Compared to the M506 module, the board provides more hardware resources for a larger design exploration space. The kernel performance is measured with the compute kernel throughput, an upper bound on the FPGA throughput. The report presents the experimental results in detail. The Appendix lists the kernel source code.

  6. Neuroplasticity-Based Cognitive and Linguistic Skills Training Improves Reading and Writing Skills in College Students

    PubMed Central

    Rogowsky, Beth A.; Papamichalis, Pericles; Villa, Laura; Heim, Sabine; Tallal, Paula

    2013-01-01

    This study reports an evaluation of the effect of computer-based cognitive and linguistic training on college students’ reading and writing skills. The computer-based training included a series of increasingly challenging software programs that were designed to strengthen students’ foundational cognitive skills (memory, attention span, processing speed, and sequencing) in the context of listening and higher-level reading tasks. Twenty-five college students (12 native English speakers; 13 English as a Second Language speakers), who demonstrated poor writing skills, participated in the training group. The training group received daily training during the spring semester (11 weeks) with the Fast ForWord Literacy (FFW-L) and upper levels of the Fast ForWord Reading series (Levels 3–5). The comparison group (n = 28), selected from the general college population, did not receive training. Both the training and comparison groups attended the same university. All students took the Gates-MacGinitie Reading Test (GMRT) and the Oral and Written Language Scales (OWLS) Written Expression Scale at the beginning (Time 1) and end (Time 2) of the spring college semester. Results from this study showed that the training group made a statistically greater improvement from Time 1 to Time 2 in both their reading skills and their writing skills than the comparison group. The group that received training began with statistically lower writing skills before training, but exceeded the writing skills of the comparison group after training. PMID:23533100

  7. The Printout: Computers and Reading in the United Kingdom.

    ERIC Educational Resources Information Center

    Ewing, James M.

    1988-01-01

    Offers an overview of some reading and language arts computer projects in the United Kingdom, including language teaching and intelligent knowledge-based systems, assessment of written style by computer, and desktop publishing in the primary school. (ARH)

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mizell, D.; Carter, S.

    In 1987, ISI's parallel distributed computing research group implemented a prototype sequential simulation system designed for high-level simulation of candidate Strategic Defense Initiative architectures. A main design goal was to produce a simulation system that could incorporate non-trivial, executable representations of battle-management computations on each platform, capable of controlling the actions of that platform throughout the simulation. The term BMA (battle manager abstraction) was used to refer to these simulated battle-management computations. In the authors' first version of the simulator, the BMAs were C++ programs that were written and manually inserted into the system. Since then, the authors have designed and implemented KMAC, a high-level language for writing BMAs. The KMAC preprocessor, built using the Unix tools lex and YACC, translates KMAC source programs into C++ programs and passes them on to the C++ compiler. The KMAC preprocessor was incorporated into and operates under the control of the simulator's interactive user interface. After the KMAC preprocessor has translated a program into C++, the user interface system invokes the C++ compiler and incorporates the resulting object code into the simulator load module for execution as part of a simulation run. This report describes the KMAC language and its preprocessor. Section 2 provides background material on the design of the simulation system that is necessary for understanding some parts of KMAC and some of the reasons it is structured the way it is. Section 3 describes the syntax and semantics of the language, and Section 4 discusses the design of the preprocessor.
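
    KMAC syntax is not shown in the abstract, so the sketch below only illustrates the general idea of a source-to-source preprocessing pass: a tiny, hypothetical rule language is translated into C++ statements. A real preprocessor such as KMAC's lex/YACC front end would build a proper parse tree rather than rely on regular expressions.

```python
# Toy source-to-source preprocessing pass: translate a hypothetical rule
# language into C++ statements.  The grammar below is invented for illustration.
import re

RULE = re.compile(r"when\s+(\w+)\s*([<>=]+)\s*([\w.]+)\s+do\s+(\w+)\((.*)\)")

def translate(line):
    m = RULE.match(line.strip())
    if not m:
        raise SyntaxError(f"cannot parse: {line!r}")
    var, op, value, action, args = m.groups()
    return f"if ({var} {op} {value}) {{ {action}({args}); }}"

source = [
    "when range < 100.0 do launchInterceptor(trackId)",
    "when fuel <= 0.2 do requestHandoff(platformId)",
]
for line in source:
    print(translate(line))
```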

  9. Simulation and Collaborative Learning in Political Science and Sociology Classrooms.

    ERIC Educational Resources Information Center

    Peters, Sandra; Saxon, Deborah

    The program described here used cooperative, content-based computer writing projects to teach Japanese students at an intermediate level of English proficiency enrolled in first-year, English-language courses in political science/environmental issues and sociology/environmental issues in an international college program. The approach was taken to…

  10. Computational Support for Early Elicitation and Classification of Tone

    ERIC Educational Resources Information Center

    Bird, Steven; Lee, Haejoong

    2014-01-01

    Investigating a tone language involves careful transcription of tone on words and phrases. This is challenging when the phonological categories--the tones or melodies--have not been identified. Effects such as coarticulation, sandhi, and phrase-level prosody appear as obstacles to early elicitation and classification of tone. This article presents…

  11. Semantics vs. World Knowledge in Prefrontal Cortex

    ERIC Educational Resources Information Center

    Pylkkanen, Liina; Oliveri, Bridget; Smart, Andrew J.

    2009-01-01

    Humans have knowledge about the properties of their native language at various levels of representation; sound, structure, and meaning computation constitute the core components of any linguistic theory. Although the brain sciences have engaged with representational theories of sound and syntactic structure, the study of the neural bases of…

  12. From Computer Assisted Language Learning (CALL) to Mobile Assisted Language Use (MALU)

    ERIC Educational Resources Information Center

    Jarvis, Huw; Achilleos, Marianna

    2013-01-01

    This article begins by critiquing the long-established acronym CALL (Computer Assisted Language Learning). We then go on to report on a small-scale study which examines how student non-native speakers of English use a range of digital devices beyond the classroom in both their first (L1) and second (L2) languages. We look also at the extent to…

  13. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  14. From computing with numbers to computing with words. From manipulation of measurements to manipulation of perceptions.

    PubMed

    Zadeh, L A

    2001-04-01

    Interest in issues relating to consciousness has grown markedly during the last several years. And yet, nobody can claim that consciousness is a well-understood concept that lends itself to precise analysis. It may be argued that, as a concept, consciousness is much too complex to fit into the conceptual structure of existing theories based on Aristotelian logic and probability theory. An approach suggested in this paper links consciousness to perceptions and perceptions to their descriptors in a natural language. In this way, those aspects of consciousness which relate to reasoning and concept formation are linked to what is referred to as the methodology of computing with words (CW). Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language (e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc.). Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech, and summarizing a story. Underlying this remarkable capability is the brain's crucial ability to manipulate perceptions--perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood, and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions: a theory which may have an important bearing on how humans make--and machines might make--perception-based rational decisions in an environment of imprecision, uncertainty, and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp, whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers that are capable of performing billions of computations per second; we have constructed telescopes that can explore the far reaches of the universe; and we can date the age of rocks that are millions of years old. But alongside the brilliant successes stand conspicuous underachievements and outright failures. We cannot build robots that can move with the agility of animals or humans; we cannot automate driving in heavy traffic; we cannot translate from one language to another at the level of a human interpreter; we cannot create programs that can summarize non-trivial stories; our ability to model the behavior of economic systems leaves much to be desired; and we cannot build machines that can compete with children in the performance of a wide variety of physical and cognitive tasks. It may be argued that underlying the underachievements and failures is the unavailability of a methodology for reasoning and computing with perceptions rather than measurements. 
    An outline of such a methodology--referred to as a computational theory of perceptions--is presented in this paper. The computational theory of perceptions (CTP) is based on the methodology of CW. In CTP, words play the role of labels of perceptions, and, more generally, perceptions are expressed as propositions in a natural language. CW-based techniques are employed to translate propositions expressed in a natural language into what is called the Generalized Constraint Language (GCL). In this language, the meaning of a proposition is expressed as a generalized constraint, X isr R, where X is the constrained variable, R is the constraining relation, and isr is a variable copula in which r is an indexing variable whose value defines the way in which R constrains X. Among the basic types of constraints are possibilistic, veristic, probabilistic, random set, Pawlak set, fuzzy graph, and usuality. The wide variety of constraints in GCL makes GCL a much more expressive language than the language of predicate logic. In CW, the initial and terminal data sets, IDS and TDS, are assumed to consist of propositions expressed in a natural language. These propositions are translated, respectively, into antecedent and consequent constraints. Consequent constraints are derived from antecedent constraints through the use of rules of constraint propagation. The principal constraint propagation rule is the generalized extension principle. (ABSTRACT TRUNCATED)
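
    As a concrete illustration of treating a perception as a fuzzy constraint, the sketch below evaluates the constraint "Height(John) is TALL" with a trapezoidal membership function. The breakpoints are assumptions chosen purely for illustration; they are not taken from the paper.

```python
# Minimal illustration of the idea behind computing with words: a perception
# such as "John is tall" is treated as a fuzzy (possibilistic) constraint
# "Height(John) is TALL", where TALL is a fuzzy set over heights.  The
# trapezoidal breakpoints are assumptions chosen purely for illustration.
def trapezoid(x, a, b, c, d):
    """Membership that rises from a to b, is 1 between b and c, falls to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def tall(height_cm):
    return trapezoid(height_cm, 165, 180, 230, 231)

# Degree to which the constraint "Height(John) is TALL" is satisfied:
for h in (160, 170, 178, 185):
    print(f"height {h} cm -> membership in TALL = {tall(h):.2f}")
```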

  15. Programming languages and compiler design for realistic quantum hardware.

    PubMed

    Chong, Frederic T; Franklin, Diana; Martonosi, Margaret

    2017-09-13

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  16. Programming languages and compiler design for realistic quantum hardware

    NASA Astrophysics Data System (ADS)

    Chong, Frederic T.; Franklin, Diana; Martonosi, Margaret

    2017-09-01

    Quantum computing sits at an important inflection point. For years, high-level algorithms for quantum computers have shown considerable promise, and recent advances in quantum device fabrication offer hope of utility. A gap still exists, however, between the hardware size and reliability requirements of quantum computing algorithms and the physical machines foreseen within the next ten years. To bridge this gap, quantum computers require appropriate software to translate and optimize applications (toolflows) and abstraction layers. Given the stringent resource constraints in quantum computing, information passed between layers of software and implementations will differ markedly from in classical computing. Quantum toolflows must expose more physical details between layers, so the challenge is to find abstractions that expose key details while hiding enough complexity.

  17. The software for automatic creation of the formal grammars used by speech recognition, computer vision, editable text conversion systems, and some new functions

    NASA Astrophysics Data System (ADS)

    Kardava, Irakli; Tadyszak, Krzysztof; Gulua, Nana; Jurga, Stefan

    2017-02-01

    Greater flexibility in environmental perception by artificial intelligence requires supporting software modules that can automate the creation of language-specific syntax and perform further analysis for relevant decisions based on semantic functions. With our proposed approach, it is possible to create pairs of formal rules for given sentences (in the case of natural languages) or statements (in the case of special languages) with the help of computer vision, speech recognition, or editable text conversion systems, for further automatic improvement. In other words, we have developed an approach that can significantly improve the automation of the training process of artificial intelligence, which will in turn give it a higher level of self-development skills, independently of us (the users). On the basis of this approach we have developed a demo version of the software, which includes the algorithm and code implementing all of the above-mentioned components (computer vision, speech recognition, and an editable text conversion system). The program can work in multi-stream mode and simultaneously create a syntax based on information received from several sources.

  18. Conversation Analysis in Computer-Assisted Language Learning

    ERIC Educational Resources Information Center

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  19. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  20. Resource Guide for Persons with Speech or Language Impairments.

    ERIC Educational Resources Information Center

    IBM, Atlanta, GA. National Support Center for Persons with Disabilities.

    The resource guide identifies products which assist speech or language impaired individuals in accessing IBM (International Business Machine) Personal Computers or the IBM Personal System/2 family of products. An introduction provides a general overview of ways computers can help persons with speech or language handicaps. The document then…

  1. Computer-Assisted Analysis of Written Language: Assessing the Written Language of Deaf Children, II.

    ERIC Educational Resources Information Center

    Parkhurst, Barbara G.; MacEachron, Marion P.

    1980-01-01

    Two pilot studies investigated the accuracy of a computer parsing system for analyzing written language of deaf children. Results of the studies showed good agreement between human and machine raters. Journal availability: Elsevier North Holland, Inc., 52 Vanderbilt Avenue, New York, NY 10017. (Author)

  2. Language Use in Asynchronous Computer-Mediated Communication in Taiwan

    ERIC Educational Resources Information Center

    Huang, Daphne Li-jung

    2009-01-01

    This paper describes how Chinese-English bilinguals in Taiwan use their languages in asynchronous computer-mediated communication, specifically, via Bulletin Board System (BBS) and email. The main data includes two types: emails collected from a social network and postings collected from two BBS websites. By examining patterns of language choice…

  3. Collaboration and Computer-Assisted Acquisition of a Second Language.

    ERIC Educational Resources Information Center

    Renie, Delphine; Chanier, Thierry

    1995-01-01

    Discusses how collaborative learning (CL) can be used in a computer-assisted learning (CAL) environment for language learning, reviewing research in the fields of applied linguistics, educational psychology, and artificial intelligence. An application of CL and CAL in the learning of French as a Second Language, focusing on interrogative…

  4. A Study of Multimedia Application-Based Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Shao, Jing

    2012-01-01

    The development of computer-assisted language learning (CALL) has created the opportunity to explore the effects of multimedia applications on foreign language vocabulary acquisition in recent years. This study provides an overview of computer-assisted language learning (CALL) and details one of its developments, multimedia. With the…

  5. Toward a molecular programming language for algorithmic self-assembly

    NASA Astrophysics Data System (ADS)

    Patitz, Matthew John

    Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.
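
    The sketch below illustrates the core glue-matching rule of a Tile Assembly Model style system at temperature 2: a tile may attach only where the summed strength of matching glues on adjacent sides reaches the temperature. The tile types, glue labels, and seed layout are hypothetical and are not taken from the thesis or its software.

```python
# Minimal sketch of an attachment step in an abstract Tile Assembly Model
# style system (temperature-2 cooperative binding).  Tile types and glues
# are hypothetical, chosen only to illustrate the glue-matching rule.
from dataclasses import dataclass

@dataclass(frozen=True)
class TileType:
    name: str
    north: str = ""
    east: str = ""
    south: str = ""
    west: str = ""

GLUE_STRENGTH = {"a": 1, "b": 1}   # strength of each glue label
TEMPERATURE = 2                    # total strength needed to attach

def binding_strength(tile, pos, assembly):
    """Sum glue strengths on sides where the tile matches its neighbours."""
    x, y = pos
    total = 0
    sides = {(x, y + 1): ("north", "south"), (x + 1, y): ("east", "west"),
             (x, y - 1): ("south", "north"), (x - 1, y): ("west", "east")}
    for npos, (mine, theirs) in sides.items():
        neighbour = assembly.get(npos)
        if neighbour is not None:
            g_mine = getattr(tile, mine)
            if g_mine and g_mine == getattr(neighbour, theirs):
                total += GLUE_STRENGTH.get(g_mine, 0)
    return total

def can_attach(tile, pos, assembly):
    return pos not in assembly and binding_strength(tile, pos, assembly) >= TEMPERATURE

assembly = {
    (0, 0): TileType("S", north="a", east="b"),
    (1, 0): TileType("E", west="b", north="a", east="a"),
    (0, 1): TileType("N", south="a", east="a"),
}

# A single strength-1 bond is not enough at temperature 2:
print(can_attach(TileType("D", west="a"), (2, 0), assembly))             # False
# Two cooperating strength-1 bonds (south and west) allow attachment:
print(can_attach(TileType("C", south="a", west="a"), (1, 1), assembly))  # True
```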

  6. START: a system for flexible analysis of hundreds of genomic signal tracks in few lines of SQL-like queries.

    PubMed

    Zhu, Xinjie; Zhang, Qiang; Ho, Eric Dun; Yu, Ken Hung-On; Liu, Chris; Huang, Tim H; Cheng, Alfred Sze-Lok; Kao, Ben; Lo, Eric; Yip, Kevin Y

    2017-09-22

    A genomic signal track is a set of genomic intervals associated with values of various types, such as measurements from high-throughput experiments. Analysis of signal tracks requires complex computational methods, which often make the analysts focus too much on the detailed computational steps rather than on their biological questions. Here we propose Signal Track Query Language (STQL) for simple analysis of signal tracks. It is a Structured Query Language (SQL)-like declarative language, which means one only specifies what computations need to be done but not how these computations are to be carried out. STQL provides a rich set of constructs for manipulating genomic intervals and their values. To run STQL queries, we have developed the Signal Track Analytical Research Tool (START, http://yiplab.cse.cuhk.edu.hk/start/ ), a system that includes a Web-based user interface and a back-end execution system. The user interface helps users select data from our database of around 10,000 commonly-used public signal tracks, manage their own tracks, and construct, store and share STQL queries. The back-end system automatically translates STQL queries into optimized low-level programs and runs them on a computer cluster in parallel. We use STQL to perform 14 representative analytical tasks. By repeating these analyses using bedtools, Galaxy and custom Python scripts, we show that the STQL solution is usually the simplest, and the parallel execution achieves significant speed-up with large data files. Finally, we describe how a biologist with minimal formal training in computer programming self-learned STQL to analyze DNA methylation data we produced from 60 pairs of hepatocellular carcinoma (HCC) samples. Overall, STQL and START provide a generic way for analyzing a large number of genomic signal tracks in parallel easily.
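
    The sketch below is a plain-Python analogue of one representative signal-track task, averaging a signal over the intervals that overlap a region set. It is not STQL syntax, and the intervals and values are made up; STQL would express the same computation declaratively, and START would execute it in parallel over full-size tracks.

```python
# Plain-Python analogue of a representative signal-track query: mean signal
# over intervals overlapping a region set.  Data are made up; not STQL syntax.
signal = [  # (chrom, start, end, value), e.g. from a bigWig-like track
    ("chr1", 100, 200, 0.8),
    ("chr1", 200, 300, 0.2),
    ("chr1", 400, 500, 1.5),
]
regions = [("chr1", 150, 450)]   # e.g. promoters or peaks of interest

def overlaps(a, b):
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

selected = [v for (*iv, v) in signal if any(overlaps(iv, r) for r in regions)]
print(sum(selected) / len(selected) if selected else None)   # mean signal over overlaps
```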

  7. Computational Investigations of Multiword Chunks in Language Learning.

    PubMed

    McCauley, Stewart M; Christiansen, Morten H

    2017-07-01

    Second-language learners rarely arrive at native proficiency in a number of linguistic domains, including morphological and syntactic processing. Previous approaches to understanding the different outcomes of first- versus second-language learning have focused on cognitive and neural factors. In contrast, we explore the possibility that children and adults may rely on different linguistic units throughout the course of language learning, with specific focus on the granularity of those units. Following recent psycholinguistic evidence for the role of multiword chunks in online language processing, we explore the hypothesis that children rely more heavily on multiword units in language learning than do adults learning a second language. To this end, we take an initial step toward using large-scale, corpus-based computational modeling as a tool for exploring the granularity of speakers' linguistic units. Employing a computational model of language learning, the Chunk-Based Learner, we compare the usefulness of chunk-based knowledge in accounting for the speech of second-language learners versus children and adults speaking their first language. Our findings suggest that while multiword units are likely to play a role in second-language learning, adults may learn less useful chunks, rely on them to a lesser extent, and arrive at them through different means than children learning a first language. Copyright © 2017 Cognitive Science Society, Inc.
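
    As a simplified illustration of chunk-based segmentation, the sketch below inserts chunk boundaries wherever the forward transitional probability between adjacent words drops below a threshold. This is not the Chunk-Based Learner used in the article; the toy corpus, the threshold, and the forward-probability heuristic are assumptions for illustration only.

```python
# Simplified chunking by transitional probability over a toy corpus.
# Illustrative only; NOT the Chunk-Based Learner described in the article.
from collections import Counter

corpus = [
    "the little dog chased the ball",
    "a little girl kicked the ball",
    "the big dog wanted the ball",
    "a big cat chased the dog",
]

unigrams, bigrams = Counter(), Counter()
for utterance in corpus:
    words = utterance.split()
    unigrams.update(words)
    bigrams.update(zip(words, words[1:]))

def transitional_probability(w1, w2):
    return bigrams[(w1, w2)] / unigrams[w1]

def chunk(utterance, threshold=0.5):
    """Insert a chunk boundary wherever the forward TP drops below threshold."""
    words = utterance.split()
    chunks, current = [], [words[0]]
    for w1, w2 in zip(words, words[1:]):
        if transitional_probability(w1, w2) >= threshold:
            current.append(w2)
        else:
            chunks.append(current)
            current = [w2]
    chunks.append(current)
    return chunks

print(chunk("the little dog chased the ball"))
# -> [['the'], ['little', 'dog'], ['chased', 'the', 'ball']]
```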

  8. Interactive natural language acquisition in a multi-modal recurrent neural architecture

    NASA Astrophysics Data System (ADS)

    Heinrich, Stefan; Wermter, Stefan

    2018-01-01

    For the complex human brain that enables us to communicate in natural language, we have gathered a good understanding of the principles underlying language acquisition and processing, knowledge about sociocultural conditions, and insights into activity patterns in the brain. However, we have not yet been able to characterise the behavioural and mechanistic properties of natural language, or how mechanisms in the brain allow language to be acquired and processed. Bridging the insights from behavioural psychology and neuroscience, the goal of this paper is to contribute a computational understanding of the characteristics that favour language acquisition. Accordingly, we provide concepts and refinements in cognitive modelling regarding principles and mechanisms in the brain, and propose a neurocognitively plausible model for embodied language acquisition from real-world interaction of a humanoid robot with its environment. In particular, the architecture consists of a continuous-time recurrent neural network for every modality, in which different parts have different leakage characteristics and thus operate on multiple timescales, together with the association of the higher-level nodes of all modalities into cell assemblies. The model is capable of learning language production grounded in both temporal dynamic somatosensation and vision, and features hierarchical concept abstraction, concept decomposition, multi-modal integration, and self-organisation of latent representations.
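
    The core update rule behind such a multiple-timescale network can be sketched generically: units integrate their inputs with a leak determined by a per-unit time constant, so groups with large time constants change slowly. The sizes, time constants, and random weights below are arbitrary; this is not the authors' multi-modal architecture, only the basic update it builds on.

```python
# Generic continuous-time recurrent network update with per-unit time constants.
# Sizes, time constants, and weights are arbitrary (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_fast, n_slow = 8, 4
n = n_fast + n_slow

tau = np.concatenate([np.full(n_fast, 2.0), np.full(n_slow, 20.0)])  # time constants
W = rng.normal(scale=0.3, size=(n, n))        # recurrent weights
W_in = rng.normal(scale=0.5, size=(n, 3))     # input weights (3-dim input)

u = np.zeros(n)                                # internal (pre-activation) states
for t in range(50):
    x = np.array([np.sin(0.3 * t), np.cos(0.1 * t), 1.0])   # toy input
    y = np.tanh(u)                                            # unit activations
    # Leaky integration: slow units (large tau) change their state gradually.
    u = (1.0 - 1.0 / tau) * u + (1.0 / tau) * (W @ y + W_in @ x)

print("fast-unit activations:", np.round(np.tanh(u[:n_fast]), 3))
print("slow-unit activations:", np.round(np.tanh(u[n_fast:]), 3))
```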

  9. Les Industries de la langue: Au confluent de la linguistique et de l'informatique (The Language Utilities: At the Confluence of Linguistics and Computer Science).

    ERIC Educational Resources Information Center

    Bourret, Annie, Ed.; L'Homme, Marie-Claude, Ed.

    A collection of essays addresses aspects of the "Language Utilities," the general term for the area of the conjunction of computer science and linguistics. The following are English translations of the titles of the articles in the collections: "Industrialization of the French Language and Its Maintenance as an Important Language of…

  10. The Adaptation Study of the Questionnaires of the Attitude towards CALL (A-CALL), the Attitude towards CAL (A-CAL), the Attitude towards Foreign Language Learning (A-FLL) to Turkish Language

    ERIC Educational Resources Information Center

    Erdem, Cahit; Saykili, Abdullah; Kocyigit, Mehmet

    2018-01-01

    This study primarily aims to adapt the Foreign Language Learning (FLL), Computer assisted Learning (CAL) and Computer assisted Language Learning (CALL) scales developed by Vandewaetere and Desmet into Turkish context. The instrument consists of three scales which are "the attitude towards CALL questionnaire" ("A-CALL")…

  11. Towards a behavioral-matching based compilation of synthetic biology functions.

    PubMed

    Basso-Blandin, Adrien; Delaplace, Franck

    2015-09-01

    The field of synthetic biology is looking forward to an engineering framework for safely designing reliable de novo biological functions. In this undertaking, computer-aided design (CAD) environments should play a central role in facilitating the design. Although CAD environments are widely used to engineer artificial systems, their application in synthetic biology is still in its infancy. In this article we address the problem of designing a high-level language that lies at the core of a CAD environment. More specifically, the Gubs (Genomic Unified Behavioural Specification) language is a specification language used to describe observations of the expected behaviour. The compiler selects appropriate components such that the observed behaviour of the synthetic biological function resulting from their assembly complies with the programmed behaviour.

  12. Application of Computer Assisted Colposcopy Education

    DTIC Science & Technology

    2001-05-01

    design allowed for less generalizability of findings when compared with a randomized, controlled study. Language, age, and a literacy level of seventh...participants (Bensen et al., 1999; Lewis, 1999). Lewis (1999) noted CAI to be effective for persons across the age continuum. Even patients with low literacy...years of age or older and eligible for military medical care. Additionally, participants had to read at least at a seventh grade level, speak English

  13. Semantic Ambiguity Effects in L2 Word Recognition.

    PubMed

    Ishida, Tomomi

    2018-06-01

    The present study examined the ambiguity effects in second language (L2) word recognition. Previous studies on first language (L1) lexical processing have observed that ambiguous words are recognized faster and more accurately than unambiguous words on lexical decision tasks. In this research, L1 and L2 speakers of English were asked whether a letter string on a computer screen was an English word or not. An ambiguity advantage was found for both groups and greater ambiguity effects were found for the non-native speaker group when compared to the native speaker group. The findings imply that the larger ambiguity advantage for L2 processing is due to their slower response time in producing adequate feedback activation from the semantic level to the orthographic level.

  14. RosettaScripts: a scripting language interface to the Rosetta macromolecular modeling suite.

    PubMed

    Fleishman, Sarel J; Leaver-Fay, Andrew; Corn, Jacob E; Strauch, Eva-Maria; Khare, Sagar D; Koga, Nobuyasu; Ashworth, Justin; Murphy, Paul; Richter, Florian; Lemmon, Gordon; Meiler, Jens; Baker, David

    2011-01-01

    Macromolecular modeling and design are increasingly useful in basic research, biotechnology, and teaching. However, the absence of a user-friendly modeling framework that provides access to a wide range of modeling capabilities is hampering the wider adoption of computational methods by non-experts. RosettaScripts is an XML-like language for specifying modeling tasks in the Rosetta framework. RosettaScripts provides access to protocol-level functionalities, such as rigid-body docking and sequence redesign, and allows fast testing and deployment of complex protocols without need for modifying or recompiling the underlying C++ code. We illustrate these capabilities with RosettaScripts protocols for the stabilization of proteins, the generation of computationally constrained libraries for experimental selection of higher-affinity binding proteins, loop remodeling, small-molecule ligand docking, design of ligand-binding proteins, and specificity redesign in DNA-binding proteins.

  15. HEPMath 1.4: A mathematica package for semi-automatic computations in high energy physics

    NASA Astrophysics Data System (ADS)

    Wiebusch, Martin

    2015-10-01

    This article introduces the Mathematica package HEPMath which provides a number of utilities and algorithms for High Energy Physics computations in Mathematica. Its functionality is similar to packages like FormCalc or FeynCalc, but it takes a more complete and extensible approach to implementing common High Energy Physics notations in the Mathematica language, in particular those related to tensors and index contractions. It also provides a more flexible method for the generation of numerical code which is based on new features for C code generation in Mathematica. In particular it can automatically generate Python extension modules which make the compiled functions callable from Python, thus eliminating the need to write any code in a low-level language like C or Fortran. It also contains seamless interfaces to LHAPDF, FeynArts, and LoopTools.

  16. Computer-assisted instruction and diagnosis of radiographic findings.

    PubMed

    Harper, D; Butler, C; Hodder, R; Allman, R; Woods, J; Riordan, D

    1984-04-01

    Recent advances in computer technology, including high bit-density storage, digital imaging, and the ability to interface microprocessors with videodisk, create enormous opportunities in the field of medical education. This program, utilizing a personal computer, videodisk, BASIC language, a linked textfile system, and a triangulation approach to the interpretation of radiographs developed by Dr. W. L. Thompson, can enable the user to engage in a user-friendly, dynamic teaching program in radiology, applicable to various levels of expertise. Advantages include a relatively more compact and inexpensive system with rapid access and ease of revision which requires little instruction to the user.

  17. The employment of a spoken language computer applied to an air traffic control task.

    NASA Technical Reports Server (NTRS)

    Laveson, J. I.; Silver, C. A.

    1972-01-01

    Assessment of the merits of a limited spoken language (56 words) computer in a simulated air traffic control (ATC) task. An airport zone approximately 60 miles in diameter with a traffic flow simulation ranging from single-engine to commercial jet aircraft provided the workload for the controllers. This research determined that, under the circumstances of the experiments carried out, the use of a spoken-language computer would not improve the controller performance.

  18. Are You Listening to Your Computer?

    ERIC Educational Resources Information Center

    Shugg, Alan

    1992-01-01

    Accepting the great motivational value of computers in second-language learning, this article describes ways to use authentic language recorded on a computer with HyperCard. Graphics, sound, and hardware/software requirements are noted, along with brief descriptions of programing with sound and specific programs. (LB)

  19. Concurrent extensions to the FORTRAN language for parallel programming of computational fluid dynamics algorithms

    NASA Technical Reports Server (NTRS)

    Weeks, Cindy Lou

    1986-01-01

    Experiments were conducted at NASA Ames Research Center to define multi-tasking software requirements for multiple-instruction, multiple-data stream (MIMD) computer architectures. The focus was on specifying solutions for algorithms in the field of computational fluid dynamics (CFD). The program objectives were to allow researchers to produce usable parallel application software as soon as possible after acquiring MIMD computer equipment, to provide researchers with an easy-to-learn and easy-to-use parallel software language which could be implemented on several different MIMD machines, and to enable researchers to list preferred design specifications for future MIMD computer architectures. Analysis of CFD algorithms indicated that extensions of an existing programming language, adaptable to new computer architectures, provided the best solution to meeting program objectives. The CoFORTRAN Language was written in response to these objectives and to provide researchers a means to experiment with parallel software solutions to CFD algorithms on machines with parallel architectures.

  20. L-Py: An L-System Simulation Framework for Modeling Plant Architecture Development Based on a Dynamic Language

    PubMed Central

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data structures (which are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom. PMID:22670147
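
    To make the flavour of the approach concrete, the following minimal sketch implements parallel rewriting of an L-system in plain Python; it illustrates the dynamic-language style that L-Py builds on and is not the L-Py API itself.

        # Minimal L-system rewriting in plain Python (illustrative; not the L-Py API).
        RULES = {"A": "AB", "B": "A"}          # Lindenmayer's classic algae system

        def rewrite(axiom, steps):
            s = axiom
            for _ in range(steps):
                s = "".join(RULES.get(symbol, symbol) for symbol in s)
            return s

        for n in range(6):
            print(n, rewrite("A", n))          # string lengths follow the Fibonacci numbers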

  1. L-py: an L-system simulation framework for modeling plant architecture development based on a dynamic language.

    PubMed

    Boudon, Frédéric; Pradal, Christophe; Cokelaer, Thomas; Prusinkiewicz, Przemyslaw; Godin, Christophe

    2012-01-01

    The study of plant development requires increasingly powerful modeling tools to help understand and simulate the growth and functioning of plants. In the last decade, the formalism of L-systems has emerged as a major paradigm for modeling plant development. Previous implementations of this formalism were based on static languages, i.e., languages that require explicit definition of variable types before using them. These languages are often efficient but involve quite a lot of syntactic overhead, thus restricting the flexibility of use for modelers. In this work, we present an adaptation of L-systems to the Python language, a popular and powerful open-license dynamic language. We show that the use of dynamic language properties makes it possible to enhance the development of plant growth models: (i) by keeping a simple syntax while allowing for high-level programming constructs, (ii) by making code execution easy and avoiding compilation overhead, (iii) by allowing a high level of model reusability and the building of complex modular models, and (iv) by providing powerful solutions to integrate MTG data structures (which are a common way to represent plants at several scales) into L-systems, thus enabling the use of a wide spectrum of computer tools based on MTGs developed for plant architecture. We then illustrate the use of L-Py in real applications to build complex models or to teach plant modeling in the classroom.

  2. A Study of the Programming Languages Used in Information Systems and in Computer Science Curricula

    ERIC Educational Resources Information Center

    Russell, Jack; Russell, Barbara; Pollacia, Lissa F.; Tastle, William J.

    2010-01-01

    This paper researches the computer languages taught in the first, second and third programming courses in Computer Information Systems (CIS), Management Information Systems (MIS or IS) curricula as well as in Computer Science (CS) and Information Technology (IT) curricula. Instructors teaching the first course in programming within a four year…

  3. The impact of computer-based versus "traditional" textbook science instruction on selected student learning outcomes

    NASA Astrophysics Data System (ADS)

    Rothman, Alan H.

    This study reports the results of research designed to examine the impact of computer-based science instruction on elementary school students' science content achievement, their attitudes about science learning, their level of critical thinking-inquiry skills, and their level of cognitive and English language development. The study compared the learning outcomes of a computer-based approach with those of a traditional, textbook-based approach to science instruction. The computer-based approach was embodied in a curriculum titled The Voyage of the Mimi, published by The Bank Street College Project in Science and Mathematics (1984). The study sample included 209 fifth-grade students enrolled in three schools in a suburban school district. This sample was divided into three groups, each receiving one of the following instructional treatments: (a) mixed instruction, based primarily on a hardcopy textbook used in conjunction with computer-based instructional materials as one component of the science course; (b) non-traditional, technology-based instruction, fully utilizing computer-based materials; and (c) traditional, textbook-based instruction, utilizing only the textbook as the basis for instruction. Pre-test (pre-treatment) data related to each of the student learning outcomes were collected at the beginning of the school year, and post-test data were collected at the end of the school year. Statistical analyses used the pre-test data as covariates to account for possible pre-existing differences among the three student groups with regard to the variables examined. The study concluded that non-traditional, computer-based instruction in science significantly improved students' attitudes toward science learning and their level of English language development. Non-significant positive trends were found for overall science achievement and for the development of critical thinking-inquiry skills. These conclusions support the value of a non-traditional, computer-based approach to instruction, as exemplified by The Voyage of the Mimi curriculum, and they support recommendations for reform in science teaching that call for the use of computer technology to enhance learning outcomes and to help reverse what has been perceived as relatively poor science performance by American students, as documented by the 1996 Third International Mathematics and Science Study (TIMSS).

  4. A Randomized Field Trial of the Fast ForWord Language Computer-Based Training Program

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Benson, James G.; Overman, Laura

    2009-01-01

    This article describes an independent assessment of the Fast ForWord Language computer-based training program developed by Scientific Learning Corporation. Previous laboratory research involving children with language-based learning impairments showed strong effects on their abilities to recognize brief and fast sequences of nonspeech and speech…

  5. Language Analysis Package (L.A.P.) Version I System Design.

    ERIC Educational Resources Information Center

    Porch, Ann

    To permit researchers to use the speed and versatility of the computer to process natural language text as well as numerical data without undergoing special training in programing or computer operations, a language analysis package has been developed partially based on several existing programs. An overview of the design is provided and system…

  6. Conversation Analysis of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gonzalez-Lloret, Marta

    2011-01-01

    The potential of computer-mediated communication (CMC) for language learning resides mainly in the possibility that learners have to engage with other speakers of the language, including L1 speakers. The inclusion of CMC in the L2 classroom provides an opportunity for students to utilize authentic language in real interaction, rather than the more…

  7. Using Computer-Mediated Communication (CMC) in Language Teaching

    ERIC Educational Resources Information Center

    Goertler, Senta

    2009-01-01

    This article discusses how new and familiar computer technology tools can be used in a communicative language classroom. It begins by outlining the benefits and challenges of using such technology for language teaching in general, and it describes some sample activities that the author has used. Readers are shown how to implement various computer…

  8. English Loanwords in Spanish Computer Language

    ERIC Educational Resources Information Center

    Cabanillas, Isabel de la Cruz; Martinez, Cristina Tejedor; Prados, Mercedes Diez; Redondo, Esperanza Cerda

    2007-01-01

    Contact with the English language, especially from the 20th century onwards, has had as a consequence an increase in the number of words that are borrowed from English into Spanish. This process is particularly noticeable in Spanish for Specific Purposes, and, more specifically, in the case of Spanish computer language. Although sociocultural and…

  9. Attitudes of Jordanian Undergraduate Students towards Using Computer Assisted Language Learning (CALL)

    ERIC Educational Resources Information Center

    Saeed, Farah Jamal Abed Alrazeq; Al-Zayed, Norma Nawaf

    2018-01-01

    The study aimed at investigating the attitudes of Jordanian undergraduate students towards using computer-assisted language learning (CALL) and its effectiveness in the process of learning the English language. In order to fulfill the study's objective, the researchers used a questionnaire to collect data, followed up with semi-structured…

  10. Cognition, Corpora, and Computing: Triangulating Research in Usage-Based Language Learning

    ERIC Educational Resources Information Center

    Ellis, Nick C.

    2017-01-01

    Usage-based approaches explore how we learn language from our experience of language. Related research thus involves the analysis of the usage from which learners learn and of learner usage as it develops. This program involves considerable data recording, transcription, and analysis, using a variety of corpus and computational techniques, many of…

  11. Evaluating the Motivational Impact of CALL Systems: Current Practices and Future Directions

    ERIC Educational Resources Information Center

    Bodnar, Stephen; Cucchiarini, Catia; Strik, Helmer; van Hout, Roeland

    2016-01-01

    A major aim of computer-assisted language learning (CALL) is to create computer environments that facilitate students' second language (L2) acquisition. To achieve this aim, CALL employs technological innovations to create novel types of language practice. Evaluations of the new practice types serve the important role of distinguishing effective…

  12. Scaling Up and Zooming In: Big Data and Personalization in Language Learning

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    From its earliest days, practitioners of computer-assisted language learning (CALL) have collected data from computer-mediated learning environments. Indeed, that has been a central aspect of the field from the beginning. Usage logs provided valuable insights into how systems were used and how effective they were for language learning. That…

  13. English Language Teachers' Perceptions of Computer-Assisted Language Learning

    ERIC Educational Resources Information Center

    Feng, Yu Lin

    2012-01-01

    A growing number of studies have reported the potential use of computer-assisted language learning (CALL) and other types of technology for ESL and EFL students. So far, most studies on CALL have focused on CALL-classroom comparisons (Chenoweth & Murday, 2003; Chenoweth, Ushida, & Murday, 2007; Fitze, 2006; Neri, Mich, Gerosa, &…

  14. Observations in the Computer Room: L2 Output and Learner Behaviour

    ERIC Educational Resources Information Center

    Leahy, Christine

    2004-01-01

    This article draws on second language theory, particularly output theory as defined by Swain (1995), in order to conceptualise observations made in a computer-assisted language learning setting. It investigates second language output and learner behaviour within an electronic role-play setting, based on a subject-specific problem solving task and…

  15. A Computer Assisted Language Analysis System.

    ERIC Educational Resources Information Center

    Rush, J. E.; And Others

    A description is presented of a computer-assisted language analysis system (CALAS) which can serve as a method for isolating and displaying language utterances found in conversation. The purpose of CALAS is stated as being to deal with the question of whether it is possible to detect, isolate, and display information indicative of what is…

  16. Evidence and Interpretation in Language Learning Research: Opportunities for Collaboration with Computational Linguistics

    ERIC Educational Resources Information Center

    Meurers, Detmar; Dickinson, Markus

    2017-01-01

    This article discusses two types of opportunities for interdisciplinary collaboration between computational linguistics (CL) and language learning research. We target the connection between data and theory in second language (L2) research and highlight opportunities to (a) enrich the options for obtaining data and (b) support the identification…

  17. Knowledge Intensive Programming: A New Educational Computing Environment.

    ERIC Educational Resources Information Center

    Seidman, Robert H.

    1990-01-01

    Comparison of the process of problem solving using a conventional procedural computer programing language (e.g., BASIC, Logo, Pascal), with the process when using a logic programing language (i.e., Prolog), focuses on the potential of the two types of programing languages to facilitate the transfer of problem-solving skills, cognitive development,…

  18. Current Trends in Computer-Based Language Instruction.

    ERIC Educational Resources Information Center

    Hart, Robert S.

    1987-01-01

    A discussion of computer-based language instruction examines the quality of materials currently in use and looks at developments in the field. It is found that language courseware is generally weak in the areas of error analysis and feedback, communicative realism, and convenience of lesson authoring. A review of research under way to improve…

  19. Microcomputer Based Computer-Assisted Learning System: CASTLE.

    ERIC Educational Resources Information Center

    Garraway, R. W. T.

    The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…

  20. Integrating Computer-Assisted Translation Tools into Language Learning

    ERIC Educational Resources Information Center

    Fernández-Parra, María

    2016-01-01

    Although Computer-Assisted Translation (CAT) tools play an important role in the curriculum in many university translator training programmes, they are seldom used in the context of learning a language, as a good command of a language is needed before starting to translate. Since many institutions often have translator-training programmes as well…

  1. A programming language for composable DNA circuits

    PubMed Central

    Phillips, Andrew; Cardelli, Luca

    2009-01-01

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing. PMID:19535415
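
    As a very rough sketch of the kind of system such a language describes, the toy Python model below treats free strands as (toehold, branch-domain) pairs and lets each gate consume a matching input strand and release an output strand; secondary structure and reaction rates, which the real language and simulator handle, are ignored here.

        # Toy model of a strand-displacement cascade (illustrative only): strands are
        # (toehold, domain) pairs; each gate consumes one matching input strand and
        # releases one output strand. No secondary structure, no kinetics.
        from collections import Counter

        gates = Counter({(("t1", "x"), ("t2", "y")): 1,
                         (("t2", "y"), ("t3", "z")): 1})
        strands = Counter({("t1", "x"): 1})            # initial free input strand

        def run(gates, strands):
            fired = True
            while fired:
                fired = False
                for (inp, out), n in list(gates.items()):
                    if n > 0 and strands[inp] > 0:     # toehold and branch domains match
                        gates[(inp, out)] -= 1         # the gate is consumed
                        strands[inp] -= 1              # the input strand is sequestered
                        strands[out] += 1              # the output strand is released
                        fired = True
            return strands

        print(run(gates, strands))                     # ends with the ("t3", "z") strand free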

  2. A programming language for composable DNA circuits.

    PubMed

    Phillips, Andrew; Cardelli, Luca

    2009-08-06

    Recently, a range of information-processing circuits have been implemented in DNA by using strand displacement as their main computational mechanism. Examples include digital logic circuits and catalytic signal amplification circuits that function as efficient molecular detectors. As new paradigms for DNA computation emerge, the development of corresponding languages and tools for these paradigms will help to facilitate the design of DNA circuits and their automatic compilation to nucleotide sequences. We present a programming language for designing and simulating DNA circuits in which strand displacement is the main computational mechanism. The language includes basic elements of sequence domains, toeholds and branch migration, and assumes that strands do not possess any secondary structure. The language is used to model and simulate a variety of circuits, including an entropy-driven catalytic gate, a simple gate motif for synthesizing large-scale circuits and a scheme for implementing an arbitrary system of chemical reactions. The language is a first step towards the design of modelling and simulation tools for DNA strand displacement, which complements the emergence of novel implementation strategies for DNA computing.

  3. How should a speech recognizer work?

    PubMed

    Scharenborg, Odette; Norris, Dennis; Bosch, Louis; McQueen, James M

    2005-11-12

    Although researchers studying human speech recognition (HSR) and automatic speech recognition (ASR) share a common interest in how information processing systems (human or machine) recognize spoken language, there is little communication between the two disciplines. We suggest that this lack of communication follows largely from the fact that research in these related fields has focused on the mechanics of how speech can be recognized. In Marr's (1982) terms, emphasis has been on the algorithmic and implementational levels rather than on the computational level. In this article, we provide a computational-level analysis of the task of speech recognition, which reveals the close parallels between research concerned with HSR and ASR. We illustrate this relation by presenting a new computational model of human spoken-word recognition, built using techniques from the field of ASR that, in contrast to current existing models of HSR, recognizes words from real speech input.

  4. Toward an Understanding of Preservice English as a Foreign Language Teachers' Acceptance of Computer-Assisted Language Learning 2.0 in the People's Republic of China

    ERIC Educational Resources Information Center

    Mei, Bing; Brown, Gavin T. L.; Teo, Timothy

    2018-01-01

    Despite the rapid proliferation of information and communication technologies, there exists a paucity of empirical research on the causes of the current low acceptance of computer-assisted language learning (CALL) by English as a foreign language (EFL) teachers in the People's Republic of China (PRC). This study aims to remedy this situation…

  5. GOAL-to-HAL translation study

    NASA Technical Reports Server (NTRS)

    Flanders, J. H.; Helmers, C. T.; Stanten, S. F.

    1973-01-01

    This report deals with the feasibility, problems, solutions, and mapping of a GOAL language to HAL language translator. Ground Operations Aerospace Language, or GOAL, is a test-oriented higher order language developed by the John F. Kennedy Space Center to be used in checkout and launch of the space shuttle. HAL is a structured higher order language developed by the Johnson Space Center to be used in writing the flight software for the onboard shuttle computers. Since the onboard computers will extensively support ground checkout of the space shuttle, and since these computers and the software development facilities on the ground use the HAL language as baseline, the translation of GOAL to HAL becomes significant. The issue of feasibility was examined and it was found that a GOAL to HAL translator is feasible. Special problems are identified and solutions proposed. Finally, examples of translation are provided for each category of complete GOAL statement.

  6. Culture and biology in the origins of linguistic structure.

    PubMed

    Kirby, Simon

    2017-02-01

    Language is systematically structured at all levels of description, arguably setting it apart from all other instances of communication in nature. In this article, I survey work over the last 20 years that emphasises the contributions of individual learning, cultural transmission, and biological evolution to explaining the structural design features of language. These 3 complex adaptive systems exist in a network of interactions: individual learning biases shape the dynamics of cultural evolution; universal features of linguistic structure arise from this cultural process and form the ultimate linguistic phenotype; the nature of this phenotype affects the fitness landscape for the biological evolution of the language faculty; and in turn this determines individuals' learning bias. Using a combination of computational simulation, laboratory experiments, and comparison with real-world cases of language emergence, I show that linguistic structure emerges as a natural outcome of cultural evolution once certain minimal biological requirements are in place.

  7. Associative programming language and virtual associative access manager

    NASA Technical Reports Server (NTRS)

    Price, C.

    1978-01-01

    APL provides convenient associative data manipulation functions in a high level language. Six statements were added to PL/1 via a preprocessor: CREATE, INSERT, FIND, FOR EACH, REMOVE, and DELETE. They allow complete control of all data base operations. During execution, data base management programs perform the functions required to support the APL language. VAAM is the data base management system designed to support the APL language. APL/VAAM is used by CADANCE, an interactive graphic computer system. VAAM is designed to support heavily referenced files. Virtual memory files, which utilize the paging mechanism of the operating system, are used. VAAM supports a full network data structure. The two basic blocks in a VAAM file are entities and sets. Entities are the basic information elements and correspond to PL/1-based structures defined by the user. Sets contain the relationship information and are implemented as arrays.
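
    The sketch below is a rough Python analogue of the associative operations listed above (CREATE, INSERT, FIND, FOR EACH, REMOVE); it mimics the flavour of the APL statements only and bears no relation to the actual PL/1 preprocessor syntax or the VAAM file structures.

        # Rough analogue of associative set operations (illustrative only).
        class AssociativeSet:
            def __init__(self):                # CREATE a set
                self.members = []

            def insert(self, entity):          # INSERT an entity into the set
                self.members.append(entity)

            def find(self, **criteria):        # FIND entities matching all criteria
                return [e for e in self.members
                        if all(e.get(k) == v for k, v in criteria.items())]

            def remove(self, entity):          # REMOVE an entity from the set
                self.members.remove(entity)

        parts = AssociativeSet()
        parts.insert({"name": "bracket", "material": "aluminium"})
        parts.insert({"name": "panel", "material": "steel"})
        for entity in parts.find(material="steel"):   # FOR EACH over a FIND result
            print(entity["name"])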

  8. Structured Design Language for Computer Programs

    NASA Technical Reports Server (NTRS)

    Pace, Walter H., Jr.

    1986-01-01

    BOX is a design language used at all stages of program development. It was developed to provide improved productivity in designing, coding, and maintaining computer programs. The BOX system is written in FORTRAN 77 for batch execution.

  9. Preserving Tradition through Technology.

    ERIC Educational Resources Information Center

    Wakshul, Barbra

    2001-01-01

    Language is easiest to learn before age 5. The Cherokee Nation supported production of a toy that teaches young children basic Cherokee words. When figures that come with the toy are placed into it, a computer chip activates a voice speaking the name of the figure in Cherokee. Learning takes place on visual, auditory, and tactile levels. (TD)

  10. Personalized Intelligent Mobile Learning System for Supporting Effective English Learning

    ERIC Educational Resources Information Center

    Chen, Chih-Ming; Hsu, Shih-Hsun

    2008-01-01

    Since English has become an international language, how to enhance people's English proficiency through useful computer-assisted learning tools is a critical issue in non-English-speaking countries, because it directly affects a country's overall competitiveness. With the rapid growth of wireless and mobile technologies, the mobile…

  11. Integrating Science and Mathematics Curricula Using Computer Mediated Communications: A Vygotskian Perspective.

    ERIC Educational Resources Information Center

    Charnitski, Christina Wotell; Harvey, Francis A.

    This paper presents the theories of L.S. Vygotsky as a conceptual framework for implementing instruction that supports concept development and promotes higher level thinking skills in students. Three major components (i.e., language, scientific and spontaneous concepts, and the zone of proximal development) of Vygotsky's socio-cultural-historical…

  12. Tapping into the Intellectual Capital at the University

    ERIC Educational Resources Information Center

    Griffith, Mary

    2017-01-01

    Content and Language Integrated Learning (CLIL) is as full of challenges as it is of possibilities. We will explore the challenges while seeking realistic solutions as eight Computer Science professors teach their subjects through English for the first time. We hope to gain insights into the bilingual classroom at the university level where…

  13. Preparing the Faculty. Faculty Development for the Microcomputing Program.

    ERIC Educational Resources Information Center

    Drexel Univ., Philadelphia, PA. Microcomputing Program.

    The preparation of Drexel University faculty for the introduction of a microcomputing program is described. Faculty training had to be done on a variety of levels, from basic training in computer operation for the novice to advanced training in highly technical procedures and languages. Maximum faculty participation was sought throughout the…

  14. Design of a Production System for Cognitive Modeling #1. Technical Report 77-2.

    ERIC Educational Resources Information Center

    Anderson, John R.; Kline, Paul J.

    This report describes several of the design decisions underlying ACT, a production system model of human cognition. ACT can be considered a high level computer programming language as well as a theory of the cognitive mechanisms underlying human information processing. ACT design decisions were based on both psychological and artificial…

  15. Theory Meets Praxis: From Derrida to the Beginning German Classroom via the Internet

    ERIC Educational Resources Information Center

    Hasty, Will

    2006-01-01

    Based on practical experience in a new online beginning German course sequence, the author of this essay argues that contemporary cultural developments associated with the emergence of new technologies, particularly computer-assisted language learning, provide new opportunities to theorize German Studies curricula from the beginning level onward.…

  16. Microcomputers, Software and Foreign Languages for Special Purposes: An Analysis of TXTPRO.

    ERIC Educational Resources Information Center

    Tang, Michael S.

    TXTPRO, a computer program developed as a graduate-level research tool for descriptive linguistic analysis, produces simple alphabetic and word frequency lists, analyzes word combinations, and develops concordances. With modifications, a teacher could enter the program into a mainframe or a microcomputer and use it for text analyses to develop…

  17. Computer vs. Workbook Instruction in Second Language Acquisition.

    ERIC Educational Resources Information Center

    Nagata, Noriko

    1996-01-01

    Compares the effectiveness of Nihongo-CALI (Japanese Computer Assisted Language Instruction) with non-CALI workbook instruction. Findings reveal that given the same grammar notes and exercises, ongoing intelligent computer feedback is more effective than simple workbook answer sheets for developing learners' grammatical skill in producing Japanese…

  18. Conversational Simulation in Computer-Assisted Language Learning: Potential and Reality.

    ERIC Educational Resources Information Center

    Coleman, D. Wells

    1988-01-01

    Addresses the potential of conversational simulations for computer-assisted language learning (CALL) and reasons why this potential is largely untapped. Topics discussed include artificial intelligence; microworlds; parsing; realism versus reality in computer software; intelligent tutoring systems; and criteria to clarify what kinds of CALL…

  19. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    NASA Astrophysics Data System (ADS)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video, and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, live collaboration on notebooks via Google Docs, and better support for languages other than Python.
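
    As a small, concrete example of the conversion capability mentioned above, the sketch below renders a notebook file to HTML with the nbformat and nbconvert libraries; the file name example.ipynb is a placeholder.

        # Convert a notebook document to a standalone HTML page (file name is a placeholder).
        import nbformat
        from nbconvert import HTMLExporter

        nb = nbformat.read("example.ipynb", as_version=4)     # load the notebook document
        body, resources = HTMLExporter().from_notebook_node(nb)

        with open("example.html", "w", encoding="utf-8") as f:
            f.write(body)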

  20. Applying and evaluating computer-animated tutors

    NASA Astrophysics Data System (ADS)

    Massaro, Dominic W.; Bosseler, Alexis; Stone, Patrick S.; Connors, Pamela

    2002-05-01

    We have developed computer-assisted speech and language tutors for deaf, hard of hearing, and autistic children. Our language-training program utilizes our computer-animated talking head, Baldi, as the conversational agent, who guides students through a variety of exercises designed to teach vocabulary and grammar, to improve speech articulation, and to develop linguistic and phonological awareness. Baldi is an accurate three-dimensional animated talking head appropriately aligned with either synthesized or natural speech. Baldi has a tongue and palate, which can be displayed by making his skin transparent. Two specific language-training programs have been evaluated to determine if they improve word learning and speech articulation. The results indicate that the programs are effective in teaching receptive and productive language. Advantages of utilizing a computer-animated agent as a language tutor are the popularity of computers and embodied conversational agents with autistic children, the perpetual availability of the program, and individualized instruction. Students enjoy working with Baldi because he offers extreme patience, he doesn't become angry, tired, or bored, and he is in effect a perpetual teaching machine. The results indicate that the psychology and technology of Baldi hold great promise in language learning and speech therapy. [Work supported by NSF Grant Nos. CDA-9726363 and BCS-9905176 and Public Health Service Grant No. PHS R01 DC00236.]

  1. Acceptability of a Virtual Patient Educator for Hispanic Women.

    PubMed

    Wells, Kristen J; Vàzquez-Otero, Coralia; Bredice, Marissa; Meade, Cathy D; Chaet, Alexis; Rivera, Maria I; Arroyo, Gloria; Proctor, Sara K; Barnes, Laura E

    2015-01-01

    There are few Spanish-language interactive, technology-driven health education programs. The objectives of this feasibility study were to (a) learn more about computer and technology usage among Hispanic women living in a rural community and (b) evaluate the acceptability of the concept of using an embodied conversational agent (ECA) computer application among this population. A survey about computer usage history and interest in computers was administered to a convenience sample of 26 women. A sample video prototype of a hospital discharge ECA was then shown, followed by questions to gauge opinions about the ECA. The data indicate that the women exhibited both a high level of computer experience and enthusiasm for the ECA. Feedback from the community is essential to ensure equity in state-of-the-art dissemination of health information.

  2. A distributed Clips implementation: dClips

    NASA Technical Reports Server (NTRS)

    Li, Y. Philip

    1993-01-01

    A distributed version of the Clips language, dClips, was implemented on top of two existing generic distributed messaging systems to show that (1) it is easy to create a coarse-grained parallel programming environment out of an existing language if a high-level messaging system is used, and (2) the computing model of a parallel programming environment can be changed easily if we change the underlying messaging system. dClips processes were first connected with a simple master-slave model. A client-server model with intercommunicating agents was later implemented. The concept of a service broker is being investigated.
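
    The master-slave arrangement mentioned above can be pictured with a short sketch in Python (not Clips): a master process puts tasks on a queue, slave processes return results on another queue, and a sentinel value tells each slave to stop.

        # Conceptual sketch of master-slave message passing (Python, not dClips).
        from multiprocessing import Process, Queue

        def slave(task_q, result_q):
            while True:
                task = task_q.get()
                if task is None:               # sentinel: no more work
                    break
                result_q.put(("done", task * task))

        if __name__ == "__main__":
            task_q, result_q = Queue(), Queue()
            workers = [Process(target=slave, args=(task_q, result_q)) for _ in range(2)]
            for w in workers:
                w.start()
            for task in range(6):
                task_q.put(task)               # master distributes tasks
            for _ in workers:
                task_q.put(None)               # one sentinel per worker
            for _ in range(6):
                print(result_q.get())          # master collects results
            for w in workers:
                w.join()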

  3. The representation of grammatical categories in the brain.

    PubMed

    Shapiro, Kevin; Caramazza, Alfonso

    2003-05-01

    Language relies on the rule-based combination of words with different grammatical properties, such as nouns and verbs. Yet most research on the problem of word retrieval has focused on the production of concrete nouns, leaving open a crucial question: how is knowledge about different grammatical categories represented in the brain, and what components of the language production system make use of it? Drawing on evidence from neuropsychology, electrophysiology and neuroimaging, we argue that information about a word's grammatical category might be represented independently of its meaning at the levels of word form and morphological computation.

  4. Validation of English and Spanish-language versions of a screening questionnaire for rheumatoid arthritis in an underserved community.

    PubMed

    Potter, Jeffrey; Odutola, Jennifer; Gonzales, Christian Amurrio; Ward, Michael M

    2008-08-01

    Questionnaires to screen for rheumatoid arthritis (RA) have been tested in groups that were primarily well educated and Caucasian. We sought to validate the RA questions of the Connective Tissue Disease Screening Questionnaire (CSQ) in ethnic minorities in an underserved community, and to test a Spanish-language version. The Spanish-language version was developed by 2 native speakers. Consecutive English-speaking or Spanish-speaking patients in a community-based rheumatology practice completed the questionnaire. Diagnoses were confirmed by medical record review. Sensitivity and specificity of the questionnaire for a diagnosis of RA were computed for each language version, using 2 groups as controls: patients with noninflammatory conditions, and participants recruited from the community. The English-language version was tested in 53 patients with RA (79% ethnic minorities; mean education level 11.3 yrs), 85 rheumatology controls with noninflammatory conditions, and 82 community controls. Using 3 positive responses as indicating a positive screening test, the sensitivity of the questionnaire was 0.77, the specificity based on rheumatology controls was 0.45, and the specificity based on community controls was 0.94. The Spanish-language version was tested in 55 patients with RA (mean education level 7.8 yrs), 149 rheumatology controls, and 88 community controls. The sensitivity of the Spanish-language version was 0.87, with specificities of 0.60 and 0.97 using the rheumatology controls and community controls, respectively. The sensitivity of the English-language version of the RA questions of the CSQ was lower in this study than in other cohorts, reflecting differences in the performance of the questions in different ethnic or socioeconomic groups. The Spanish-language version demonstrated good sensitivity, and both had excellent specificity when tested in community controls.
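
    For reference, sensitivity and specificity are simple ratios of the screening counts; the sketch below shows the computation with counts chosen to be consistent with the reported English-language figures rather than taken from the article's raw data.

        # Sensitivity and specificity from screening counts (counts are illustrative).
        def sensitivity(true_pos, false_neg):
            return true_pos / (true_pos + false_neg)   # fraction of RA patients screening positive

        def specificity(true_neg, false_pos):
            return true_neg / (true_neg + false_pos)   # fraction of controls screening negative

        print(round(sensitivity(41, 12), 2))   # e.g. 41 of 53 RA patients positive      -> 0.77
        print(round(specificity(77, 5), 2))    # e.g. 77 of 82 community controls negative -> 0.94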

  5. Computers for symbolic processing

    NASA Technical Reports Server (NTRS)

    Wah, Benjamin W.; Lowrie, Matthew B.; Li, Guo-Jie

    1989-01-01

    A detailed survey on the motivations, design, applications, current status, and limitations of computers designed for symbolic processing is provided. Symbolic processing computations are performed at the word, relation, or meaning levels, and the knowledge used in symbolic applications may be fuzzy, uncertain, indeterminate, and ill represented. Various techniques for knowledge representation and processing are discussed from both the designers' and users' points of view. The design and choice of a suitable language for symbolic processing and the mapping of applications into a software architecture are then considered. The process of refining the application requirements into hardware and software architectures is treated, and state-of-the-art sequential and parallel computers designed for symbolic processing are discussed.

  6. A software methodology for compiling quantum programs

    NASA Astrophysics Data System (ADS)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
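
    The layered compilation the authors describe can be illustrated with a toy lowering pass; the sketch below (not the authors' toolchain) rewrites a higher-level gate into a smaller native set using a fixed rule, here the standard expansion of SWAP into three CNOTs.

        # Toy lowering pass: rewrite higher-level gates into a smaller native set
        # (illustrative only). Gates are (name, *qubits) tuples.
        def lower(circuit):
            native = []
            for op in circuit:
                if op[0] == "SWAP":                   # SWAP(a, b) -> three CNOTs
                    _, a, b = op
                    native += [("CNOT", a, b), ("CNOT", b, a), ("CNOT", a, b)]
                else:
                    native.append(op)
            return native

        program = [("H", 0), ("SWAP", 0, 1), ("MEASURE", 1)]
        print(lower(program))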

  7. Computer Anxiety: A Comparison of Adolescents with and without a History of Specific Language Impairment (SLI)

    ERIC Educational Resources Information Center

    Conti-Ramsden, Gina; Durkin, Kevin; Walker, Allan J.

    2010-01-01

    Individuals who are anxious about computers may be at a disadvantage in their learning. This investigation focused on the use of home computers for educational purposes. It compared computer anxiety in adolescents with and without a history of special needs related to language difficulties. Participants were 55 17-year-olds with specific language…

  8. On the Net: ICT4LT--Information and Communications Technology for Language Teachers

    ERIC Educational Resources Information Center

    LeLoup, Jean W.; Ponterio, Robert

    2004-01-01

    Foreign language (FL) teachers have long been leaders in the use of technology in the classroom, from short wave radio and newspapers, to film strips, to tape recorders, to records, 16 mm films, video, and now computers, as a means of bringing authentic language and culture to their students. Computer and Internet technologies require…

  9. The Impact of Computer-Mediated Communication Environments on Foreign Language Learning: A Review of the Literature

    ERIC Educational Resources Information Center

    Mahdi, Hassan Saleh

    2014-01-01

    This article reviews the literature on the implementation of computer-mediated communication (CMC) in language learning, aiming at understanding how CMC environments have been implemented to foster language learning. The paper draws on 40 recent research articles selected from 10 peer-reviewed journals, 2 book chapters and one conference…

  10. Looking towards the Future of Language Assessment: Usability of Tablet PCs in Language Testing

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus; Magal Royo, Teresa; Bakieva, Margarita

    2016-01-01

    This research addresses how the delivery of the Spanish University Entrance Examination could change in the future. It is widely acknowledged that computer-based tests are very demanding for the institutions that deliver them, which makes computer language testing difficult to implement. However, the use of tablet PCs can facilitate the delivery at even…

  11. Gender, "Discourse," and Technology. Center for Equity and Diversity Working Paper 5.

    ERIC Educational Resources Information Center

    Hanson, Katherine

    This paper identifies and discusses the connections between the way individuals frame their world based on the language they use and the impact of language and stereotyping on the perception that computer technology is primarily for certain individuals. The study explores how some of the dimensions of the language of computers and technology,…

  12. Newsletter for Asian and Middle Eastern Languages on Computer, Volume 1, Numbers 3 & 4.

    ERIC Educational Resources Information Center

    Meadow, Anthony, Ed.

    1986-01-01

    Volume 1, numbers 3 and 4, of the newsletter on the use of non-Western languages with computers contains the following articles: "Reversing the Screen under MS/PC-DOS" (Dan Brink); "Comments on Diacritics Using Wordstar, etc. and CP/M Software for Non-Western Languages" (Michael Broschat); "Carving Tibetan in Silicon: A…

  13. Natural Language Processing in Game Studies Research: An Overview

    ERIC Educational Resources Information Center

    Zagal, Jose P.; Tomuro, Noriko; Shepitsen, Andriy

    2012-01-01

    Natural language processing (NLP) is a field of computer science and linguistics devoted to creating computer systems that use human (natural) language as input and/or output. The authors propose that NLP can also be used for game studies research. In this article, the authors provide an overview of NLP and describe some research possibilities…

  14. Listening Strategy Use and Influential Factors in Web-Based Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Chen, L.; Zhang, R.; Liu, C.

    2014-01-01

    This study investigates second and foreign language (L2) learners' listening strategy use and factors that influence their strategy use in a Web-based computer assisted language learning (CALL) system. A strategy inventory, a factor questionnaire and a standardized listening test were used to collect data from a group of 82 Chinese students…

  15. Your Verbal Zone: An Intelligent Computer-Assisted Language Learning Program in Support of Turkish Learners' Vocabulary Learning

    ERIC Educational Resources Information Center

    Esit, Omer

    2011-01-01

    This study investigated the effectiveness of an intelligent computer-assisted language learning (ICALL) program on Turkish learners' vocabulary learning. Within the scope of this research, an ICALL application with a morphological analyser (Your Verbal Zone, YVZ) was developed and used in an English language preparatory class to measure its…

  16. The Impact of Computer-Based Instruction on the Development of EFL Learners' Writing Skills

    ERIC Educational Resources Information Center

    Zaini, A.; Mazdayasna, G.

    2015-01-01

    The current study investigated the application and effectiveness of computer assisted language learning (CALL) in teaching academic writing to Iranian EFL (English as a Foreign Language) learners by means of Microsoft Word Office. To this end, 44 sophomore intermediate university students majoring in English Language and Literature at an Iranian…

  17. Discourse Functions and Vocabulary Use in English Language Learners' Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Rabab'ah, Ghaleb

    2013-01-01

    This study explores the discourse generated by English as a foreign language (EFL) learners using synchronous computer-mediated communication (CMC) as an approach to help English language learners to create social interaction in the classroom. It investigates the impact of synchronous CMC mode on the quantity of total words, lexical range and…

  18. Effects of Computer Assisted Learning Instructions on Reading Achievement among Middle School English Language Learners

    ERIC Educational Resources Information Center

    Bayley-Hamlet, Simone O.

    2017-01-01

    The purpose of this study was to examine the effect of Imagine Learning, a computer assisted language learning (CALL) program, on addressing reading achievement for English language learners (ELLs). This is a measurement used in the Accessing Comprehension and Communication in English State-to-State (ACCESS for ELLs or ACCESS) reading scale…

  19. Un projet de logiciels d'assistance a l'apprentissage de la lecture en FLE (An Interdisciplinary Research Project Oriented toward Computer Programs for Reading Instruction in French as a Second Language).

    ERIC Educational Resources Information Center

    Challe, Odile; And Others

    1985-01-01

    Describes a French project entitled "Lecticiel," jointly undertaken by specialists in reading, computer programing, and second language instruction to integrate these disciplines and provide assistance for students learning to read French as a foreign language. (MSE)

  20. Introduction to the Atari Computer. A Program Written in the Pilot Programming Language.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    Designed to be an introduction to the Atari microcomputers for beginners, the interactive computer program listed in this document is written in the Pilot programing language. Instructions are given for entering and storing the program in the computer memory for use by students. (MES)

  1. Introduction to the theory of machines and languages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weidhaas, P. P.

    1976-04-01

    This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"

  2. Data management and language enhancement for generalized set theory computer language for operation of large relational databases

    NASA Technical Reports Server (NTRS)

    Finley, Gail T.

    1988-01-01

    This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.

  3. Technology Teaching or Mediated Learning, Part I: Are Computers Skinnerian or Vygotskian?

    ERIC Educational Resources Information Center

    Coufal, Kathy L.

    2002-01-01

    This article highlights the theoretical framework that dominated speech-language pathology prior to the widespread introduction of microcomputers and poses questions regarding the application of computers in assessment and intervention for children with language-learning impairments. It discusses implications of computer use in the context of…

  4. Using Computer-Assisted Instruction to Enhance Achievement of English Language Learners

    ERIC Educational Resources Information Center

    Keengwe, Jared; Hussein, Farhan

    2014-01-01

    Computer-assisted instruction (CAI) in English-language environments offers practice time, motivates students, enhances student learning, increases the authentic materials that students can study, and has the potential to encourage teamwork between students. The findings from this particular study suggested that students who used computer-assisted…

  5. Exploiting loop level parallelism in nonprocedural dataflow programs

    NASA Technical Reports Server (NTRS)

    Gokhale, Maya B.

    1987-01-01

    This work discusses how loop-level parallelism is detected in a nonprocedural dataflow program and how a procedural program with concurrent loops is scheduled. Also discussed is a program restructuring technique which may be applied to recursive equations so that concurrent loops may be generated for a seemingly iterative computation. A compiler which generates C code for the language described below has been implemented. The scheduling component of the compiler and the restructuring transformation are described.
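
    One concrete instance of restructuring a recursive equation so that concurrent loops can be generated is turning the running sum s[i] = s[i-1] + x[i] into a logarithmic-depth prefix computation; the sketch below shows the transformation (executed sequentially for clarity) and is an illustration of the idea rather than the compiler described in the abstract.

        # Restructure the recurrence s[i] = s[i-1] + x[i] into a prefix computation
        # whose inner loop has independent iterations at every stage (and so could
        # run concurrently); shown sequentially here for clarity.
        def prefix_sums(x):
            s = list(x)
            shift = 1
            while shift < len(s):
                # every iteration of this comprehension is independent of the others
                s = [s[i] + (s[i - shift] if i >= shift else 0) for i in range(len(s))]
                shift *= 2
            return s

        print(prefix_sums([1, 2, 3, 4, 5]))   # [1, 3, 6, 10, 15]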

  6. Application Portable Parallel Library

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    The Application Portable Parallel Library (APPL) computer program is a subroutine-based message-passing software library intended to provide a consistent interface to a variety of multiprocessor computers on the market today. It minimizes the effort needed to move an application program from one computer to another: the user develops the application program once and then easily moves it from the parallel computer on which it was created to another parallel computer ("parallel computer" here also includes a heterogeneous collection of networked computers). APPL is written in the C language, with one FORTRAN 77 subroutine, for UNIX-based computers and is callable from application programs written in C or FORTRAN 77.

  7. Universal Entropy of Word Ordering Across Linguistic Families

    PubMed Central

    Montemurro, Marcelo A.; Zanette, Damián H.

    2011-01-01

    Background: The language faculty is probably the most distinctive feature of our species, and endows us with a unique ability to exchange highly structured information. In written language, information is encoded by the concatenation of basic symbols under grammatical and semantic constraints. As is also the case in other natural information carriers, the resulting symbolic sequences show a delicate balance between order and disorder. That balance is determined by the interplay between the diversity of symbols and their specific ordering in the sequences. Here we used entropy to quantify the contribution of different organizational levels to the overall statistical structure of language. Methodology/Principal Findings: We computed a relative entropy measure to quantify the degree of ordering in word sequences from languages belonging to several linguistic families. While a direct estimation of the overall entropy of language yielded values that varied for the different families considered, the relative entropy quantifying word ordering presented an almost constant value for all those families. Conclusions/Significance: Our results indicate that despite the differences in the structure and vocabulary of the languages analyzed, the impact of word ordering in the structure of language is a statistical linguistic universal. PMID:21603637
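
    A much simplified version of the underlying idea can be written down directly: compare the entropy of the bag of words with a conditional (bigram) entropy that is sensitive to word order, the gap reflecting how strongly ordering constrains the sequence. The sketch below illustrates the concept only and is not the estimator used in the article.

        # Simplified illustration: unigram entropy vs. bigram conditional entropy
        # of a word sequence; the difference reflects the contribution of ordering.
        from collections import Counter
        from math import log2

        def unigram_entropy(words):
            counts, n = Counter(words), len(words)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        def bigram_conditional_entropy(words):
            pairs = Counter(zip(words, words[1:]))
            left = Counter(words[:-1])
            n = len(words) - 1
            return -sum((c / n) * log2(c / left[w1]) for (w1, w2), c in pairs.items())

        text = "the cat sat on the mat and the dog sat on the rug".split()
        print(round(unigram_entropy(text), 3))
        print(round(bigram_conditional_entropy(text), 3))   # lower: ordering adds structure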

  8. jInv: A Modular and Scalable Framework for Electromagnetic Inverse Problems

    NASA Astrophysics Data System (ADS)

    Belliveau, P. T.; Haber, E.

    2016-12-01

    Inversion is a key tool in the interpretation of geophysical electromagnetic (EM) data. Three-dimensional (3D) EM inversion is very computationally expensive and practical software for inverting large 3D EM surveys must be able to take advantage of high performance computing (HPC) resources. It has traditionally been difficult to achieve those goals in a high level dynamic programming environment that allows rapid development and testing of new algorithms, which is important in a research setting. With those goals in mind, we have developed jInv, a framework for PDE constrained parameter estimation problems. jInv provides optimization and regularization routines, a framework for user defined forward problems, and interfaces to several direct and iterative solvers for sparse linear systems. The forward modeling framework provides finite volume discretizations of differential operators on rectangular tensor product meshes and tetrahedral unstructured meshes that can be used to easily construct forward modeling and sensitivity routines for forward problems described by partial differential equations. jInv is written in the emerging programming language Julia. Julia is a dynamic language targeted at the computational science community with a focus on high performance and native support for parallel programming. We have developed frequency and time-domain EM forward modeling and sensitivity routines for jInv. We will illustrate its capabilities and performance with two synthetic time-domain EM inversion examples. First, in airborne surveys, which use many sources, we achieve distributed memory parallelism by decoupling the forward and inverse meshes and performing forward modeling for each source on small, locally refined meshes. Secondly, we invert grounded source time-domain data from a gradient array style induced polarization survey using a novel time-stepping technique that allows us to compute data from different time-steps in parallel. These examples both show that it is possible to invert large scale 3D time-domain EM datasets within a modular, extensible framework written in a high-level, easy to use programming language.
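
    The core numerical task in such a framework is regularized parameter estimation against a forward model. As a conceptual sketch only (Python/NumPy here, not the jInv Julia API), the snippet below solves a Tikhonov-regularized least-squares problem for a toy linear forward operator.

        # Conceptual sketch: minimize ||A m - d||^2 + beta ||m||^2 for a toy linear
        # forward operator A (Python/NumPy; not the jInv API).
        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((50, 20))                  # toy forward operator
        m_true = rng.standard_normal(20)
        d = A @ m_true + 0.01 * rng.standard_normal(50)    # noisy synthetic data

        beta = 1e-2                                        # regularization weight
        m_est = np.linalg.solve(A.T @ A + beta * np.eye(20), A.T @ d)
        print(np.linalg.norm(m_est - m_true))              # small misfit on this toy problem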

  9. Role of PROLOG (Programming and Logic) in natural-language processing. Report for September-December 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McHale, M.L.

    The field of artificial intelligence strives to produce computer programs that exhibit intelligent behavior. One of the areas of interest is the processing of natural language. This report discusses the role of the computer language PROLOG in Natural Language Processing (NLP) from both theoretical and pragmatic viewpoints. The reasons for using PROLOG for NLP are numerous. First, linguists can write natural-language grammars almost directly as PROLOG programs; this allows fast prototyping of NLP systems and facilitates analysis of NLP theories. Second, semantic representations of natural-language texts that use logic formalisms are readily produced in PROLOG because of PROLOG's logical foundations. Third, PROLOG's built-in inferencing mechanisms are often sufficient for inferences on the logical forms produced by NLPs. Fourth, the logical, declarative nature of PROLOG may make it the language of choice for parallel computing systems. Finally, the fact that PROLOG has a de facto standard (Edinburgh) makes the porting of code from one computer system to another virtually trouble free. Perhaps the strongest tie one could make between NLP and PROLOG was stated by John Stuart Mill in his inaugural address at St. Andrews: The structure of every sentence is a lesson in logic.
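
    The point that grammar rules can be written almost directly as clauses can be illustrated in Python as well; the toy parser below treats a grammar as a table of expansions and accepts a sentence when some expansion of S consumes all of its words. It is an illustration of the idea, not a PROLOG definite clause grammar.

        # Grammar rules as data: each category lists the sequences it can expand to.
        GRAMMAR = {
            "S":  [["NP", "VP"]],
            "NP": [["the", "cat"], ["the", "dog"]],
            "VP": [["sleeps"], ["chases", "NP"]],
        }

        def parse(category, words):
            """Return the leftover words for every way `category` covers a prefix of `words`."""
            if category not in GRAMMAR:                      # terminal symbol
                return [words[1:]] if words and words[0] == category else []
            results = []
            for expansion in GRAMMAR[category]:
                partials = [words]
                for symbol in expansion:
                    partials = [rest for p in partials for rest in parse(symbol, p)]
                results.extend(partials)
            return results

        print([] in parse("S", "the cat chases the dog".split()))   # True: sentence accepted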

  10. Soviet Cybernetics Review. Volume 2, Number 5,

    DTIC Science & Technology

    prize; Aeroflot’s sirena system turned on; Computer system controls 2500 construction sites; Automation of aircraft languages; Diagnosis by teletype; ALGEM-1 and ALGEM-2 languages; Nuclear institute’s computer facilities.

  11. Quantum error correction in crossbar architectures

    NASA Astrophysics Data System (ADS)

    Helsen, Jonas; Steudtner, Mark; Veldhorst, Menno; Wehner, Stephanie

    2018-07-01

    A central challenge for the scaling of quantum computing systems is the need to control all qubits in the system without a large overhead. A solution for this problem in classical computing comes in the form of so-called crossbar architectures. Recently we made a proposal for a large-scale quantum processor (Li et al arXiv:1711.03807 (2017)) to be implemented in silicon quantum dots. This system features a crossbar control architecture which limits parallel single-qubit control, but allows the scheme to overcome control scaling issues that form a major hurdle to large-scale quantum computing systems. In this work, we develop a language that makes it possible to easily map quantum circuits to crossbar systems, taking into account their architecture and control limitations. Using this language we show how to map well known quantum error correction codes such as the planar surface and color codes in this limited control setting with only a small overhead in time. We analyze the logical error behavior of this surface code mapping for estimated experimental parameters of the crossbar system and conclude that logical error suppression to a level useful for real quantum computation is feasible.

  12. Automating quantum experiment control

    NASA Astrophysics Data System (ADS)

    Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.

    2017-03-01

    The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
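
    The compile-time routing step the abstract describes amounts to path-finding on the trap's zone-connectivity graph followed by emission of low-level transport instructions. The Python sketch below is a hypothetical illustration of that step; the zone names, trap layout and MOVE instruction format are invented, not the authors' instruction language.

      # Hedged sketch of ion routing: shortest path between trap zones on a
      # connectivity graph, then a list of "move" instructions along the path.
      from collections import deque

      def route(adjacency, start, goal):
          """Breadth-first shortest path between trap zones."""
          prev, frontier = {start: None}, deque([start])
          while frontier:
              zone = frontier.popleft()
              if zone == goal:
                  path = []
                  while zone is not None:
                      path.append(zone)
                      zone = prev[zone]
                  return path[::-1]
              for nxt in adjacency[zone]:
                  if nxt not in prev:
                      prev[nxt] = zone
                      frontier.append(nxt)
          raise ValueError("no path between zones")

      # Toy multi-zone trap: a junction 'X' connecting a loading zone and two gate zones.
      trap = {"load": ["X"], "X": ["load", "gate1", "gate2"],
              "gate1": ["X"], "gate2": ["X"]}
      path = route(trap, "load", "gate2")
      program = [f"MOVE {a} -> {b}" for a, b in zip(path, path[1:])]
      print(program)   # ['MOVE load -> X', 'MOVE X -> gate2']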

  13. Current implementation and future plans on new code architecture, programming language and user interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, B.

    1997-07-01

    Computer technology has improved tremendously in recent years, with larger media capacity, more memory and more computational power. Visual computing, with high-performance graphical interfaces and desktop computational power, has changed the way engineers carry out everyday tasks, development work and safety study analyses. The emergence of parallel computing will permit simulation over larger domains. In addition, new development methods, languages and tools have appeared in the last several years.

  14. The Usage Evaluation of Official Computer Terms in Bahasa Indonesia in Indonesian Government Official Websites

    NASA Astrophysics Data System (ADS)

    Amalia, A.; Gunawan, D.; Lydia, M. S.; Charlie, C.

    2017-03-01

    According to Undang-Undang Dasar Republik Indonesia 1945 Pasal 36, Bahasa Indonesia is the national language of Indonesia. This means Bahasa Indonesia must be used as the official language at all levels, from government to education, as well as in the development of science and technology. The Government of the Republic of Indonesia, as the highest formal authority, must use official Bahasa Indonesia in its activities, including its official websites. The government therefore issued an instruction, Instruksi Presiden (Inpres) No. 2 Tahun 2001, to govern the usage of official computer terms in Bahasa Indonesia. The purpose of this research is to evaluate the usage of official computer terms in Bahasa Indonesia compared with the equivalent computer terms in English. The data are obtained from Indonesian government official websites. The method consists of data gathering, template detection, string extraction and data analysis. The evaluation of official computer terms in Bahasa Indonesia falls into three categories: good, moderate and poor. A total of 281 websites fall into the good category, 512 into the moderate category and 290 into the poor category. The authorized institution may use these results as additional information to evaluate the implementation of official information technology terms in Bahasa Indonesia.
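
    The scoring step of such an evaluation can be pictured as counting official Indonesian terms against their English counterparts in page text and bucketing the ratio. The Python sketch below is purely illustrative: the term pairs and thresholds are examples, not the study's actual lists or criteria.

      # Illustrative scoring step only: count official Indonesian computer terms
      # versus their English counterparts and bucket the result.  The term pairs
      # and thresholds are invented examples.
      import re

      TERM_PAIRS = {"unduh": "download", "unggah": "upload",
                    "surel": "email", "tetikus": "mouse"}

      def evaluate(page_text):
          words = re.findall(r"[a-z]+", page_text.lower())
          official = sum(words.count(t) for t in TERM_PAIRS)
          english = sum(words.count(t) for t in TERM_PAIRS.values())
          total = official + english
          if total == 0:
              return "no computer terms found"
          ratio = official / total
          if ratio >= 0.7:
              return "good"
          elif ratio >= 0.4:
              return "moderate"
          return "poor"

      print(evaluate("Silakan unduh formulir atau kirim lewat surel."))  # good
      print(evaluate("Please download the form and send it by email."))  # poor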

  15. Tolerant (parallel) Programming

    NASA Technical Reports Server (NTRS)

    DiNucci, David C.; Bailey, David H. (Technical Monitor)

    1997-01-01

    In order to be truly portable, a program must be tolerant of a wide range of development and execution environments, and a parallel program is just one which must be tolerant of a very wide range. This paper first defines the term "tolerant programming", then describes many layers of tools to accomplish it. The primary focus is on F-Nets, a formal model for expressing computation as a folded partial-ordering of operations, thereby providing an architecture-independent expression of tolerant parallel algorithms. For implementing F-Nets, Cooperative Data Sharing (CDS) is a subroutine package for implementing communication efficiently in a large number of environments (e.g. shared memory and message passing). Software Cabling (SC), a very-high-level graphical programming language for building large F-Nets, possesses many of the features normally expected from today's computer languages (e.g. data abstraction, array operations). Finally, L2³ is a CASE tool which facilitates the construction, compilation, execution, and debugging of SC programs.
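
    The core of the F-Net idea, expressing a computation as a partial ordering of operations so that anything unordered may run concurrently, can be sketched independently of CDS or SC. In the hypothetical Python fragment below the dependency graph, thread pool and function names are illustrative only, and an acyclic dependency graph is assumed.

      # Sketch of executing operations under a partial order: an operation runs
      # as soon as all of its predecessors have completed, so unordered
      # operations may run concurrently.  Assumes an acyclic dependency graph.
      from concurrent.futures import ThreadPoolExecutor

      def run_partial_order(operations, depends_on):
          done, results = set(), {}
          with ThreadPoolExecutor() as pool:
              while len(done) < len(operations):
                  ready = [op for op in operations
                           if op not in done and depends_on[op] <= done]
                  futures = {op: pool.submit(operations[op]) for op in ready}
                  for op, fut in futures.items():
                      results[op] = fut.result()
                      done.add(op)
          return results

      ops = {"a": lambda: 1, "b": lambda: 2, "c": lambda: 3}
      deps = {"a": set(), "b": set(), "c": {"a", "b"}}   # c waits on a and b
      print(run_partial_order(ops, deps))   # {'a': 1, 'b': 2, 'c': 3}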

  16. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    PubMed

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
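
    At the level of biochemical reactions, an SBML model is an XML document listing species and reactions. The sketch below builds a heavily simplified SBML-like document with Python's standard library; real applications would use libSBML and the full schema (levels, versions, units, compartments and so on), and the reaction shown is made up.

      # Minimal SBML-style model serialized with the standard library; a real
      # tool chain would use libSBML and the complete SBML schema.
      import xml.etree.ElementTree as ET

      sbml = ET.Element("sbml", level="2", version="1")
      model = ET.SubElement(sbml, "model", id="toy_model")

      species_list = ET.SubElement(model, "listOfSpecies")
      for sid in ("S", "P"):
          ET.SubElement(species_list, "species", id=sid, compartment="cell")

      reactions = ET.SubElement(model, "listOfReactions")
      rxn = ET.SubElement(reactions, "reaction", id="conversion")
      ET.SubElement(ET.SubElement(rxn, "listOfReactants"),
                    "speciesReference", species="S")
      ET.SubElement(ET.SubElement(rxn, "listOfProducts"),
                    "speciesReference", species="P")

      print(ET.tostring(sbml, encoding="unicode"))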

  17. Language Processing as Cue Integration: Grounding the Psychology of Language in Perception and Neurophysiology

    PubMed Central

    Martin, Andrea E.

    2016-01-01

    I argue that cue integration, a psychophysiological mechanism from vision and multisensory perception, offers a computational linking hypothesis between psycholinguistic theory and neurobiological models of language. I propose that this mechanism, which incorporates probabilistic estimates of a cue's reliability, might function in language processing from the perception of a phoneme to the comprehension of a phrase structure. I briefly consider the implications of the cue integration hypothesis for an integrated theory of language that includes acquisition, production, dialogue and bilingualism, while grounding the hypothesis in canonical neural computation. PMID:26909051
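
    The probabilistic core of cue integration is the textbook reliability-weighted (inverse-variance) combination of estimates. The short Python example below works through those numbers for three hypothetical cues; it is not a model of any specific linguistic task in the article.

      # Reliability-weighted cue combination: weight each cue by the inverse of
      # its variance, so less reliable cues contribute less to the estimate.
      import numpy as np

      estimates = np.array([2.0, 3.0, 2.5])      # three cues to one quantity
      variances = np.array([0.5, 2.0, 1.0])      # larger variance = less reliable

      weights = (1.0 / variances) / np.sum(1.0 / variances)
      combined = np.sum(weights * estimates)
      combined_variance = 1.0 / np.sum(1.0 / variances)

      print(weights.round(3))             # [0.571 0.143 0.286]
      print(round(combined, 3))           # 2.286
      print(round(combined_variance, 3))  # 0.286 (lower than any single cue)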

  18. The evolution of the Faculty of Language from a Chomskyan perspective: bridging linguistics and biology.

    PubMed

    Longa, Victor Manuel

    2013-01-01

    While language was traditionally considered a purely cultural trait, the advent of Noam Chomsky's Generative Grammar in the second half of the twentieth century dramatically challenged that view. According to that theory, language is an innate feature, part of the human biological endowment. If language is indeed innate, it had to evolve biologically. This review has two main objectives: firstly, it characterizes from a Chomskyan perspective the evolutionary processes by which language could have come into being. Secondly, it proposes a new method for interpreting the archaeological record that differs radically from the usual types of evidence paleoanthropology has concentrated on when dealing with language evolution: while archaeological remains have usually been regarded in terms of the behavior they could be associated with, the paper considers them in terms of the computational processes and capabilities at work in their production. This computational approach, illustrated with a computational analysis of prehistoric geometric engravings, is used to challenge the usual generative thinking on language evolution, which is based on the high specificity of language. The paper argues that the biological machinery of language is neither specifically linguistic nor specifically human, although language itself can still be considered a species-specific innate trait. From such a view, language would be one of the consequences of a slight modification of an ancestral architecture shared with vertebrates.

  19. Turing's Man, Turing's Woman, or Turing's Person?: Gender, Language, and Computers. Working Paper No. 166.

    ERIC Educational Resources Information Center

    Rothschild, Joan

    This essay compares two recent books on computer technology in terms of their usage of gendered or gender-free language. The two books examined are "Turing's Man: Western Culture in the Computer Age" by J. David Bolter and "The Second Self: Computers and the Human Spirit" by Sherry Turkle. It is argued that the two authors' gender differences in…

  20. Development of a KSC test and flight engineering oriented computer language, Phase 1

    NASA Technical Reports Server (NTRS)

    Case, C. W.; Kinney, E. L.; Gyure, J.

    1970-01-01

    Ten primarily test-oriented computer languages reviewed during the phase 1 study effort are described. Fifty characteristics of ATOLL, ATLAS, and CLASP are compared. Unique characteristics of the other languages, including deficiencies, problems, safeguards, and checking provisions, are identified. Programming aids related to these languages are reported, and the conclusions resulting from this phase of the study are discussed. A glossary and bibliography are included. For the reports on phase 2 of the study, see N71-35027 and N71-35029.
