ERIC Educational Resources Information Center
Novak, Gordon S., Jr.
GLISP is a LISP-based language which provides high-level language features not found in ordinary LISP. The GLISP language is implemented by means of a compiler which accepts GLISP as input and produces ordinary LISP as output. This output can be further compiled to machine code by the LISP compiler. GLISP is available for several LISP dialects,…
Khomtchouk, Bohdan B; Weitz, Edmund; Karp, Peter D; Wahlestedt, Claes
2018-05-01
We present a rationale for expanding the presence of the Lisp family of programming languages in bioinformatics and computational biology research. Put simply, Lisp-family languages enable programmers to more quickly write programs that run faster than in other languages. Languages such as Common Lisp, Scheme and Clojure facilitate the creation of powerful and flexible software that is required for complex and rapidly evolving domains like biology. We will point out several important key features that distinguish languages of the Lisp family from other programming languages, and we will explain how these features can aid researchers in becoming more productive and creating better code. We will also show how these features make these languages ideal tools for artificial intelligence and machine learning applications. We will specifically stress the advantages of domain-specific languages (DSLs): languages that are specialized to a particular area, and thus not only facilitate easier research problem formulation, but also aid in the establishment of standards and best programming practices as applied to the specific research field at hand. DSLs are particularly easy to build in Common Lisp, the most comprehensive Lisp dialect, which is commonly referred to as the 'programmable programming language'. We are convinced that Lisp grants programmers unprecedented power to build increasingly sophisticated artificial intelligence systems that may ultimately transform machine learning and artificial intelligence research in bioinformatics and computational biology.
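To make the DSL claim above concrete, here is a minimal Common Lisp sketch (not taken from the paper) of a macro-based micro-DSL; the macro DEFINE-MOTIF and the motif predicate below are hypothetical names invented for illustration.

;;; A minimal sketch (not from the paper) of a Common Lisp macro-based DSL.
;;; The macro DEFINE-MOTIF and the motif names below are hypothetical.

(defmacro define-motif (name &rest bases)
  "Define NAME as a predicate that tests whether a sequence string
starts with the given BASES (each a one-character string)."
  `(defun ,name (sequence)
     (let ((pattern ,(apply #'concatenate 'string bases)))
       (and (>= (length sequence) (length pattern))
            (string= pattern sequence :end2 (length pattern))))))

;; The DSL reads like a declaration rather than string-matching code.
(define-motif start-codon-p "A" "T" "G")

;; Usage:
;; (start-codon-p "ATGCCT")  => T
;; (start-codon-p "GATTACA") => NIL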
An engineering approach to automatic programming
NASA Technical Reports Server (NTRS)
Rubin, Stuart H.
1990-01-01
An exploratory study of the automatic generation and optimization of symbolic programs was undertaken using DECOM, a prototypical requirement specification model implemented in pure LISP. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes, since data and programs are represented in a uniform format.
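The remark that data and programs share a uniform format can be illustrated with a small Common Lisp sketch (this is not DECOM itself; STRENGTHEN-BOUNDS is a hypothetical toy transformation).

;;; Sketch (not DECOM itself): because Lisp programs are lists, a
;;; "program transformation" is ordinary list processing.

(defun strengthen-bounds (form)
  "Walk FORM and replace every (< a b) test with (<= a b).
A toy stand-in for the kind of formal transformation described above."
  (cond ((not (consp form)) form)
        ((eq (first form) '<) (cons '<= (mapcar #'strengthen-bounds (rest form))))
        (t (mapcar #'strengthen-bounds form))))

;; The transformed list is itself an executable program:
;; (strengthen-bounds '(if (< x 10) (go-on) (stop)))
;;   => (IF (<= X 10) (GO-ON) (STOP))
;; (eval (strengthen-bounds '(< 3 3)))  => T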
NASA Technical Reports Server (NTRS)
Mathur, F. P.
1972-01-01
Several common higher level program languages are described. FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 are summarized and compared. FORTRAN is the most widely used scientific programming language. ALGOL is a more powerful language for scientific programming. COBOL is used for most commercial programming applications. LISP 1.5 is primarily a list-processing language. PL/1 attempts to combine the desirable features of FORTRAN, ALGOL, and COBOL into a single language.
NASA Astrophysics Data System (ADS)
Alshakova, E. L.
2017-01-01
A program in the AutoLISP language makes it possible to generate parametric drawings automatically while working in the AutoCAD software product. Students study the development of AutoLISP programs with a methodical complex containing methodical instructions in which real examples of creating images and drawings are worked through. The methodical instructions contain the reference information needed to perform the proposed tasks. Training in AutoLISP programming is based on the method of step-by-step program development: the program draws the elements of a detail drawing by means of a specially created function whose argument values are supplied in the same sequence in which AutoCAD issues its prompts when the corresponding command is executed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and of the sequence of their calls. The author considers the development of AutoLISP programs for creating parametric drawings of details of a given design, where the user enters the dimensions of the details' elements. These programs generate variants of the tasks for the graphic works performed in the educational process of the "Engineering graphics" and "Engineering and computer graphics" disciplines. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.
A Programming Language Supporting First-Class Parallel Environments
1989-01-01
Symmetric Lisp later in the thesis. 1.5.1.2 Procedures as Data - Comparison with Lisp Classical Lisp[48, 54] has been altered and extended in many ways... management problems. A resource manager controls access to one or more resources shared by concurrently executing processes. Database transaction systems...symmetric languages are related to languages based on more classical models? 3. What are the kinds of uniformity that the symmetric model supports and what
Languages for artificial intelligence: Implementing a scheduler in LISP and in Ada
NASA Technical Reports Server (NTRS)
Hays, Dan
1988-01-01
A prototype scheduler for space experiments, originally programmed in a dialect of LISP using some of the more traditional techniques of that language, was recast using an object-oriented LISP: Common LISP with Flavors on the Symbolics. This object-structured version was in turn partially implemented in Ada. The Flavors version showed a decided improvement in both speed of execution and readability of code. The recasting into Ada involved various practical problems of implementation as well as certain challenges of reconceptualization in going from one language to the other. Advantages were realized, however, in greater clarity of the code, especially where more standard flow of control was used. This exercise raised issues about the influence of programming language on the design of flexible and sensitive programs such as schedule planners, and called attention to the importance of factors external to the languages themselves, such as system embeddedness, hardware context, and programmer practice.
Programmable Applications: Interpreter Meets Interface
1991-10-01
ics program written for professional architects and designers, and including a huge library of files written in AutoLisp, a "design-enriched" Lisp... AutoLisp procedures). The choice of Lisp as a base language is a happy one for AutoCAD; the application has clearly benefitted from the contribution
A Natural Language Interface to Databases
NASA Technical Reports Server (NTRS)
Ford, D. R.
1990-01-01
The development of a Natural Language Interface (NLI) is presented which is semantic-based and uses Conceptual Dependency representation. The system was developed using Lisp and currently runs on a Symbolics Lisp machine.
Benchmark Lisp And Ada Programs
NASA Technical Reports Server (NTRS)
Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.
1992-01-01
Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing efficiency of computer processing via Lisp vs. Ada; comparing efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests efficiency with which computer executes routines in each language. Available for any computer equipped with validated Ada compiler and/or Common Lisp system.
A Computer Language at the Crossroads: Logo.
ERIC Educational Resources Information Center
Thornburg, David D.
1986-01-01
Reviews Logo programming language's developmental history, including Papert's vision, creation of LISP, and evolution of Logo from LISP; discusses reasons for Logo not becoming a commonplace programming language; describes Logo program design and its utility for serious programmers; and lists sources of further information on Logo. (MBR)
LLOGO: An Implementation of LOGO in LISP. Artificial Intelligence Memo Number 307.
ERIC Educational Resources Information Center
Goldstein, Ira; And Others
LISP LOGO is a computer language invented for the beginning student of man-machine interaction. The language has the advantages of simplicity and naturalness as well as that of emphasizing the difference between programs and data. The language is based on the LOGO language and uses mnemonic syllables as commands. It can be used in conjunction with…
Knowledge, programming, and programming cultures: LISP, C, and Ada
NASA Technical Reports Server (NTRS)
Rochowiak, Daniel
1990-01-01
The results of research 'Ada as an implementation language for knowledge based systems' are presented. The purpose of the research was to compare Ada to other programming languages. The report focuses on the programming languages Ada, C, and Lisp, the programming cultures that surround them, and the programming paradigms they support.
LISP as an Environment for Software Design: Powerful and Perspicuous
Blum, Robert L.; Walker, Michael G.
1986-01-01
The LISP language provides a useful set of features for prototyping knowledge-intensive, clinical applications software that is not found in most other programming environments. Medical computer programs that need large medical knowledge bases, such as programs for diagnosis, therapeutic consultation, education, simulation, and peer review, are hard to design, evolve continually, and often require major revisions. They necessitate an efficient and flexible program development environment. The LISP language and programming environments built around it are well suited for program prototyping. The lingua franca of artificial intelligence researchers, LISP facilitates building complex systems because it is simple yet powerful. Because of its simplicity, LISP programs can read, execute, modify and even compose other LISP programs at run time. Hence, it has been easy for system developers to create programming tools that greatly speed the program development process, and that may be easily extended by users. This has resulted in the creation of many useful graphical interfaces, editors, and debuggers, which facilitate the development of knowledge-intensive medical applications.
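A brief Common Lisp sketch of the claim above that LISP programs can read, execute, modify, and compose other LISP programs at run time; the clinical predicate names are hypothetical and only illustrative.

;;; Sketch of the "programs as data" property mentioned above.

;; 1. Read: turn program text into a Lisp data structure.
(defparameter *rule-source* "(defun high-fever-p (temp) (> temp 39.5))")
(defparameter *rule-form* (read-from-string *rule-source*))

;; 2. Execute: install and call the program that was just read.
(eval *rule-form*)
;; (high-fever-p 40.1) => T

;; 3. Compose: build a new program out of existing forms.
(defun make-conjunction (name predicates)
  "Return a DEFUN form whose body is the AND of calls to PREDICATES."
  `(defun ,name (x) (and ,@(mapcar (lambda (p) `(,p x)) predicates))))

;; (eval (make-conjunction 'urgent-case-p '(high-fever-p)))
;; defines URGENT-CASE-P at run time.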
Artificial intelligence programming languages for computer aided manufacturing
NASA Technical Reports Server (NTRS)
Rieger, C.; Samet, H.; Rosenberg, J.
1979-01-01
Eight Artificial Intelligence programming languages (SAIL, LISP, MICROPLANNER, CONNIVER, MLISP, POP-2, AL, and QLISP) are presented and surveyed, with examples of their use in an automated shop environment. Control structures are compared, and distinctive features of each language are highlighted. A simple programming task is used to illustrate programs in SAIL, LISP, MICROPLANNER, and CONNIVER. The report assumes reader knowledge of programming concepts, but not necessarily of the languages surveyed.
A natural language interface to databases
NASA Technical Reports Server (NTRS)
Ford, D. R.
1988-01-01
The development of a Natural Language Interface which is semantic-based and uses Conceptual Dependency representation is presented. The system was developed using Lisp and currently runs on a Symbolics Lisp machine. A key point is that the parser handles morphological analysis, which expands its capabilities of understanding more words.
Comparison of LISP and MUMPS as implementation languages for knowledge-based systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, A.C.
1984-01-01
Major components of knowledge-based systems are summarized, along with the programming language features generally useful in their implementation. LISP and MUMPS are briefly described and compared as vehicles for building knowledge-based systems. The paper concludes with suggestions for extensions to MUMPS which might increase its usefulness in artificial intelligence applications without affecting the essential nature of the language. 8 references.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azevedo, S.G.; Fitch, J.P.
1987-10-21
Conventional software interfaces that use imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal-processing software (SIG). As an alternative, "functional language" interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal LISP Environment (ISLE) is an example of an interpreted functional language interface based on Common LISP. Advantages of ISLE include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence (AI) software. 10 refs.
ERIC Educational Resources Information Center
Novak, Gordon S., Jr.
GLISP is a high-level computer language (based on Lisp and including Lisp as a sublanguage) which is compiled into Lisp. GLISP programs are compiled relative to a knowledge base of object descriptions, a form of abstract datatypes. A primary goal of the use of abstract datatypes in GLISP is to allow program code to be written in terms of objects,…
NASA Technical Reports Server (NTRS)
Davis, G. J.
1994-01-01
One area of research of the Information Sciences Division at NASA Ames Research Center is devoted to the analysis and enhancement of processors and advanced computer architectures, specifically in support of automation and robotic systems. To compare systems' abilities to efficiently process Lisp and Ada, scientists at Ames Research Center have developed a suite of non-parallel benchmarks called ELAPSE. The benchmark suite was designed to test a single computer's efficiency as well as alternate machine comparisons on Lisp, and/or Ada languages. ELAPSE tests the efficiency with which a machine can execute the various routines in each environment. The sample routines are based on numeric and symbolic manipulations and include two-dimensional fast Fourier transformations, Cholesky decomposition and substitution, Gaussian elimination, high-level data processing, and symbol-list references. Also included is a routine based on a Bayesian classification program sorting data into optimized groups. The ELAPSE benchmarks are available for any computer with a validated Ada compiler and/or Common Lisp system. Of the 18 routines that comprise ELAPSE, provided within this package are 14 developed or translated at Ames. The others are readily available through literature. The benchmark that requires the most memory is CHOLESKY.ADA. Under VAX/VMS, CHOLESKY.ADA requires 760K of main memory. ELAPSE is available on either two 5.25 inch 360K MS-DOS format diskettes (standard distribution) or a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The ELAPSE benchmarks were written in 1990. VAX and VMS are trademarks of Digital Equipment Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
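For illustration only, here is a Common Lisp sketch of the kind of timing harness such a benchmark suite needs; this is not the ELAPSE code, and the workload and function names are hypothetical.

;;; Sketch of a timing harness in the spirit of ELAPSE (not the actual suite).

(defun symbol-list-reference (symbols n)
  "A toy symbolic-manipulation workload: walk a list of SYMBOLS N times,
counting references."
  (let ((count 0))
    (dotimes (i n count)
      (dolist (s symbols)
        (when (symbolp s) (incf count))))))

(defun time-routine (fn &rest args)
  "Return the elapsed real time, in seconds, of applying FN to ARGS."
  (let ((start (get-internal-real-time)))
    (apply fn args)
    (/ (- (get-internal-real-time) start)
       internal-time-units-per-second)))

;; (time-routine #'symbol-list-reference '(a b c d e) 100000)
;;   => elapsed time in seconds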
PORTABLE LISP; a list-processing interpreter. [CDC7600; PASCAL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, W.P.
The program constitutes a complete, basic LISP (LISt-Processing language) interpreter. LISP expressions are evaluated one by one, with both the input expression and the resulting evaluated expression printed. Expressions are evaluated until a FIN card is encountered. Between expression evaluations a garbage-collection algorithm is invoked to recover list space used in the previous evaluation. (CDC7600; PASCAL; SCOPE.) The sample problem was executed in 7000 (octal) words of memory on a CDC7600.
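A Common Lisp sketch (the original interpreter is written in PASCAL) of the read-evaluate-print cycle described above, terminating on a FIN marker; garbage collection between evaluations is left to the host system here.

;;; Sketch (in Common Lisp, not the Pascal original) of the
;;; read-evaluate-print cycle described above: expressions are
;;; evaluated one by one until a FIN marker is read.

(defun interpreter-loop (&optional (stream *standard-input*))
  (loop for expression = (read stream nil 'fin)
        until (eq expression 'fin)
        do (format t "~&input:  ~S~%" expression)
           (format t "result: ~S~%" (eval expression))))

;; Example session:
;;   (+ 1 2)          -> input:  (+ 1 2)         result: 3
;;   (car '(a b c))   -> input:  (CAR '(A B C))  result: A
;;   fin              -> loop terminates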
Automated Design of a Microprogrammed Controller for a Finite State Machine
1988-06-01
Franz Lisp and the Liszt compiler. The most important component of lincoln.l is the defstruct (define structure) macro. The defstruct macro is used to...multiple definition problem for the global variable "ospeed". This problem was the result of loading a C language object file into Franz Lisp or Liszt with...a global variable which had the same name as one inside Lisp or Liszt. MacPitts was written using an older Opus of Franz that did not have a termcap
What Is a Programming Language?
ERIC Educational Resources Information Center
Wold, Allen
1983-01-01
Explains what a computer programing language is in general, the differences between machine language, assembler languages, and high-level languages, and the functions of compilers and interpreters. High-level languages mentioned in the article are: BASIC, FORTRAN, COBOL, PILOT, LOGO, LISP, and SMALLTALK. (EAO)
Principled design for an integrated computational environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disessa, A.A.
Boxer is a computer language designed to be the base of an integrated computational environment providing a broad array of functionality -- from text editing to programming -- for naive and novice users. It stands in the line of Lisp inspired languages (Lisp, Logo, Scheme), but differs from these in achieving much of its understandability from pervasive use of a spatial metaphor reinforced through suitable graphics. This paper describes a set of learnability and understandability issues first and then uses them to motivate design decisions made concerning Boxer and the environment in which it is embedded.
ERIC Educational Resources Information Center
Tesler, Lawrence G.
1984-01-01
Discusses the nature of programing languages, considering the features of BASIC, LOGO, PASCAL, COBOL, FORTH, APL, and LISP. Also discusses machine/assembly codes, the operation of a compiler, and trends in the evolution of programing languages (including interest in notational systems called object-oriented languages). (JN)
An evaluation of Ada for AI applications
NASA Technical Reports Server (NTRS)
Wallace, David R.
1986-01-01
Expert system technology seems to be the most promising type of Artificial Intelligence (AI) application for Ada. An expert system implemented with an expert system shell provides a highly structured approach that fits well with the structured approach found in Ada systems. The current commercial expert system shells use Lisp. In this highly structured situation a shell could be built that used Ada just as well. On the other hand, if it is necessary to deal with some AI problems that are not suited to expert systems, the use of Ada becomes more problematical. Ada was not designed as an AI development language, and is not suited to that. It is possible that an application developed in, say, Common Lisp could be translated to Ada for actual use in a particular application, but this could be difficult. Some standard Ada packages could be developed to make such a translation easier. If the most general AI programs need to be dealt with, a Common Lisp system integrated with the Ada Environment is probably necessary. Aside from problems with language features, Ada, by itself, is not well suited to the prototyping and incremental development that is well supported by Lisp.
Computer Aided Design Parameters for Forward Basing
1988-12-01
21 meters. Systematic errors within limits stated for absolute accuracy are tolerated at this level. DEM data acquired photogrammetrically using manual ...This is a professional drawing package, capable of the manipulation required for this project. With the AutoLISP programming language (a variation on...Table 2). Data Conversion Package. GWN System's Digital Terrain Modeling (DTM) package was used. This AutoLISP-based third party software is
Clips as a knowledge based language
NASA Technical Reports Server (NTRS)
Harrington, James B.
1987-01-01
CLIPS is a language for writing expert systems applications on a personal or small computer. Here, the CLIPS programming language is described and compared to three other artificial intelligence (AI) languages (LISP, Prolog, and OPS5) with regard to the processing they provide for the implementation of a knowledge based system (KBS). A discussion is given on how CLIPS would be used in a control system.
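To illustrate what rule-based processing in a knowledge based system looks like, here is a minimal forward-chaining sketch in Common Lisp; it is far simpler than CLIPS, OPS5, or Prolog, and the facts and rules are hypothetical.

;;; Minimal forward-chaining sketch in Common Lisp (illustrative only;
;;; CLIPS, OPS5, etc. provide far richer pattern matching).

(defparameter *facts* '((valve-open) (temperature high)))

(defparameter *rules*
  ;; Each rule: (name if-facts then-fact)
  '((shut-valve  ((temperature high) (valve-open)) (close-valve))
    (sound-alarm ((close-valve))                   (alarm))))

(defun run-rules ()
  "Fire rules until no rule adds a new fact; return the final fact base."
  (loop for changed = nil
        do (dolist (rule *rules*)
             (destructuring-bind (name if-facts then-fact) rule
               (declare (ignore name))
               (when (and (every (lambda (f) (member f *facts* :test #'equal)) if-facts)
                          (not (member then-fact *facts* :test #'equal)))
                 (push then-fact *facts*)
                 (setf changed t))))
        while changed
        finally (return *facts*)))

;; (run-rules) => ((ALARM) (CLOSE-VALVE) (VALVE-OPEN) (TEMPERATURE HIGH))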
The Most Important Languages Today.
ERIC Educational Resources Information Center
Graduating Engineer, 1985
1985-01-01
Lists and describes seven languages (COBOL, FORTRAN, BASIC, PASCAL, C, ADA, and LISP), pointing out that familiarity with one or more will enhance employability. Also lists and describes four operating systems (MSDOS, UNIX, MVS, and VAX/VMS), indicating that knowledge of these systems will further enhance employability or on-the-job performance.…
Programming While Construction of Engineering 3D Models of Complex Geometry
NASA Astrophysics Data System (ADS)
Kheyfets, A. L.
2017-11-01
The capabilities of constructing geometrically accurate computational 3D models with the use of programming are presented. The construction of models of an architectural arch and a globoid worm gear is considered as an example. The models are designed in the AutoCAD package. Three construction programs are given. The first program is for designing a multi-section architectural arch; control of the arch's geometry by varying its main parameters is shown. The second program is for designing and studying the working surface of a globoid gear's worm; the article shows how to animate the formation of this surface. The third program is for forming the surface of a worm gear cavity, and the dynamics of cavity formation is studied. The programs are written in the AutoLisp programming language. The program texts are provided.
Web-based UMLS concept retrieval by automatic text scanning: a comparison of two methods.
Brandt, C; Nadkarni, P
2001-01-01
The Web is increasingly the medium of choice for multi-user application program delivery. Yet selection of an appropriate programming environment for rapid prototyping, code portability, and maintainability remains an issue. We summarize our experience on the conversion of a LISP Web application, Search/SR, to a new, functionally identical application, Search/SR-ASP, using a relational database and active server pages (ASP) technology. Our results indicate that provision of easy access to database engines and external objects is almost essential for a development environment to be considered viable for rapid and robust application delivery. While LISP itself is a robust language, its use in Web applications may be hard to justify given that current vendor implementations do not provide such functionality. Alternative, currently available scripting environments for Web development appear to have most of LISP's advantages and few of its disadvantages.
AUTOCLASS III - AUTOMATIC CLASS DISCOVERY FROM DATA
NASA Technical Reports Server (NTRS)
Cheeseman, P. C.
1994-01-01
The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5.4, VAX/Ultrix v4.1, and MIPS/Ultrix v4, rev. 179; and on the Macintosh personal computer. The minimum Macintosh required is the IIci. This program will not run under CMU Common Lisp or VAX/VMS DEC Common Lisp. A minimum of 8Mb of RAM is required for Macintosh platforms and 16Mb for workstations. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 3.5 inch diskette in Macintosh format. An electronic copy of the documentation is included on the distribution medium. AUTOCLASS was developed between March 1988 and March 1992. It was initially released in May 1991. Sun is a trademark of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation. Macintosh is a trademark of Apple Computer, Inc. Allegro CL is a registered trademark of Franz, Inc.
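A hedged Common Lisp sketch of the soft class-membership idea described above (not the AUTOCLASS code itself); the class parameters are invented for illustration, and only Gaussian attribute models are shown.

;;; Sketch of soft class membership (not the AUTOCLASS code itself).
;;; Each class has a prior and a Gaussian model per attribute;
;;; membership probabilities are normalized posteriors.

(defun gaussian-density (x mean sd)
  (/ (exp (- (/ (expt (- x mean) 2) (* 2 sd sd))))
     (* sd (sqrt (* 2 pi)))))

(defparameter *classes*
  ;; (name prior ((mean sd) per attribute))  -- illustrative values only
  '((class-a 0.6 ((1.0 0.5) (10.0 2.0)))
    (class-b 0.4 ((3.0 0.7) (14.0 3.0)))))

(defun membership-probabilities (object)
  "OBJECT is a list of numeric attribute values.
Return (class-name . probability) pairs summing to 1."
  (let* ((scores (mapcar (lambda (class)
                           (destructuring-bind (name prior models) class
                             (cons name
                                   (* prior
                                      (reduce #'* (mapcar (lambda (x m)
                                                            (gaussian-density x (first m) (second m)))
                                                          object models))))))
                         *classes*))
         (total (reduce #'+ scores :key #'cdr)))
    (mapcar (lambda (s) (cons (car s) (/ (cdr s) total))) scores)))

;; (membership-probabilities '(1.2 11.0))
;;   => e.g. ((CLASS-A . 0.9...) (CLASS-B . 0.0...))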
Compiling knowledge-based systems from KEE to Ada
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Bock, Conrad; Feldman, Roy
1990-01-01
The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.
Benchmarks of programming languages for special purposes in the space station
NASA Technical Reports Server (NTRS)
Knoebel, Arthur
1986-01-01
Although Ada is likely to be chosen as the principal programming language for the Space Station, certain needs, such as expert systems and robotics, may be better developed in special languages. The languages, LISP and Prolog, are studied and some benchmarks derived. The mathematical foundations for these languages are reviewed. Likely areas of the space station are sought out where automation and robotics might be applicable. Benchmarks are designed which are functional, mathematical, relational, and expert in nature. The coding will depend on the particular versions of the languages which become available for testing.
A PC based fault diagnosis expert system
NASA Technical Reports Server (NTRS)
Marsh, Christopher A.
1990-01-01
The Integrated Status Assessment (ISA) prototype expert system performs system level fault diagnosis using rules and models created by the user. The ISA evolved from concepts to a stand-alone demonstration prototype using OPS5 on a LISP Machine. The LISP based prototype was rewritten in C and the C Language Integrated Production System (CLIPS) to run on a Personal Computer (PC) and a graphics workstation. The ISA prototype has been used to demonstrate fault diagnosis functions of Space Station Freedom's Operation Management System (OMS). This paper describes the development of the ISA prototype from early concepts to the current PC/workstation version used today and describes future areas of development for the prototype.
The Design and Implementation of an Object-Oriented, Production-Rule Interpreter.
1984-12-01
McArthur, Heinz M.
The design and implementation of two prototype interpreters for Omega, an object-oriented, production-rule programming language. The first implementation is a throw-away prototype written in LISP; the second implementation is a more complete
Building a base map with AutoCAD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flarity, S.J.
1989-12-01
The fundamental step in the exploration process is building a base map. Consequently, any serious computer exploration program should be capable of providing base maps. Data used in constructing base maps are available from commercial sources such as Tobin and Petroleum Information. These data sets include line and well data, the line data being latitude-longitude vectors, and the well data any identifying text information for wells and their locations. AutoCAD is a commercial program useful in building base maps. Its features include infinite zoom and pan capability, layering, block definition, text dialog boxes, and a command language, AutoLisp. AutoLisp provides more power by allowing the geologist to modify the way the program works. Three AutoLisp routines presented here allow geologists to construct a geologic base map from raw Tobin data. The first program, WELLS.LSP, sets up the map environment for the subsequent programs, WELLADD.LSP and LINEADD.LSP. Welladd.lsp reads the Tobin data and spots the well symbols and the identifying information. Lineadd.lsp performs the same task on line and textual information contained within the data set.
Experiments with microcomputer-based artificial intelligence environments
Summers, E.G.; MacDonald, R.A.
1988-01-01
The U.S. Geological Survey (USGS) has been experimenting with the use of relatively inexpensive microcomputers as artificial intelligence (AI) development environments. Several AI languages are available that perform fairly well on desk-top personal computers, as are low-to-medium cost expert system packages. Although performance of these systems is respectable, their speed and capacity limitations are questionable for serious earth science applications foreseen by the USGS. The most capable artificial intelligence applications currently are concentrated on what is known as the "artificial intelligence computer," and include Xerox D-series, Tektronix 4400 series, Symbolics 3600, VAX, LMI, and Texas Instruments Explorer. The artificial intelligence computer runs expert system shells and Lisp, Prolog, and Smalltalk programming languages. However, these AI environments are expensive. Recently, inexpensive 32-bit hardware has become available for the IBM/AT microcomputer. USGS has acquired and recently completed Beta-testing of the Gold Hill Systems 80386 Hummingboard, which runs Common Lisp on an IBM/AT microcomputer. Hummingboard appears to have the potential to overcome many of the speed/capacity limitations observed with AI-applications on standard personal computers. USGS is a Beta-test site for the Gold Hill Systems GoldWorks expert system. GoldWorks combines some high-end expert system shell capabilities in a medium-cost package. This shell is developed in Common Lisp, runs on the 80386 Hummingboard, and provides some expert system features formerly available only on AI-computers including frame and rule-based reasoning, on-line tutorial, multiple inheritance, and object-programming. © 1988 International Association for Mathematical Geology.
Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Llames, Rene Lim
1991-01-01
Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effects of the number of generations and generation capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.
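A toy Common Lisp simulation of generation-based collection with promotion by age, intended only to illustrate the terms used above; it is not the analytic model of the thesis.

;;; Toy simulation of generation-based collection (illustrative only;
;;; the thesis analyzes a real Lisp system, not this model).

(defstruct obj age live-p)

(defun collect-generation (generation survivors-go-to &key (promotion-age 2))
  "Remove dead objects from GENERATION; survivors age by one and are
promoted to SURVIVORS-GO-TO once they reach PROMOTION-AGE.
Returns two lists: the new young generation and the new older generation."
  (let ((young '()) (promoted survivors-go-to))
    (dolist (o generation)
      (when (obj-live-p o)
        (incf (obj-age o))
        (if (>= (obj-age o) promotion-age)
            (push o promoted)
            (push o young))))
    (values young promoted)))

;; Usage sketch: most newly allocated objects die young, so the youngest
;; generation is collected often and stays small, which is one reason
;; execution time is most sensitive to its capacity.
;; (collect-generation (list (make-obj :age 0 :live-p t)
;;                           (make-obj :age 1 :live-p nil)
;;                           (make-obj :age 1 :live-p t))
;;                     '())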
Issues central to a useful image understanding environment
NASA Astrophysics Data System (ADS)
Beveridge, J. Ross; Draper, Bruce A.; Hanson, Allen R.; Riseman, Edward M.
1992-04-01
A recent DARPA initiative has sparked interest in software environments for computer vision. The goal is a single environment to support both basic research and technology transfer. This paper lays out six fundamental attributes such a system must possess: (1) support for both C and Lisp, (2) extensibility, (3) data sharing, (4) data query facilities tailored to vision, (5) graphics, and (6) code sharing. The first three attributes fundamentally constrain the system design. Support for both C and Lisp demands some form of database or data-store for passing data between languages. Extensibility demands that system support facilities, such as spatial retrieval of data, be readily extended to new user-defined datatypes. Finally, data sharing demands that data saved by one user, including data of a user-defined type, must be readable by another user.
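A hedged Common Lisp/CLOS sketch of attributes (2) and (4): extending a spatial-retrieval facility to a user-defined datatype through a generic function. The protocol shown is hypothetical and is not the environment described above.

;;; Sketch of "extensibility to user-defined datatypes" via CLOS
;;; (hypothetical protocol, not the environment described above).

(defgeneric bounding-box (object)
  (:documentation "Return (xmin ymin xmax ymax) for OBJECT."))

(defun intersects-p (box1 box2)
  (destructuring-bind (ax1 ay1 ax2 ay2) box1
    (destructuring-bind (bx1 by1 bx2 by2) box2
      (and (<= ax1 bx2) (<= bx1 ax2) (<= ay1 by2) (<= by1 ay2)))))

(defun retrieve-in-region (objects region-box)
  "Spatial query that works for any type implementing BOUNDING-BOX."
  (remove-if-not (lambda (o) (intersects-p (bounding-box o) region-box)) objects))

;; A user-defined vision datatype only has to add one method:
(defclass line-segment () ((x1 :initarg :x1) (y1 :initarg :y1)
                           (x2 :initarg :x2) (y2 :initarg :y2)))

(defmethod bounding-box ((s line-segment))
  (with-slots (x1 y1 x2 y2) s
    (list (min x1 x2) (min y1 y2) (max x1 x2) (max y1 y2))))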
1991-08-05
adjectives rule with the no-pronoun rule: Take 250-grams-of-large-fresh-ripe-tomatoes. Peel the 250-grams-of-large-fresh-ripe-tomatoes. Chop the 250...me whether you store the word "banana" using one molecule of acetylcholine or two, and by a garden path argument I cannot therefore care whether you
NASA Astrophysics Data System (ADS)
Anderson, John R.; Boyle, C. Franklin; Reiser, Brian J.
1985-04-01
Cognitive psychology, artificial intelligence, and computer technology have advanced to the point where it is feasible to build computer systems that are as effective as intelligent human tutors. Computer tutors based on a set of pedagogical principles derived from the ACT theory of cognition have been developed for teaching students to do proofs in geometry and to write computer programs in the language LISP.
Intelligibility of Digital Speech Masked by Noise: Normal Hearing and Hearing Impaired Listeners
1990-06-01
spectrograms of these phrases were generated by a List Processing Language (LISP) program on a Symbolics 3670 artificial intelligence computer (see Figure 10). The...speech and the amount of difference varies with the type of vocoder. [Figure residue: ADPCM intelligibility and type of masking.]
Silicon Compilation Using a Lisp-Based Layout Language.
1986-06-01
12, 15 October 1984. Gajski, D.D., "The Structure of A Silicon Compiler", IEEE International Conference on Circuits and Computers 1982 (ICCC 82...IEEE Press, 1982. Gajski, D.D. and Kuhn, R.H., "Guest Editors' Introduction: New VLSI Tools", Computer, Volume 16, Number 12, 1983. Gajski, D.D., "Silicon
Microdefects and self-interstitial diffusion in crystalline silicon
NASA Astrophysics Data System (ADS)
Knowlton, William Barthelemy
In this thesis, a study is presented of D-defects and self-interstitial diffusion in silicon using Li ion (Li+) drifting in an electric field and transmission electron microscopy (TEM). Obstruction of Li+ drifting has been found in wafers from certain but not all FZ p-type Si. Incomplete Li+ drifting always occurs in the central region of the wafers. This work established that interstitial oxygen is not responsible for hindering Li+ drifting. The O_i concentration was measured (~2 x 10^15 cm^-3) by local vibrational mode Fourier transform infrared spectroscopy and did not vary radially across the wafer. TEM was performed on samples from the partially Li+ drifted area and compared to regions without D-defects. Precipitates were found only in the region containing D-defects that had partially Li+ drifted. This result indicates D-defects are responsible for the precipitation that halts the Li+ drift process. The precipitates were characterized using selected area diffraction (SAD) and image contrast analysis. The results suggested that the precipitates may cause stacking faults and that their identity may be lithium silicides such as Li21Si5 and Li13Si4. TEM revealed a decreasing distribution of Li precipitates as a function of Li+ drift depth along the growth direction. A preliminary model is presented that simulates Li+ drifting. The objective of the model is to incorporate the Li precipitate density distribution and Li+ drift depth to extract the size and capture cross-section of the D-defects. Nitrogen (N) doping has been shown to eliminate D-defects as measured by conventional techniques. However, Li+ drifting has shown that D-defects are indeed still present. Li+ drifting is able to detect D-defects at concentrations lower than conventional techniques. Li+ drifting and D-defects provide a useful means to study Si self-interstitial diffusion. The process modeling program SUPREM-IV was used to simulate the results of Si self-interstitial diffusion obtained from Li+ drifting experiments. Anomalous results from the Si self-interstitial diffusion experiments forced a re-examination of the possibility of thermal dissociation of D-defects. Thermal annealing experiments that were performed support this possibility. A review of the current literature illustrates the need for more research on the effects of thermal processing on FZ Si to understand the dissolution kinetics of D-defects.
Intelligent guidance and control for wind shear encounter
NASA Technical Reports Server (NTRS)
Stengel, Robert F.
1988-01-01
The principal objective is to develop methods for assessing the likelihood of wind shear encounter, for deciding what flight path to pursue, and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's cockpit displays and autopilot for both manually controlled and automatic flight. The program has begun with the development of a real-time expert system for pilot aiding that is based on the results of the FAA Windshear Training Aids Program. A two-volume manual that presents an overview, pilot guide, training program, and substantiating data provides guidelines for this initial development. The Expert System to Avoid Wind Shear (ESAWS) currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP machine.
SX User's Manual for SX version 2. 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S.A.; Braddy, D.
1993-01-04
Scheme is a lexically scoped, properly tail recursive dialect of the LISP programming language. The PACT implementation is described abstractly in Abelson and Sussman's book, Structure and Interpretation of Computer Programs. It features all of the "essential procedures" described in the "Revised Report on Scheme" which defines the standard for Scheme. In PACT, Scheme is implemented as a library; however, a small driver delivers a stand-alone Scheme interpreter. The PACT implementation features a reference counting incremental garbage collector. This distributes the overhead of memory management throughout the running of Scheme code. It also tends to keep Scheme from trying to grab the entire machine on which it is running, which some garbage collection schemes will attempt to do. SX is perhaps the ultimate PACT statement. It is simply Scheme plus the other parts of PACT. A more precise way to describe it is as a dialect of LISP with extensions for PGS, PDB, PDBX, PML, and PANACEA. What this yields is an interpretive language whose primitive procedures span the functionality of all of PACT. Like the Scheme implementation which it extends, SX provides both a library and a stand-alone application. The stand-alone interpreter is the engine behind applications such as PDBView and PDBDiff. The SX library is the heart of TRANSL, a tool to translate data files from one database format to another. The modularization and layering make it possible to use the PACT components like building blocks. In addition, SX contains functionality which is the generalization of that found in ULTRA II. This means that as the development of SX proceeds, an SX-driven application will be able to perform arbitrary dimensional presentation, analysis, and manipulation tasks. Because of the fundamental unity of these two PACT parts, they are documented in a single manual. The first part will cover the standard Scheme functionality and the second part will discuss the SX extensions.
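A toy Common Lisp model of reference counting, included only to illustrate why such a collector spreads memory-management overhead across the run; it is not the PACT/SX implementation, and the cell structure is hypothetical.

;;; Toy model of reference counting (not the actual PACT/SX collector).
;;; Each managed cell carries a count; freeing happens as soon as the
;;; count drops to zero, so collection cost is spread over the run.

(defstruct cell value (refcount 0))

(defun retain (cell)
  (incf (cell-refcount cell))
  cell)

(defun release (cell)
  (when (<= (decf (cell-refcount cell)) 0)
    ;; In a real collector the storage would be returned to a free list;
    ;; here we just mark the cell as reclaimed.
    (setf (cell-value cell) :reclaimed))
  cell)

;; (let ((c (retain (make-cell :value 42))))
;;   (release c)            ; count reaches 0 -> reclaimed immediately
;;   (cell-value c))        ; => :RECLAIMED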
Toward a theory of distributed word expert natural language parsing
NASA Technical Reports Server (NTRS)
Rieger, C.; Small, S.
1981-01-01
An approach to natural language meaning-based parsing in which the unit of linguistic knowledge is the word rather than the rewrite rule is described. In the word expert parser, knowledge about language is distributed across a population of procedural experts, each representing a word of the language, and each an expert at diagnosing that word's intended usage in context. The parser is structured around a coroutine control environment in which the generator-like word experts ask questions and exchange information in coming to collective agreement on sentence meaning. The word expert theory is advanced as a better cognitive model of human language expertise than the traditional rule-based approach. The technical discussion is organized around examples taken from the prototype LISP system which implements parts of the theory.
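A much-simplified Common Lisp sketch of the one-expert-per-word idea (the prototype's coroutine control and information exchange are omitted); the words, senses, and macro name are hypothetical.

;;; Sketch of the word-expert idea (hypothetical, much simpler than the
;;; coroutine-based prototype described above): each word is a procedure
;;; that examines its context to pick a sense.

(defparameter *experts* (make-hash-table :test #'equal))

(defmacro define-word-expert (word (left right) &body body)
  "Associate WORD with a procedure of its LEFT and RIGHT context words."
  `(setf (gethash ,word *experts*) (lambda (,left ,right) ,@body)))

(define-word-expert "throw" (left right)
  (declare (ignore left))
  (if (equal right "party") 'host-event 'propel-object))

(defun diagnose (words position)
  "Ask the expert for the word at POSITION what it means in context."
  (let ((expert (gethash (nth position words) *experts*)))
    (when expert
      (funcall expert (when (> position 0) (nth (1- position) words))
                      (nth (1+ position) words)))))

;; (diagnose '("they" "throw" "party") 1)  => HOST-EVENT
;; (diagnose '("they" "throw" "rocks") 1)  => PROPEL-OBJECT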
The desktop interface in intelligent tutoring systems
NASA Technical Reports Server (NTRS)
Baudendistel, Stephen; Hua, Grace
1987-01-01
The interface between an Intelligent Tutoring System (ITS) and the person being tutored is critical to the success of the learning process. If the interface to the ITS is confusing or non-supportive of the tutored domain, the effectiveness of the instruction will be diminished or lost entirely. Consequently, the interface to an ITS should be highly integrated with the domain to provide a robust and semantically rich learning environment. In building an ITS for ZetaLISP on a LISP Machine, a Desktop Interface was designed to support a programming learning environment. Using the bitmapped display, windows, and mouse, three desktops were designed to support self-study and tutoring of ZetaLISP. Through organization, well-defined boundaries, and domain support facilities, the desktops provide substantial flexibility and power for the student and facilitate learning ZetaLISP programming while screening the student from the complex LISP Machine environment. The student can concentrate on learning ZetaLISP programming and not on how to operate the interface or a LISP Machine.
A natural language query system for Hubble Space Telescope proposal selection
NASA Technical Reports Server (NTRS)
Hornick, Thomas; Cohen, William; Miller, Glenn
1987-01-01
The proposal selection process for the Hubble Space Telescope is assisted by a robust and easy-to-use query program (TACOS). The system parses an English subset language sentence regardless of the order of the keyword phrases, allowing the user greater flexibility than a standard command query language. Capabilities for macro and procedure definition are also integrated. The system was designed for flexibility in both use and maintenance. In addition, TACOS can be applied to any knowledge domain that can be expressed in terms of a single relation. The system was implemented mostly in Common LISP. The TACOS design is described in detail, with particular attention given to the implementation methods of sentence processing.
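A hedged Common Lisp sketch of parsing keyword phrases independent of their order; the keywords and slots below are invented and do not reflect the actual TACOS grammar.

;;; Sketch of order-independent keyword-phrase parsing (hypothetical
;;; keywords; not the actual TACOS grammar).

(defparameter *keywords* '(("proposals" . :target)
                           ("by"        . :investigator)
                           ("using"     . :instrument)))

(defun parse-query (words)
  "Collect (slot . value) pairs regardless of the order in which the
keyword phrases appear, so 'proposals by smith using wfpc' and
'using wfpc proposals by smith' parse to the same structure."
  (let ((result '()) (current-slot nil))
    (dolist (w words (nreverse result))
      (let ((kw (cdr (assoc w *keywords* :test #'string-equal))))
        (cond (kw (setf current-slot kw))
              (current-slot (push (cons current-slot w) result)))))))

;; (parse-query '("proposals" "by" "smith" "using" "wfpc"))
;;   => ((:INVESTIGATOR . "smith") (:INSTRUMENT . "wfpc"))
;; (parse-query '("using" "wfpc" "proposals" "by" "smith"))
;;   => ((:INSTRUMENT . "wfpc") (:INVESTIGATOR . "smith"))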
Research in mathematical theory of computation. [computer programming applications
NASA Technical Reports Server (NTRS)
Mccarthy, J.
1973-01-01
Research progress in the following areas is reviewed: (1) new version of computer program LCF (logic for computable functions), including a facility to search for proofs automatically; (2) the description of the language PASCAL in terms of both LCF and first order logic; (3) discussion of LISP semantics in LCF and an attempt to prove the correctness of the London compilers in a formal way; (4) design of both special purpose and domain independent proving procedures with program correctness specifically in mind; (5) design of languages for describing such proof procedures; and (6) the embedding of ideas in the first order checker.
C-Language Integrated Production System, Version 5.1
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh VU; Culbert, Chris; Savely, Robert T.; Mccoy, Daniel J.; Giarratano, Joseph
1992-01-01
CLIPS 5.1 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming provides representation of knowledge by use of heuristics. Object-oriented programming enables modeling of complex systems as modular components. Procedural programming enables CLIPS to represent knowledge in ways similar to those allowed in such languages as C, Pascal, Ada, and LISP. Working with CLIPS 5.1, one can develop expert-system software by use of rule-based programming only, object-oriented programming only, procedural programming only, or combinations of the three.
Computational Understanding: Analysis of Sentences and Context
1974-05-01
Computer Science Department, Stanford, California 94305. ...these is the need for programs that can respond in useful ways to information expressed in a natural language. However a computational understanding...buying structure because "Mary" appears where it does. But the time for analysis was rarely over five seconds of computer time, when the Lisp program
1984-06-01
programming environment and then dumped, as described in the Franz Lisp manual [Ref. 13]. A synopsis of the functional elements which make up this LISP...the average system usage rate. Lines 14 and 15 reflect a function of Franz Lisp wherein past used storage locations are reclaimed for the available... Franz Lisp Opus 38. Also included in this distribution are two library files containing the bonding pad layouts in CIF, and a library file
Translating an AI application from Lisp to Ada: A case study
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
A set of benchmarks was developed to test the performance of a newly designed computer executing both Lisp and Ada. Among these was AutoClassII -- a large Artificial Intelligence (AI) application written in Common Lisp. The extraction of a representative subset of this complex application was aided by a Lisp Code Analyzer (LCA). The LCA enabled rapid analysis of the code, putting it in a concise and functionally readable form. An equivalent benchmark was created in Ada through manual translation of the Lisp version. A comparison of the execution results of both programs across a variety of compiler-machine combinations indicate that line-by-line translation coupled with analysis of the initial code can produce relatively efficient and reusable target code.
An expert system for wind shear avoidance
NASA Technical Reports Server (NTRS)
Stengel, Robert F.; Stratton, D. Alexander
1990-01-01
The principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands to the aircraft's autopilot and flight directors for both automatic and manually controlled flight. The expert system for pilot aiding is based on the results of the FAA Windshear Training Aids Program, a two-volume manual that presents an overview, pilot guide, training program, and substantiating data that provides guidelines for this initial development. The Windshear Safety Advisor expert system currently contains over 140 rules and is coded in the LISP programming language for implementation on a Symbolics 3670 LISP Machine.
Application programs written by using customizing tools of a computer-aided design system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, X.; Huang, R.; Juricic, D.
1995-12-31
Customizing tools of Computer-Aided Design Systems have been developed to such a degree as to become equivalent to powerful higher-level programming languages that are especially suitable for graphics applications. Two examples of application programs written by using AutoCAD's customizing tools are given in some detail to illustrate their power. One tool uses the AutoLISP list-processing language to develop an application program that produces four views of a given solid model. The other uses the AutoCAD Development System, based on program modules written in C, to produce an application program that renders a freehand sketch from a given CAD drawing.
The Prevalence of Lisping in Gay Men
ERIC Educational Resources Information Center
Van Borsel, John; De Bruyn, Els; Lefebvre, Evelien; Sokoloff, Anouschka; De Ley, Sophia; Baudonck, Nele
2009-01-01
This study evaluated the stereotype that gay men lisp. Two clinicians who were unaware of the specific purpose of the study and the populations involved judged randomized audio-recordings of 175 gay males, 100 heterosexual males and 100 heterosexual females for the presence of lisping during reading of a standardized text. In the gay males a…
An Expert System For Tuning Particle-Beam Accelerators
NASA Astrophysics Data System (ADS)
Lager, Darrel L.; Brand, Hal R.; Maurer, William J.; Searfus, Robert M.; Hernandez, Jose E.
1989-03-01
We have developed a proof-of-concept prototype of an expert system for tuning particle beam accelerators. It is designed to function as an intelligent assistant for an operator. In its present form it implements the strategies and reasoning followed by the operator for steering through the beam transport section of the Advanced Test Accelerator at Lawrence Livermore Laboratory's Site 300. The system is implemented in the language LISP using the Artificial Intelligence concepts of frames, daemons, and a representation we developed called a Monitored Decision Script.
VLSI (Very Large Scale Integrated Circuits) Design with the MacPitts Silicon Compiler.
1985-09-01
the background. If the algorithm is not fully debugged, then issue instead macpitts basename herald so MacPitts diagnostics and Liszt diagnostics both...command interpreter. Upon compilation, however, the following LISP compiler (Liszt) diagnostic results: Error: Non-number to minus nil where the first...language used in the MacPitts source code. The more instructive solution is to write the Franz LISP code to decide if a jumper wire is needed, and if so, to
Organizational Response to the Introduction of New Computer Software Technology
1991-07-01
the documentation to be much use at all." Another said that "the tutorial did a good job, but ... the manual did an average job." The Lotus Manuscript...when they have a specific use in mind and believe they can find the information easily in the manual. The AutoCAD users were also split on their...AutoCAD user with AutoLISP, a programming language included in the package. (Some CADD packages come with these features and others as part of the
Modeling and Implementation of Visibility in Programming Languages
1987-12-01
Birtwistle et al. 1973] [Goldberg and Robson 1983] [Maurer 1976] [Rees et al. 1984] [Rees and Adams IV 1982] [Jones and Muchnick 1978] Table 2.1...Word and Object, MIT Press, Cambridge, 1960. REES, J. A. and ADAMS IV, N. I., "T: a dialect of LISP or, LAMBDA: the ultimate software tool...", Conference record of the 1982 ACM symposium on LISP and functional programming, 1982. REES, J. A., ADAMS IV, N. I., and MEEHAN, J. R., The T manual
LispSEI: The Programmer’s Manual
1988-01-01
(defun print-entities (str entities etype) (format t str) (dolist (entity entities) (format t " ~A" (entity-name entity etype)))) (defun entity-name...fields are munged only after the filters are executed. This makes things much easier. ;; Algorithm: (1) get the initial list. (2) take out those entities which...don't meet all the constraints. (3) pass the entities list through all the filters. (4) munge the appropriate fields. (5) return the result. (defun s
DICE: An Object Oriented Programming Environment for Cooperative Engineering Design
1989-03-20
environment called PARMENIDES/FRULEKIT; PARMENIDES/FRULEKIT supports programming in frames and rules and was developed in LISP at Carnegie-Mellon...the domain of building design and construction. The Blackboard in DICEY-BUILDER is represented as frames in PARMENIDES, while the KMs are implemented... [Figure: Blackboard machine-to-machine message transfer and message formats]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spires, S.
This code provides an application programming interface to the Macintosh OSX Carbon Databrowser from Macintosh Common Lisp. The Databrowser API is made available to Lisp via high-level native CLOS classes and methods, obviating the need to write low-level Carbon code. This code is primarily glue in that its job is to provide an interface between two extant software tools: Macintosh Common Lisp and the OSX Databrowser, both of which are COTS products from private vendors. The Databrowser is an extremely useful user interface widget that is provided with Apple's OSX (and to some extent, OS9) operating systems. One Apple-sanctioned method for using the Databrowser is via an API called Carbon, which is designed for C and C++ programmers. We have translated the low-level Carbon programming interface to the Databrowser into high-level object-oriented Common Lisp calls, functions, methods, and classes to enable MCL programmers to more readily take advantage of the Databrowser from Lisp programs.
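The actual MCL/Carbon glue is not shown in the record; the fragment below is only a generic sketch of the wrapping pattern the abstract describes (a CLOS class over an opaque low-level handle, plus generic functions), with every class, slot, and function name being a hypothetical placeholder rather than the library's API.

;; Generic sketch of wrapping a low-level toolkit handle in CLOS; all names are
;; hypothetical and do not reflect the actual MCL/Carbon Databrowser glue code.
(defclass browser-view ()
  ((handle  :initarg :handle  :reader browser-handle
            :documentation "Opaque pointer returned by the low-level toolkit.")
   (columns :initarg :columns :accessor browser-columns :initform '())))

(defgeneric add-row (view row)
  (:documentation "Append ROW (a list of strings, one per column) to VIEW."))

(defmethod add-row ((view browser-view) row)
  ;; A real implementation would call down into the C API here via an FFI;
  ;; this sketch just checks the row shape and returns it.
  (assert (= (length row) (length (browser-columns view))))
  row)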
SKETCH 4B: An Image Understanding Operating System
1989-06-14
LISP Nlambda Function] EQUIVALENT TO: Standard FRANZ liszt, but modified so that it may be called with no arguments, will read and execute...WESTERN ELECTRIC DEVICE INDEPENDENT TROFF (UNLESS YOU DO NOT WANT TO PRINT DOCUMENTATION) 4. FRANZ LISP FROM FRANZ INC. IF SUN3 (NOT NECESSARY IF VAX...RESERVED. DEVELOPED AT LINCOLN LABORATORY. CHAPTERS: 1. INTRODUCTION. 2. LISP TUTORIAL. 3. FRANZ EXTENSIONS. 4. ATOMS. 5. OBJECTS. 6. CATALOGS
NASA Technical Reports Server (NTRS)
Jaworski, Allan; Lavallee, David; Zoch, David
1987-01-01
The prototype demonstrates the feasibility of using Ada for expert systems and the implementation of an expert-friendly interface which supports knowledge entry. In the Ford LISP-Ada Connection (FLAC) system LISP and Ada are used in ways which complement their respective capabilities. Future investigation will concentrate on the enhancement of the expert knowledge entry/debugging interface and on the issues associated with multitasking and real-time expert systems implementation in Ada.
Modification of earth-satellite orbits using medium-energy pulsed lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phipps, C.R.
1992-01-01
Laser Impulse Space Propulsion (LISP) has become an attractive concept, due to recent advances in gas laser technology, high-speed segmented mirrors and improved coefficients for momentum coupling to targets in pulsed laser ablation. There are numerous specialized applications of the basic concept to space science, ranging from far-future and high capital cost to the immediate and inexpensive, such as: LEO-LISP (launch of massive objects into low-Earth-orbit at dramatically improved cost-per-kg relative to present practice); LEGO-LISP (LEO to geosynchronous transfers); LO-LISP (periodic re-boost of decaying LEO orbits); and LISK (geosynchronous satellite station-keeping). It is unlikely that one type of laser will be best for all scenarios. In this paper, we will focus on the last two applications.
Modification of earth-satellite orbits using medium-energy pulsed lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phipps, C.R.
1992-10-01
Laser Impulse Space Propulsion (LISP) has become an attractive concept, due to recent advances in gas laser technology, high-speed segmented mirrors and improved coefficients for momentum coupling to targets in pulsed laser ablation. There are numerous specialized applications of the basic concept to space science, ranging from far-future and high capital cost to the immediate and inexpensive, such as: LEO-LISP (launch of massive objects into low-Earth-orbit at dramatically improved cost-per-kg relative to present practice); LEGO-LISP (LEO to geosynchronous transfers); LO-LISP (periodic re-boost of decaying LEO orbits); and LISK (geosynchronous satellite station-keeping). It is unlikely that one type of laser will be best for all scenarios. In this paper, we will focus on the last two applications.
1991-03-21
sectional representation of the spatial figure can be correctly determined. The AutoLisp language system in the AutoCAD software provides the most...software packages are developed on the 32-bit machines and little progress has been reported for the 16-bit machines. Even AutoCAD is a two-and-a-half... AutoCAD software as the basis, developed the design package of 3-D cartridge valve blocks on an IBM PC/AT. To realize the 3-D displaying of cartridge valves
Software For Fault-Tree Diagnosis Of A System
NASA Technical Reports Server (NTRS)
Iverson, Dave; Patterson-Hine, Ann; Liao, Jack
1993-01-01
Fault Tree Diagnosis System (FTDS) computer program is automated-diagnostic-system program identifying likely causes of specified failure on basis of information represented in system-reliability mathematical models known as fault trees. Is modified implementation of failure-cause-identification phase of Narayanan's and Viswanadham's methodology for acquisition of knowledge and reasoning in analyzing failures of systems. Knowledge base of if/then rules replaced with object-oriented fault-tree representation. Enhancement yields more-efficient identification of causes of failures and enables dynamic updating of knowledge base. Written in C language, C++, and Common LISP.
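The record only names the object-oriented fault-tree representation; the fragment below is a minimal sketch of what such a representation can look like in Common Lisp, with invented class, slot, and function names that are not drawn from the FTDS program itself.

;; Minimal sketch of an object-oriented fault-tree representation, assuming
;; invented class and slot names (not the FTDS program's actual design).
(defclass ft-node ()
  ((name     :initarg :name     :reader node-name)
   (gate     :initarg :gate     :reader node-gate :initform nil)  ; :and, :or, or nil for a basic event
   (children :initarg :children :reader node-children :initform '())))

(defun likely-causes (node failed-p)
  "Return the basic events under NODE that could explain the failure, where
FAILED-P is a predicate telling whether a named basic event is observed failed."
  (if (null (node-children node))
      (when (funcall failed-p (node-name node)) (list (node-name node)))
      (mapcan (lambda (child) (likely-causes child failed-p))
              (node-children node))))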
C-Language Integrated Production System, Version 6.0
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris
1995-01-01
C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify set of actions performed in given situation. Object-oriented programming: modeling of complex systems comprised of modular components easily reused to model other systems or create new components. Procedural-programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. Version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
An easy-to-use diagnostic system development shell
NASA Technical Reports Server (NTRS)
Tsai, L. C.; Ross, J. B.; Han, C. Y.; Wee, W. G.
1987-01-01
The Diagnostic System Development Shell (DSDS), an expert system development shell for diagnostic systems, is described. The major objective of building the DSDS is to create a very easy to use and friendly environment for knowledge engineers and end-users. The DSDS is written in OPS5 and CommonLisp. It runs on a VAX/VMS system. A set of domain independent, generalized rules is built in the DSDS, so the users need not be concerned about building the rules. The facts are explicitly represented in a unified format. A powerful check facility which helps the user to check the errors in the created knowledge bases is provided. A judgement facility and other useful facilities are also available. A diagnostic system based on the DSDS system is question driven and can call or be called by other knowledge based systems written in OPS5 and CommonLisp. A prototype diagnostic system for diagnosing a Philips constant potential X-ray system has been built using the DSDS.
NASA Technical Reports Server (NTRS)
Bawden, A.; Burke, G. S.; Hoffman, C. W.
1981-01-01
A common subset of selected facilities available in Maclisp and its derivatives (PDP-10 and Multics Maclisp, Lisp Machine Lisp (Zetalisp), and NIL) is described. The object is to aid in writing code which can run compatibly in more than one of these environments.
NASA Technical Reports Server (NTRS)
Culbert, Chris
1990-01-01
Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert systems building tools (ESBT's). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality was completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. Since production quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA. This tool will provide full syntactic compatibility with any eventual products of the ART/Ada design while allowing SSFP developers early access to this technology.
Transfer Functions Via Laplace- And Fourier-Borel Transforms
NASA Technical Reports Server (NTRS)
Can, Sumer; Unal, Aynur
1991-01-01
Approach to solution of nonlinear ordinary differential equations involves transfer functions based on recently-introduced Laplace-Borel and Fourier-Borel transforms. Main theorem gives transform of response of nonlinear system as Cauchy product of transfer function and transform of input function of system, together with memory effects. Used to determine responses of electrical circuits containing variable inductances or resistances. Also possibility of doing all noncommutative algebra on computers in such symbolic programming languages as Macsyma, Reduce, PL1, or Lisp. Process of solution organized and possibly simplified by algebraic manipulations reducing integrals in solutions to known or tabulated forms.
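The abstract states the main theorem as a Cauchy-product relation between the transfer function and the input transform; as a reminder of the ordinary Cauchy product that this statement generalizes (the paper's exact memory-effect terms are not given in the record), the identity is:

% Ordinary Cauchy product of two formal power series; the paper's theorem is the
% analogous statement for Laplace-Borel transforms, with additional memory terms.
\left( \sum_{n \ge 0} a_n x^n \right) \left( \sum_{m \ge 0} b_m x^m \right)
  = \sum_{n \ge 0} \left( \sum_{k=0}^{n} a_k\, b_{n-k} \right) x^n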
Computers Simulate Human Experts.
ERIC Educational Resources Information Center
Roberts, Steven K.
1983-01-01
Discusses recent progress in artificial intelligence in such narrowly defined areas as medical and electronic diagnosis. Also discusses use of expert systems, man-machine communication problems, novel programing environments (including comments on LISP and LISP machines), and types of knowledge used (factual, heuristic, and meta-knowledge). (JN)
Pedagogical Strategies for Human and Computer Tutoring.
ERIC Educational Resources Information Center
Reiser, Brian J.
The pedagogical strategies of human tutors in problem solving domains are described and the possibility of incorporating these techniques into computerized tutors is examined. GIL (Graphical Instruction in LISP), an intelligent tutoring system for LISP programming, is compared to human tutors teaching the same material in order to identify how the…
Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1995-01-01
We present a mathematical definition of hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
Time-based air traffic management using expert systems
NASA Technical Reports Server (NTRS)
Tobias, L.; Scoggins, J. L.
1986-01-01
A prototype expert system has been developed for the time scheduling of aircraft into the terminal area. The three functions of the air-traffic-control schedule advisor are as follows: (1) for each new arrival, it develops an admissible flight plan for that aircraft; (2) as the aircraft progresses through the terminal area, it monitors deviations from the aircraft's flight plan and provides advisories to return the aircraft to its assigned schedule; and (3) if major disruptions such as missed approaches occur, it develops a revised plan. The advisor is operational on a Symbolics 3600, and is programmed in MRS (a logic programming language), Lisp, and Fortran.
Time-based air traffic management using expert systems
NASA Technical Reports Server (NTRS)
Tobias, L.; Scoggins, J. L.
1986-01-01
A prototype expert system was developed for the time scheduling of aircraft into the terminal area. The three functions of the air traffic control schedule advisor are as follows: first, for each new arrival, it develops an admissible flight plan for that aircraft. Second, as the aircraft progresses through the terminal area, it monitors deviations from the flight plan and provides advisories to return the aircraft to its assigned schedule. Third, if major disruptions such as missed approaches occur, it develops a revised plan. The advisor is operational on a Symbolics 3600, and is programmed in MRS (a logic programming language), Lisp, and FORTRAN.
A Multi-Modal Approach to Intervention for One Adolescent's Frontal Lisp
ERIC Educational Resources Information Center
Lipetz, Heidi Massel; Bernhardt, B. May
2013-01-01
An adolescent with a persistent frontal lisp participated in a two-part 11-session intervention case study. The first phase used ultrasound imagery and acoustic, phonetic and voice education to provide information about articulatory setting (AS) and general awareness of the speech production process. The second phase used traditional articulation…
Querying and Computing with BioCyc Databases
Krummenacker, Markus; Paley, Suzanne; Mueller, Lukas; Yan, Thomas; Karp, Peter D.
2006-01-01
Summary We describe multiple methods for accessing and querying the complex and integrated cellular data in the BioCyc family of databases: access through multiple file formats, access through Application Program Interfaces (APIs) for LISP, Perl and Java, and SQL access through the BioWarehouse relational database. Availability The Pathway Tools software and 20 BioCyc DBs in Tiers 1 and 2 are freely available to academic users; fees apply to some types of commercial use. For download instructions see http://BioCyc.org/download.shtml PMID:15961440
Expert System for Automated Design Synthesis
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Barthelemy, Jean-Francois M.
1987-01-01
Expert-system computer program EXADS developed to aid users of Automated Design Synthesis (ADS) general-purpose optimization program. EXADS aids engineer in determining best combination based on knowledge of specific problem and expert knowledge stored in knowledge base. Available in two interactive machine versions. IBM PC version (LAR-13687) written in IQ-LISP. DEC VAX version (LAR-13688) written in Franz-LISP.
Dypas: A dynamic payload scheduler for shuttle missions
NASA Technical Reports Server (NTRS)
Davis, Stephen
1988-01-01
Decision and analysis systems have had broad and very practical application areas in the human decision making process. These software systems range from the help sections in simple accounting packages, to the more complex computer configuration programs. Dypas is a decision and analysis system that aids prelaunch shuttle scheduling, and has added functionality to aid the rescheduling done in flight. Dypas is written in Common Lisp on a Symbolics Lisp machine. Dypas differs from other scheduling programs in that it can draw its knowledge from different rule bases and apply them to different rule interpretation schemes. The system has been coded with Flavors, an object oriented extension to Common Lisp on the Symbolics hardware. This allows implementation of objects (experiments) to better match the problem definition, and allows a more coherent solution space to be developed. Dypas was originally developed to test a programmer's aptitude toward Common Lisp and the Symbolics software environment. Since then the system has grown into a large software effort with several programmers and researchers thrown into the effort. Dypas is currently using two expert systems and three inferencing procedures to generate a many-object schedule. The paper will review the abilities of Dypas and comment on its functionality.
Cross-Compiler for Modeling Space-Flight Systems
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
Ripples is a computer program that makes it possible to specify arbitrarily complex space-flight systems in an easy-to-learn, high-level programming language and to have the specification automatically translated into LibSim, which is a text-based computing language in which such simulations are implemented. LibSim is a very powerful simulation language, but learning it takes considerable time, and it requires that models of systems and their components be described at a very low level of abstraction. To construct a model in LibSim, it is necessary to go through a time-consuming process that includes modeling each subsystem, including defining its fault-injection states, input and output conditions, and the topology of its connections to other subsystems. Ripples makes it possible to describe the same models at a much higher level of abstraction, thereby enabling the user to build models faster and with fewer errors. Ripples can be executed in a variety of computers and operating systems, and can be supplied in either source code or binary form. It must be run in conjunction with a Lisp compiler.
Software Design for Interactive Graphic Radiation Treatment Simulation Systems*
Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan
1990-01-01
We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.
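The CLOS design itself is not reproduced in the abstract; the fragment below is only a minimal sketch of the style it credits (classes plus generic functions), using hypothetical radiotherapy-flavored names rather than anything from the authors' treatment-planning software.

;; Minimal CLOS sketch in the style described above; class and slot names are
;; hypothetical, not taken from the authors' treatment-planning software.
(defclass beam ()
  ((gantry-angle :initarg :gantry-angle :accessor gantry-angle)   ; degrees
   (field-size   :initarg :field-size   :accessor field-size)))   ; (x y) in cm

(defgeneric render (object view)
  (:documentation "Draw OBJECT into the graphic VIEW."))

(defmethod render ((b beam) view)
  (declare (ignore view))
  (format t "Beam at ~A degrees, field ~A cm~%" (gantry-angle b) (field-size b)))

;; (render (make-instance 'beam :gantry-angle 45 :field-size '(10 10)) nil)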
Use of symbolic computation in robotics education
NASA Technical Reports Server (NTRS)
Vira, Naren; Tunstel, Edward
1992-01-01
An application of symbolic computation in robotics education is described. A software package is presented which combines generality, user interaction, and user-friendliness with the systematic usage of symbolic computation and artificial intelligence techniques. The software utilizes MACSYMA, a LISP-based symbolic algebra language, to automatically generate closed-form expressions representing forward and inverse kinematics solutions, the Jacobian transformation matrices, robot pose error-compensation model equations, and Lagrange dynamics formulation for N degree-of-freedom, open chain robotic manipulators. The goal of such a package is to aid faculty and students in the robotics course by removing burdensome tasks of mathematical manipulations. The software package has been successfully tested for its accuracy using commercially available robots.
1981-01-01
...work in the area of artificial intelligence and those used in general program development into a...logic programming with LISP for implementing intelligent data base query systems. Continued developments will allow for enhancements to be made to the
NASA Technical Reports Server (NTRS)
Rogers, J. L.; Barthelemy, J.-F. M.
1986-01-01
An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose. It is difficult for a nonexpert to make this choice. This expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The inference engine and rules are written in LISP; the system contains about 200 rules and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.
Naval Computer-Based Instruction: Cost, Implementation and Effectiveness Issues.
1988-03-01
logical follow on to MITIPAC and are an attempt to use some artificial intelligence (AI) techniques with computer-based training. A good intelligent ...principles of steam plant operation and maintenance. Steamer was written in LISP on a LISP machine in an attempt to use artificial intelligence. "What... Artificial Intelligence and Speech Technology", Electronic Learning, September 1987. Montague, William E., code 5, Navy Personnel Research and
A comparison of CLIPS- and LISP-based approaches to the development of a real-time expert system
NASA Technical Reports Server (NTRS)
Frainier, R.; Groleau, N.; Bhatnagar, R.; Lam, C.; Compton, M.; Colombano, S.; Lai, S.; Szolovits, P.; Manahan, M.; Statler, I.
1990-01-01
This paper describes an ongoing expert system development effort started in 1988 which is evaluating both CLIPS- and LISP-based approaches. The expert system is being developed to a project schedule and is planned for flight on Space Shuttle Mission SLS-2 in 1992. The expert system will help astronauts do the best possible science for a vestibular physiology experiment already scheduled for that mission. The system gathers and reduces data from the experiment, flags 'interesting' results, and proposes changes in the experiment both to exploit the in-flight observations and to stay within the time allowed by Mission Control for the experiment. These tasks must all be performed in real time. Two Apple Macintosh computers are used. The CLIPS- and LISP-based environments are layered above the Macintosh computer Operating System. The 'CLIPS-based' environment includes CLIPS and HyperCard. The LISP-based environment includes Common LISP, Parmenides (a frame system), and FRuleKit (a rule system). Important evaluation factors include ease of programming, performance against real-time requirements, usability by an astronaut, robustness, and ease of maintenance. Current results on the factors of ease of programming, performance against real-time requirements, and ease of maintenance are discussed.
The Julia programming language: the future of scientific computing
NASA Astrophysics Data System (ADS)
Gibson, John
2017-11-01
Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
Clinical expression of developmental coordination disorder in a large Canadian family
Gaines, Robin; Collins, David; Boycott, Kym; Missiuna, Cheryl; DeLaat, Denise; Soucie, Helen
2008-01-01
Previous studies of the phenotype of developmental coordination disorder (DCD) have largely concentrated on population-based samples. The present study reports on an in-depth examination of a large Canadian family with eight children, after three children who were suspected to have DCD were referred for evaluation. Subsequently, five of the six children whose motor impairments could be measured, and the mother, met the diagnostic criteria for DCD as per the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders – fourth edition. The family members diagnosed with DCD showed remarkably similar profiles of motor difficulties. Additionally, the five children diagnosed with DCD had current speech articulation difficulties, with four of them having visited speech/language pathologists; the mother had a lateral lisp. More in-depth testing for three children revealed intact intellectual, academic and language comprehension skills. Three of the children diagnosed with DCD were obese. The present report highlights familial clustering of DCD and the presence of comorbid conditions in the affected children. PMID:19436536
Subsurface Buoy Forms for Array Applications
1990-10-01
CIRCUMSCRIBED CIRCLES Figure 19. Derivation of a cycloid outline with relationship to familiar shape outlines. An AutoLisp routine has been created to...Buoyancy Bulletin, no. 44. Weast, R. C., D. R. Lide, M. J. Astle, and W. H. Beyer. 1990. Handbook of... Hudson, R. G. 1944. Engineers' Manual, 2d ed. An AutoLisp Program Routine Created to Construct the Torospherical Outlines Shown on the Previous Page Is Reproduced Below. (defun c:torodraw () input
Analytical learning and term-rewriting systems
NASA Technical Reports Server (NTRS)
Laird, Philip; Gamble, Evan
1990-01-01
Analytical learning is a set of machine learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several algorithms with this purpose have been suggested, most of which are closely tied to a first-order logical language and are variants of goal regression, such as the familiar explanation-based generalization (EBG) procedure. But because predicate calculus is a poor representation for some domains, these learning algorithms are extended to apply to other computational models. It is shown that the goal regression technique applies to a large family of programming languages, all based on a kind of term rewriting system. Included in this family are three language families of importance to artificial intelligence: logic programming, such as Prolog; lambda calculus, such as LISP; and combinator-based languages, such as FP. A new analytical learning algorithm, AL-2, is exhibited that learns from success but is otherwise quite different from EBG. These results suggest that term rewriting systems are a good framework for analytical learning research in general, and that further research should be directed toward developing new techniques.
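The paper's AL-2 algorithm is not reproduced in the record; as a toy illustration only of the term-rewriting view it builds on, the sketch below applies literal rewrite rules over list-structured terms, with rules and term shapes invented for the example.

;; Toy one-step term rewriter over list-structured terms; rules and term shapes
;; are illustrative only, not the AL-2 algorithm from the paper.
(defun rewrite-once (term rules)
  "Apply the first matching (PATTERN . REPLACEMENT) rule at the root of TERM;
patterns here are literal terms, so this is rewriting without variables."
  (let ((rule (assoc term rules :test #'equal)))
    (cond (rule (cdr rule))
          ((consp term)
           (cons (rewrite-once (car term) rules)
                 (rewrite-once (cdr term) rules)))
          (t term))))

;; Example: simplify (+ x 0) to x wherever it appears literally.
;; (rewrite-once '(* (+ x 0) y) '(((+ x 0) . x)))  =>  (* X Y)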
A Comparative Study : Microprogrammed Vs Risc Architectures For Symbolic Processing
NASA Astrophysics Data System (ADS)
Heudin, J. C.; Metivier, C.; Demigny, D.; Maurin, T.; Zavidovique, B.; Devos, F.
1987-05-01
It is often claimed that conventional computers are not well suited for human-like tasks: Vision (Image Processing), Intelligence (Symbolic Processing)... In the particular case of Artificial Intelligence, dynamic type-checking is one example of a basic task that must be improved. The solution implemented in most Lisp workstations consists in a microprogrammed architecture with a tagged memory. Another way to gain efficiency is to design a well-suited instruction set for symbolic processing, which reduces the semantic gap between the high-level language and the machine code. In this framework, the RISC concept provides a convenient approach to study new architectures for symbolic processing. This paper compares both approaches and describes our project of designing a compact symbolic processor for Artificial Intelligence applications.
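To make the dynamic type-checking point concrete, the generic example below (not taken from the paper) shows the kind of run-time type dispatch that tagged Lisp hardware is designed to accelerate: every call must inspect the type of its argument before choosing an operation.

;; Illustration of the dynamic type checks that Lisp hardware tags accelerate:
;; each call must inspect the run-time type of X before choosing an operation.
;; (Generic example; not code from the paper.)
(defun describe-value (x)
  (typecase x
    (fixnum  (format nil "small integer ~D" x))
    (float   (format nil "float ~F" x))
    (string  (format nil "string of length ~D" (length x)))
    (cons    (format nil "list with ~D elements" (length x)))
    (t       (format nil "object of type ~A" (type-of x)))))

;; (describe-value 42)        => "small integer 42"
;; (describe-value '(a b c))  => "list with 3 elements"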
Manchester visual query language
NASA Astrophysics Data System (ADS)
Oakley, John P.; Davis, Darryl N.; Shann, Richard T.
1993-04-01
We report a database language for visual retrieval which allows queries on image feature information which has been computed and stored along with images. The language is novel in that it provides facilities for dealing with feature data which has actually been obtained from image analysis. Each line in the Manchester Visual Query Language (MVQL) takes a set of objects as input and produces another, usually smaller, set as output. The MVQL constructs are mainly based on proven operators from the field of digital image analysis. An example is the Hough-group operator which takes as input a specification for the objects to be grouped, a specification for the relevant Hough space, and a definition of the voting rule. The output is a ranked list of high scoring bins. The query could be directed towards one particular image or an entire image database, in the latter case the bins in the output list would in general be associated with different images. We have implemented MVQL in two layers. The command interpreter is a Lisp program which maps each MVQL line to a sequence of commands which are used to control a specialized database engine. The latter is a hybrid graph/relational system which provides low-level support for inheritance and schema evolution. In the paper we outline the language and provide examples of useful queries. We also describe our solution to the engineering problems associated with the implementation of MVQL.
Fenwick, Matthew; Sesanker, Colbert; Schiller, Martin R.; Ellis, Heidi JC; Hinman, M. Lee; Vyas, Jay; Gryk, Michael R.
2012-01-01
Scientists are continually faced with the need to express complex mathematical notions in code. The renaissance of functional languages such as LISP and Haskell is often credited to their ability to implement complex data operations and mathematical constructs in an expressive and natural idiom. The slow adoption of functional computing in the scientific community does not, however, reflect the congeniality of these fields. Unfortunately, the learning curve for adoption of functional programming techniques is steeper than that for more traditional languages in the scientific community, such as Python and Java, and this is partially due to the relative sparseness of available learning resources. To fill this gap, we demonstrate and provide applied, scientifically substantial examples of functional programming. We present a multi-language source-code repository for software integration and algorithm development, which generally focuses on the fields of machine learning, data processing, and bioinformatics. We encourage scientists who are interested in learning the basics of functional programming to adopt, reuse, and learn from these examples. The source code is available at: https://github.com/CONNJUR/CONNJUR-Sandbox (see also http://www.connjur.org). PMID:25328913
Fenwick, Matthew; Sesanker, Colbert; Schiller, Martin R; Ellis, Heidi Jc; Hinman, M Lee; Vyas, Jay; Gryk, Michael R
2012-01-01
Scientists are continually faced with the need to express complex mathematical notions in code. The renaissance of functional languages such as LISP and Haskell is often credited to their ability to implement complex data operations and mathematical constructs in an expressive and natural idiom. The slow adoption of functional computing in the scientific community does not, however, reflect the congeniality of these fields. Unfortunately, the learning curve for adoption of functional programming techniques is steeper than that for more traditional languages in the scientific community, such as Python and Java, and this is partially due to the relative sparseness of available learning resources. To fill this gap, we demonstrate and provide applied, scientifically substantial examples of functional programming. We present a multi-language source-code repository for software integration and algorithm development, which generally focuses on the fields of machine learning, data processing, and bioinformatics. We encourage scientists who are interested in learning the basics of functional programming to adopt, reuse, and learn from these examples. The source code is available at: https://github.com/CONNJUR/CONNJUR-Sandbox (see also http://www.connjur.org).
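As a small, self-contained example of the functional style these two records advocate (higher-order functions and no mutation), the Common Lisp sketch below is illustrative only and is not drawn from the CONNJUR-Sandbox repository.

;; Small functional-style example (higher-order functions, no mutation); it is
;; illustrative only and not drawn from the CONNJUR-Sandbox repository.
(defun gc-content (sequence)
  "Fraction of G and C bases in SEQUENCE, a string such as \"ATGCGC\"."
  (/ (count-if (lambda (base) (member base '(#\G #\C) :test #'char-equal))
               sequence)
     (length sequence)))

(defun mean-gc (sequences)
  "Average GC content over a list of sequences."
  (/ (reduce #'+ (mapcar #'gc-content sequences))
     (length sequences)))

;; (mean-gc '("ATGC" "GGCC"))  =>  3/4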
1989-08-01
report demonstrates how flavors (object-oriented programming in Franz is carried out via flavors) can be used for this programming. Different approaches...data structures that are part of Franz LISP. A method is a procedure that is invoked by a message to a flavor instance. The method triggered depends...keywordize is a procedure used to intern the :set-op name into the keyword package so that the flavor features of Franz recognize this operation. An
Analysis on the workspace of palletizing robot based on AutoCAD
NASA Astrophysics Data System (ADS)
Li, Jin-quan; Zhang, Rui; Guan, Qi; Cui, Fang; Chen, Kuan
2017-10-01
In this paper, a four-degree-of-freedom articulated palletizing robot is used as the object of research. Based on the analysis of the overall configuration of the robot, the kinematic mathematical model is established by the D-H method to determine the workspace of the robot. In order to meet the needs of design and analysis, AutoCAD secondary development technology and the AutoLisp language are used to develop an AutoCAD-based 2D and 3D workspace simulation interface program for the palletizing robot. Finally, using this AutoCAD plugin, the influence of structural parameters on the shape and position of the workspace is analyzed when the structural parameters of the robot are changed separately. This study laid the foundation for the design, control and planning of palletizing robots.
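For reference, the standard Denavit-Hartenberg link transform on which such a kinematic model is built is given below; the palletizing robot's specific joint parameters are not provided in the abstract, so only the general form is shown.

% Standard Denavit-Hartenberg link transform used to build such kinematic models;
% the robot's specific joint parameters are not given in the abstract.
{}^{i-1}T_{i} =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i &  \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i &  \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0            &  \sin\alpha_i             &  \cos\alpha_i             & d_i             \\
0            &  0                        &  0                        & 1
\end{bmatrix},
\qquad
{}^{0}T_{4} = {}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}T_{3}\,{}^{3}T_{4}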
The Mission Operations Planning Assistant
NASA Technical Reports Server (NTRS)
Schuetzle, James G.
1987-01-01
The Mission Operations Planning Assistant (MOPA) is a knowledge-based system developed to support the planning and scheduling of instrument activities on the Upper Atmospheric Research Satellite (UARS). The MOPA system represents and maintains instrument plans at two levels of abstraction in order to keep plans comprehensible to both UARS Principal Investigators and Command Management personnel. The hierarchical representation of plans also allows MOPA to automatically create detailed instrument activity plans from which spacecraft command loads may be generated. The MOPA system was developed on a Symbolics 3640 computer using the ZetaLisp and ART languages. MOPA's features include a textual and graphical interface for plan inspection and modification, recognition of instrument operational constraint violations during the planning process, and consistency maintenance between the different planning levels. This paper describes the current MOPA system.
Vertical interincisal trespass assessment in children with speech disorders.
Sahad, Marcelo de Gouveia; Nahás, Ana Carla Raphaelli; Scavone-Junior, Helio; Jabur, Luciana Badra; Guedes-Pinto, Eduardo
2008-01-01
Through a transversal epidemiological study, conducted with 333 Brazilian children, males (157) and females (176), aged 3 to 6 years old, enrolled in a public preschool, this study aimed to evaluate the prevalence of the different types of vertical interincisal trespass (VIT) and the relationship between these occlusal aspects and anterior lisping and/or anterior tongue thrust in the articulation of the lingua-alveolar phonemes /t/, /d/, /n/ and /l/. All children involved were submitted to a VIT examination and to a speech evaluation. Statistical significance was analyzed through the Chi-square test, at a significance level of 0.05 (95% confidence limit). The quantitative analysis of the data demonstrated the following prevalences: 1 - the different types of VIT: 48.3% for normal overbite (NO), 22.5% for deep overbite (DO), 9.3% for edge to edge (ETE) and 19.8% for open bite (OB); 2 - interdental lisping in relation to the different types of VIT: 42% for NO, 12.5% for DO, 12.5% for ETE, 32.9% for OB; and 3 - children with anterior tongue thrust in the articulation of lingua-alveolar phonemes in relation to the different types of VIT: 42.1% for NO, 14% for DO, 10.5% for ETE, 33.3% for OB. The results demonstrated that there was a significant relationship between open bite and anterior lisping and/or anterior tongue thrust in the articulation of the lingua-alveolar phonemes /t/, /d/, /n/ and /l/; and that there was a significant relationship between deep overbite and the absence of anterior lisping and anterior tongue thrust in the articulation of the lingua-alveolar phonemes.
TLB for Free: In-Cache Address Translation for a Multiprocessor Workstation
1985-05-13
LISZT Franz LISP self-compilation 0.6Mb 145 VAXIMA Algebraic expert system (a derivative of MACSYMA) 1.7Mb 414 CS100K Two VAXIMA streams...first four were gathered on a VAX running UNIX with an address and instruction tracer [Henr84]. LISZT is the Franz LISP compiler compiling itself...(Collisions) (PTE Misses) LISZT 0.584 0.609 0.025 (4.3%) (0.009) (0.016) VAXIMA 1.855 1.885 0.030 (1.6%) (0.004) (0.026) CS100K 2.214 2.260
NASA Technical Reports Server (NTRS)
Litt, Jonathan; Wong, Edmond; Simon, Donald L.
1994-01-01
A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummel, K.E.
1987-12-01
Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Both backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.
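HERB's own interpreter is not shown in the record; the sketch below is a minimal toy forward-chaining cycle of the general kind the description implies (match rule conditions against facts, then assert conclusions), with facts, rules, and function names invented for the example rather than taken from HERB.

;; Toy forward-chaining cycle of the general kind described above (match, then
;; fire); it is a minimal sketch and not taken from the HERB system itself.
(defparameter *facts* '((engine hot) (oil-pressure low)))

(defparameter *rules*
  '(((engine hot) (oil-pressure low) => (shut-down engine))))

(defun fire-rules (facts rules)
  "Return FACTS augmented with the conclusion of every rule whose conditions
all appear in FACTS.  Rules have the form (cond1 ... condN => conclusion)."
  (dolist (rule rules facts)
    (let* ((arrow (position '=> rule))
           (conditions (subseq rule 0 arrow))
           (conclusion (nth (1+ arrow) rule)))
      (when (and (every (lambda (c) (member c facts :test #'equal)) conditions)
                 (not (member conclusion facts :test #'equal)))
        (push conclusion facts)))))

;; (fire-rules *facts* *rules*)
;;   => ((SHUT-DOWN ENGINE) (ENGINE HOT) (OIL-PRESSURE LOW))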
Techniques and implementation of the embedded rule-based expert system using Ada
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Jones, Robert E.
1991-01-01
Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada with its portability, transportability, and maintainability lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.
Model Checking JAVA Programs Using Java Pathfinder
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Pressburger, Thomas
2000-01-01
This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen in a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
An expert system for prediction of chemical toxicity
Hickey, James P.; Aldridge, Andrew J.; Passino-Reader, Dora R.; Frank, Anthony M.
1992-01-01
The National Fisheries Research Center-Great Lakes has developed an interactive computer program that uses the structure of an organic molecule to predict its acute toxicity to four aquatic species. The expert system software, written in the muLISP language, identifies the skeletal structures and substituent groups of an organic molecule from a user-supplied standard chemical notation known as a SMILES string, and then generates values for four solvatochromic parameters. Multiple regression equations relate these parameters to the toxicities (expressed as log10LC50s and log10EC50s, along with 95% confidence intervals) for four species. The system is demonstrated by prediction of toxicity for anilide-type pesticides to the fathead minnow (Pimephales promelas). This software is designed for use on an IBM-compatible personal computer by personnel with minimal toxicology background for rapid estimation of chemical toxicity. The system has numerous applications, with much potential for use in the pharmaceutical industry.
Rice-obot 1: An intelligent autonomous mobile robot
NASA Technical Reports Server (NTRS)
Defigueiredo, R.; Ciscon, L.; Berberian, D.
1989-01-01
The Rice-obot I is the first in a series of Intelligent Autonomous Mobile Robots (IAMRs) being developed at Rice University's Cooperative Intelligent Mobile Robots (CIMR) lab. The Rice-obot I is mainly designed to be a testbed for various robotic and AI techniques, and a platform for developing intelligent control systems for exploratory robots. Researchers present the need for a generalized environment capable of combining all of the control, sensory and knowledge systems of an IAMR. They introduce Lisp-Nodes as such a system, and develop the basic concepts of nodes, messages and classes. Furthermore, they show how the control system of the Rice-obot I is implemented as sub-systems in Lisp-Nodes.
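The record names nodes, messages, and classes as the basic Lisp-Nodes concepts but gives no code; the CLOS fragment below is only a guess at the flavor of such a design, with every class, slot, and function name invented for illustration rather than taken from the Lisp-Nodes system.

;; Invented CLOS fragment suggesting the node/message style the abstract names;
;; it is a guess at the flavor of such a design, not the Lisp-Nodes code itself.
(defclass node ()
  ((name  :initarg :name  :reader node-name)
   (inbox :initform '()   :accessor node-inbox)))

(defclass sensor-node (node) ())
(defclass control-node (node) ())

(defgeneric handle-message (node message)
  (:documentation "Process one MESSAGE (a plist) delivered to NODE."))

(defmethod handle-message ((n control-node) message)
  (format t "~A received ~S~%" (node-name n) message))

(defun send-message (from to &rest message)
  "Queue MESSAGE from node FROM on node TO's inbox, then let TO handle it."
  (declare (ignore from))
  (push message (node-inbox to))
  (handle-message to message))

;; (send-message (make-instance 'sensor-node :name "sonar")
;;               (make-instance 'control-node :name "pilot")
;;               :type :range :value 2.4)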
NEO-LISP: Deflecting near-Earth objects using high average power, repetitively pulsed lasers
NASA Astrophysics Data System (ADS)
Phipps, C. R.; Michaelis, M. M.
Several kinds of Near-Earth objects exist for which one would like to cause modest orbit perturbations, but which are inaccessible to normal means of interception because of their number, distance or the lack of early warning. For these objects, LISP (Laser Impulse Space Propulsion) is an appropriate technique for rapidly applying the required mechanical impulse from a ground-based station. In order of increasing laser energy required, examples are: (1) repositioning specially prepared geosynchronous satellites for an enhanced lifetime; (2) causing selected items of space junk to re-enter and burn up in the atmosphere on a computed trajectory; and (3) safely deflecting Earth-directed comet nuclei and earth-crossing asteroids (ECA's) a few tens of meters in size (the most hazardous size). They will discuss each of these problems in turn and show that each application is best matched by its own matrix of LISP laser pulse width, pulse repetition rate, wavelength and average power. The latter ranges from 100W to 3GW for the cases considered. They will also discuss means of achieving the active beam phase error correction during passage through the atmosphere and very large exit pupil in the optical system which are required in each of these cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
AISL-CRYPTO is a library of cryptography functions supporting other AISL software. It provides various crypto functions for Common Lisp, including Digital Signature Algorithm, Data Encryption Standard, Secure Hash Algorithm, and public-key cryptography.
Parallel Ada benchmarks for the SVMS
NASA Technical Reports Server (NTRS)
Collard, Philippe E.
1990-01-01
The use of the parallel processing paradigm to design and develop faster and more reliable computers appears to clearly mark the future of information processing. NASA started the development of such an architecture: the Spaceborne VHSIC Multi-processor System (SVMS). Ada will be one of the languages used to program the SVMS. One of the unique characteristics of Ada is that it supports parallel processing at the language level through the tasking constructs. It is important for the SVMS project team to assess how efficiently the SVMS architecture will be implemented, as well as how efficiently the Ada environment will be ported to the SVMS. AUTOCLASS II, a Bayesian classifier written in Common Lisp, was selected as one of the benchmarks for SVMS configurations. The purpose of the R and D effort was to provide the SVMS project team with the version of AUTOCLASS II, written in Ada, that would make use of Ada tasking constructs as much as possible so as to constitute a suitable benchmark. Additionally, a set of programs was developed that would measure Ada tasking efficiency on parallel architectures as well as determine the critical parameters influencing tasking efficiency. All this was designed to provide the SVMS project team with a set of suitable tools in the development of the SVMS architecture.
Architecture Adaptive Computing Environment
NASA Technical Reports Server (NTRS)
Dorband, John E.
2006-01-01
Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple- instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.
Speech sound disorders in a community study of preschool children.
McLeod, Sharynne; Harrison, Linda J; McAllister, Lindy; McCormack, Jane
2013-08-01
To undertake a community (nonclinical) study to describe the speech of preschool children who had been identified by parents/teachers as having difficulties "talking and making speech sounds" and compare the speech characteristics of those who had and had not accessed the services of a speech-language pathologist (SLP). Stage 1: Parent/teacher concern regarding the speech skills of 1,097 4- to 5-year-old children attending early childhood centers was documented. Stage 2a: One hundred forty-three children who had been identified with concerns were assessed. Stage 2b: Parents returned questionnaires about service access for 109 children. The majority of the 143 children (86.7%) achieved a standard score below the normal range for the percentage of consonants correct (PCC) on the Diagnostic Evaluation of Articulation and Phonology (Dodd, Hua, Crosbie, Holm, & Ozanne, 2002). Consonants produced incorrectly were consistent with the late-8 phonemes ( Shriberg, 1993). Common phonological patterns were fricative simplification (82.5%), cluster simplification (49.0%)/reduction (19.6%), gliding (41.3%), and palatal fronting (15.4%). Interdental lisps on /s/ and /z/ were produced by 39.9% of the children, dentalization of other sibilants by 17.5%, and lateral lisps by 13.3%. Despite parent/teacher concern, only 41/109 children had contact with an SLP. These children were more likely to be unintelligible to strangers, to express distress about their speech, and to have a lower PCC and a smaller consonant inventory compared to the children who had no contact with an SLP. A significant number of preschool-age children with speech sound disorders (SSD) have not had contact with an SLP. These children have mild-severe SSD and would benefit from SLP intervention. Integrated SLP services within early childhood communities would enable earlier identification of SSD and access to intervention to reduce potential educational and social impacts affiliated with SSD.
The Katydid system for compiling KEE applications to Ada
NASA Technical Reports Server (NTRS)
Filman, Robert E.; Bock, Conrad; Feldman, Roy
1990-01-01
Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.
Building a High Performance Metadata Broker using Clojure, NoSQL and Message Queues
NASA Astrophysics Data System (ADS)
Truslove, I.; Reed, S.
2013-12-01
In practice, Earth and Space Science Informatics often relies on getting more done with less: fewer hardware resources, less IT staff, fewer lines of code. As a capacity-building exercise focused on rapid development of high-performance geoinformatics software, the National Snow and Ice Data Center (NSIDC) built a prototype metadata brokering system using a new JVM language, modern database engines and virtualized or cloud computing resources. The metadata brokering system was developed with the overarching goals of (i) demonstrating a technically viable product with as little development effort as possible, (ii) using very new yet very popular tools and technologies in order to get the most value from the least legacy-encumbered code bases, and (iii) being a high-performance system by using scalable subcomponents, and implementation patterns typically used in web architectures. We implemented the system using the Clojure programming language (an interactive, dynamic, Lisp-like JVM language), Redis (a fast in-memory key-value store) as both the data store for original XML metadata content and as the provider for the message queueing service, and ElasticSearch for its search and indexing capabilities to generate search results. On evaluating the results of the prototyping process, we believe that the technical choices did in fact allow us to do more for less, due to the expressive nature of the Clojure programming language and its easy interoperability with Java libraries, and the successful reuse or re-application of high performance products or designs. This presentation will describe the architecture of the metadata brokering system, cover the tools and techniques used, and describe lessons learned, conclusions, and potential next steps.
CLIPS: A proposal for improved usability
NASA Technical Reports Server (NTRS)
Patton, Charles R.
1990-01-01
This paper proposes the enhancement of the CLIPS user interface to improve the over-all usability of the CLIPS development environment. It suggests some directions for the long term growth of the user interface, and discusses some specific strengths and weaknesses of the current CLIPS PC user interface. Every user of CLIPS shares a common experience: his/her first interaction with the system itself. As with any new language, between the process of installing CLIPS on the appropriate computer and the completion of a large application, an intensive learning process takes place. For those with extensive programming knowledge and LISP backgrounds, this experience may have been mostly interesting and pleasant. Being familiar with products that are similar to CLIPS in many ways, these users enjoy a relatively short training period with the product. Already familiar with many of the functions they wish to employ, experienced users are free to focus on the capabilities of CLIPS that make it uniquely useful within their working environment.
Vision Guided Intelligent Robot Design And Experiments
NASA Astrophysics Data System (ADS)
Slutzky, G. D.; Hall, E. L.
1988-02-01
The concept of an intelligent robot is an important topic combining sensors, manipulators, and artificial intelligence to design a useful machine. Vision systems, tactile sensors, proximity switches and other sensors provide the elements necessary for simple game playing as well as industrial applications. These sensors permit adaptation to a changing environment. The AI techniques permit advanced forms of decision making, adaptive responses, and learning while the manipulator provides the ability to perform various tasks. Computer languages such as LISP and OPS5 have been utilized to achieve expert-system approaches to solving real-world problems. The purpose of this paper is to describe several examples of visually guided intelligent robots including both stationary and mobile robots. Demonstrations will be presented of a system for constructing and solving a popular peg game, a robot lawn mower, and a box stacking robot. The experience gained from these and other systems provides insight into what may be realistically expected from the next generation of intelligent machines.
Teachers and artificial intelligence. The Logo connection.
Merbler, J B
1990-12-01
This article describes a three-phase program for training special education teachers to teach Logo and artificial intelligence. Logo is derived from the LISP computer language and is relatively simple to learn and use, and it is argued that these factors make it an ideal tool for classroom experimentation in basic artificial intelligence concepts. The program trains teachers to develop simple demonstrations of artificial intelligence using Logo. The material that the teachers learn to teach is suitable as an advanced-level topic for intermediate- through secondary-level students enrolled in computer competency or similar courses. The material emphasizes problem-solving and thinking skills using a nonverbal expressive medium (Logo), and thus it is deemed especially appropriate for hearing-impaired children. It is also sufficiently challenging for academically talented children, whether hearing or deaf. Although the notion of teachers as programmers is controversial, Logo is relatively easy to learn, has direct implications for education, and has been found to be an excellent tool for empowerment for both teachers and children.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is twofold: first, to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem; second, to work with the KSC GPSS development team and provide enhancements to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering groups as a building block, we have implemented the resource deconfliction portion of GPSS in Common LISP using its object-oriented features. The prototype corrects and extends some of the deficiencies of the current production version, plus it uses and builds on the classes from the development team's profile prototype.
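The abstract does not include the prototype's code; a minimal CLOS sketch of resource deconfliction under simple assumptions (unit-capacity resources, half-open time intervals, hypothetical names) might look like the following.

(defclass resource ()
  ((name     :initarg :name :reader resource-name)
   (bookings :initform nil  :accessor resource-bookings
             :documentation "List of (start . end) intervals already scheduled.")))

(defun intervals-overlap-p (a b)
  "True when half-open intervals A and B, each a (start . end) cons, overlap."
  (and (< (car a) (cdr b)) (< (car b) (cdr a))))

(defgeneric conflict-p (resource interval)
  (:documentation "True if INTERVAL collides with an existing booking."))

(defmethod conflict-p ((r resource) interval)
  (some (lambda (booked) (intervals-overlap-p booked interval))
        (resource-bookings r)))

(defmethod book ((r resource) interval)
  "Book INTERVAL on R unless it conflicts; return T on success."
  (unless (conflict-p r interval)
    (push interval (resource-bookings r))
    t))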
KINKFOLD—an AutoLISP program for construction of geological cross-sections using borehole image data
NASA Astrophysics Data System (ADS)
Özkaya, Sait Ismail
2002-04-01
KINKFOLD is an AutoLISP program designed to construct geological cross-sections from borehole image or dip meter logs. The program uses the kink-fold method for cross-section construction. Beds are folded around hinge lines as angle bisectors so that bedding thickness remains unchanged. KINKFOLD may be used to model a wide variety of parallel fold structures, including overturned and faulted folds, and folds truncated by unconformities. The program accepts data from vertical or inclined boreholes. The KINKFOLD program cannot be used to model fault drag, growth folds, inversion structures or disharmonic folds where the bed thickness changes either because of deformation or deposition. Faulted structures and similar folds can be modelled by KINKFOLD by omitting dip measurements within fault drag zones and near axial planes of similar folds.
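KINKFOLD's AutoLISP source is not reproduced in the abstract; the following is a minimal Common Lisp sketch of the kink-fold rule it relies on, namely that the axial (hinge) surface bisects the angle between adjacent bed segments so that bed thickness is preserved. Dip angles are treated as signed inclinations in the section plane, and all names are illustrative.

;; One of the two bisectors of the bed traces; which bisector corresponds
;; to the interlimb angle depends on the fold geometry.
(defun axial-trace-angle (dip-1 dip-2)
  "Inclination (degrees, signed in the section plane) of a kink axial
surface bisecting bed segments with inclinations DIP-1 and DIP-2."
  (/ (+ dip-1 dip-2) 2.0))

;; Example: a limb dipping 20 degrees meets a limb dipping 60 degrees the
;; opposite way; the hinge trace lies midway between the two attitudes.
(axial-trace-angle 20.0 -60.0)  ; => -20.0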
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdulmadjid, Syahrun Nur, E-mail: syahrun-madjid@yahoo.com; Lahna, Kurnia, E-mail: kurnialahna@gmail.com; Desiyana, Lydia Septa, E-mail: lydia-septa@yahoo.com
2016-03-11
An experimental study has been performed to examine the physical characteristics of pharmaceutical products, such as tablets, by employing an emission plasma induced by an Nd-YAG laser at a low pressure of helium gas. The hardness of a tablet is one of the parameters that are examined during the production process for standard quality of pharmaceutical products. In Laser-Induced Shock Wave Plasma Spectroscopy (LISPS), the shock wave has a significant role in inducing atomic excitation. It is known that the speed of the shock wavefront depends on the hardness of the sample, and it correlates with the ionization rate of the ablated atoms. The hardness of the tablet is examined using the intensity ratio between the ion of Mg (II) 275.2 nm and the neutral of Mg (I) 285.2 nm emission lines detected from the laser-induced plasma. It was observed that the ratio changes with respect to the change in the tablet hardness, namely, the ratio is higher for harder tablets. Besides the ratio measurements, we also measured the depth profile of a tablet by focusing 60 shots of irradiation of laser light at a fixed position on the surface of the tablet. It was found that the depth profile varies differently with the hardness of the tablet. These experimental results show that the technique of LISPS can be applied to examine the quality of pharmaceutical products.
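The hardness indicator described above is a simple line-intensity ratio; a short Common Lisp sketch of computing it from a sampled spectrum follows. The helper names and sampling window are assumptions, while the 275.2 nm / 285.2 nm line pair comes from the abstract.

;; SPECTRUM is a list of (wavelength-nm . intensity) pairs.
(defun peak-intensity (spectrum center &key (window 0.3))
  "Maximum intensity within +/- WINDOW nm of CENTER (assumes samples exist there)."
  (loop for (wavelength . intensity) in spectrum
        when (<= (abs (- wavelength center)) window)
          maximize intensity))

(defun mg-ion-neutral-ratio (spectrum)
  "Ratio of the Mg II 275.2 nm line to the Mg I 285.2 nm line, the quantity
reported to increase with tablet hardness."
  (/ (peak-intensity spectrum 275.2)
     (peak-intensity spectrum 285.2)))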
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
Developing an Intelligent Computer-Aided Trainer
NASA Technical Reports Server (NTRS)
Hua, Grace
1990-01-01
The Payload-assist module Deploys/Intelligent Computer-Aided Training (PD/ICAT) system was developed as a prototype for intelligent tutoring systems with the intention of seeing PD/ICAT evolve and produce a general ICAT architecture and development environment that can be adapted to a wide variety of training tasks. The proposed architecture is composed of a user interface, a domain expert, a training session manager, a trainee model and a training scenario generator. The PD/ICAT prototype was developed in the LISP environment. Although it has been well received by its peers and users, it could not be delivered to its end users for practical use because of specific hardware and software constraints. To facilitate delivery of PD/ICAT to its users and to prepare for a more widely accepted development and delivery environment for future ICAT applications, we have ported this training system to a UNIX workstation and adopted use of a conventional language, C, and a C-based rule-based language, CLIPS. A rapid conversion of the PD/ICAT expert system to CLIPS was possible because the knowledge was basically represented as a forward-chaining rule base. The resulting CLIPS rule base has been tested successfully in other ICATs as well. Therefore, the porting effort has proven to be a positive step toward our ultimate goal of building a general-purpose ICAT development environment.
DG TO FT - AUTOMATIC TRANSLATION OF DIGRAPH TO FAULT TREE MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Each model has its advantages. While digraphs can be derived in a fairly straightforward manner from system schematics and knowledge about component failure modes and system design, fault tree structure allows for fast processing using efficient techniques developed for tree data structures. The similarities between digraphs and fault trees permit the information encoded in the digraph to be translated into a logically equivalent fault tree. The DG TO FT translation tool will automatically translate digraph models, including those with loops or cycles, into fault tree models that have the same minimum cut set solutions as the input digraph. This tool could be useful, for example, if some parts of a system have been modeled using digraphs and others using fault trees. The digraphs could be translated and incorporated into the fault trees, allowing them to be analyzed using a number of powerful fault tree processing codes, such as cut set and quantitative solution codes. A cut set for a given node is a group of failure events that will cause the failure of the node. A minimum cut set for a node is a cut set such that, if any of the failures in the set were removed, the remaining failures would no longer cause the failure of the event represented by the node. Cut set calculations can be used to find dependencies, weak links, and vital system components whose failures would cause serious system failures. The DG TO FT translation system reads in a digraph with each node listed as a separate object in the input file. The user specifies a terminal node for the digraph that will be used as the top node of the resulting fault tree. A fault tree basic event node representing the failure of that digraph node is created and becomes a child of the terminal root node. A subtree is created for each of the inputs to the digraph terminal node and the roots of those subtrees are added as children of the top node of the fault tree. Every node in the digraph upstream of the terminal node will be visited and converted. During the conversion process, the algorithm keeps track of the path from the digraph terminal node to the current digraph node. If a node is visited twice, then the program has found a cycle in the digraph. This cycle is broken by finding the minimal cut sets of the twice-visited digraph node and forming those cut sets into subtrees. Another implementation of the algorithm resolves loops by building a subtree based on the digraph minimal cut sets calculation. It does not reduce the subtree to minimal cut set form. This second implementation produces larger fault trees, but runs much faster than the version using minimal cut sets since it does not spend time reducing the subtrees to minimal cut sets. The fault trees produced by DG TO FT will contain OR gates, AND gates, Basic Event nodes, and NOP gates. The results of a translation can be output as a text object description of the fault tree similar to the text digraph input format. The translator can also output a LISP language formatted file and an augmented LISP file which can be used by the FTDS (ARC-13019) diagnosis system, available from COSMIC, which performs diagnostic reasoning using the fault tree as a knowledge base. DG TO FT is written in C-language to be machine independent.
It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. DG TO FT is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is provided on the distribution medium. DG TO FT was developed in 1992. Sun, and SunOS are trademarks of Sun Microsystems, Inc. DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc. System 7 is a trademark of Apple Computers Inc. Microsoft Word is a trademark of Microsoft Corporation.
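DG TO FT itself is written in C and its source is not shown here; the following Common Lisp sketch captures only the core traversal described above: walk upstream from the chosen terminal node, turn each digraph node into an OR gate over its own basic event and the subtrees of its inputs, and break cycles by refusing to revisit a node already on the current path. The digraph representation and all names are assumptions, AND nodes are omitted, and the simple pruning shown stands in for the tool's minimal-cut-set treatment of loops.

;; Digraph: hash table mapping a node name to the list of its input nodes.
;; Fault tree: nested lists of the form (:or ...) and (:event name).
(defun digraph->fault-tree (digraph terminal)
  "Translate DIGRAPH (node -> inputs) into a fault tree rooted at TERMINAL."
  (labels ((convert (node path)
             (cond
               ;; Node already on the current path: a cycle.  The real tool
               ;; resolves this with minimal cut sets; here we simply prune.
               ((member node path) nil)
               (t
                (let* ((inputs (gethash node digraph))
                       (subtrees (remove nil
                                         (mapcar (lambda (in)
                                                   (convert in (cons node path)))
                                                 inputs))))
                  ;; Failure of NODE = its own basic event OR failure
                  ;; propagated from any of its inputs.
                  (if subtrees
                      `(:or (:event ,node) ,@subtrees)
                      `(:event ,node)))))))
    (convert terminal nil)))

;; Tiny example: C fails if C's own basic event occurs, or if A or B fail.
(let ((g (make-hash-table)))
  (setf (gethash 'c g) '(a b)
        (gethash 'a g) '()
        (gethash 'b g) '(a))
  (digraph->fault-tree g 'c))
;; => (:OR (:EVENT C) (:EVENT A) (:OR (:EVENT B) (:EVENT A)))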
An expert system for wind shear avoidance
NASA Technical Reports Server (NTRS)
Stengel, Robert F.; Stratton, D. Alexander
1990-01-01
A study of intelligent guidance and control concepts for protecting against the adverse effects of wind shear during aircraft takeoffs and landings is being conducted, with current emphasis on developing an expert system for wind shear avoidance. Principal objectives are to develop methods for assessing the likelihood of wind shear encounter (based on real-time information in the cockpit), for deciding what flight path to pursue (e.g., takeoff abort, landing go-around, or normal climbout or glide slope), and for using the aircraft's full potential for combating wind shear. This study requires the definition of both deterministic and statistical techniques for fusing internal and external information, for making go/no-go decisions, and for generating commands for manually controlled flight. The program has begun with the development of the WindShear Safety Advisor, an expert system for pilot aiding that is based on the FAA Windshear Training Aid, a two-volume manual that presents an overview, pilot guide, training program, and substantiating data; this manual provides guidelines for the initial development. The WindShear Safety Advisor expert system currently contains over 200 rules and is coded in the LISP programming language.
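The abstract describes the advisor's tasks (assessing encounter likelihood and making go/no-go flight-path decisions) but shows none of its rules; a toy Common Lisp rule of that general shape, with hypothetical inputs, thresholds, and phase names, is sketched below.

;; Illustrative only: not one of the WindShear Safety Advisor's ~200 rules.
(defun windshear-advice (microburst-alert-p phase altitude-ft)
  "Crude go/no-go advice from cockpit information; all inputs hypothetical."
  (cond ((and microburst-alert-p (eq phase :takeoff-roll)) :abort-takeoff)
        ((and microburst-alert-p (eq phase :approach) (< altitude-ft 1000))
         :go-around)
        (microburst-alert-p :delay-and-reassess)
        (t :continue)))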
Variable Temperature Scanning Tunneling Microscopy
1991-07-01
Tomazin, both Electrical Engineering. Build a digital integrator for the STM feedback loop: Kyle Drewry, Electrical Engineering. Write an AutoLisp program to automate the AutoCAD design of UHV-STM chambers: Alfred Pierce (minority), Mechanical Engineering. Design a 32-bit interface board for the EISA
Instructional Aspects of Intelligent Tutoring Systems.
ERIC Educational Resources Information Center
Pieters, Jules M., Ed.
This collection contains three papers addressing the instructional aspects of intelligent tutoring systems (ITS): (1) "Some Experiences with Two Intelligent Tutoring Systems for Teaching Computer Programming: Proust and the LISP-Tutor" (van den Berg, Merrienboer, and Maaswinkel); (2) "Some Issues on the Construction of Cooperative…
Parallelization of Rocket Engine Simulator Software (PRESS)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1998-01-01
We have outlined our work in the last half of the funding period. We have shown how a demo package for RESSAP using MPI can be done. However, we also mentioned the difficulties with the UNIX platform. We have reiterated some of the suggestions made during the presentation of the progress of the project at the Fourth Annual HBCU Conference. Although we have discussed, in some detail, how TURBDES/PUMPDES software can be run in parallel using MPI, at present, we are unable to experiment any further with either MPI or PVM. Due to X windows not being implemented, we are also not able to experiment further with XPVM, which, it will be recalled, has a nice GUI interface. There are also some concerns, on our part, about MPI being an appropriate tool. The best thing about MPI is that it is public domain. Although plenty of documentation exists for the intricacies of using MPI, little information is available on its actual implementations. Other than very typical, somewhat contrived examples, such as the Jacobi algorithm for solving Laplace's equation, there are few examples which can readily be applied to real situations, such as in our case. In effect, the review of the literature on both MPI and PVM, and there is a lot of it, indicates something similar to the enormous effort which was spent on LISP and LISP-like languages as tools for artificial intelligence research. During the development of a book on programming languages [12], when we searched the literature for very simple examples like taking averages, reading and writing records, multiplying matrices, etc., we could hardly find any! Yet, so much was said and done on that topic in academic circles. It appears that we faced the same problem with MPI, where despite significant documentation, we could not find even a simple example which supports coarse-grain parallelism involving only a few processes. From the foregoing, it appears that a new direction may be required for more productive research during the extension period (10/19/98 - 10/18/99). At the least, the research would need to be done on Windows 95/Windows NT based platforms. Moreover, with the acquisition of the Lahey Fortran package for the PC platform, and the existing Borland C++ 5.0, we can do work on C++ wrapper issues. We have carefully studied the blueprint for the Space Transportation Propulsion Integrated Design Environment for the next 25 years [13] and found the inclusion of HBCUs in that effort encouraging. Especially in the long period for which a map is provided, there is no doubt that HBCUs will grow and become better equipped to do meaningful research. In the shorter period, as was suggested in our presentation at the HBCU conference, some key decisions regarding the aging Fortran-based software for rocket propellants will need to be made. One important issue is whether or not object-oriented languages such as C++ or Java should be used for distributed computing. Whether or not "distributed computing" is necessary for the existing software is yet another, larger, question to be tackled.
A visual LISP program for voxelizing AutoCAD solid models
NASA Astrophysics Data System (ADS)
Marschallinger, Robert; Jandrisevits, Carmen; Zobl, Fritz
2015-01-01
AutoCAD solid models are increasingly recognized in geological and geotechnical 3D modeling. In order to bridge the currently existing gap between AutoCAD solid models and the grid modeling realm, a Visual LISP program is presented that converts AutoCAD solid models into voxel arrays. The Acad2Vox voxelizer works on a 3D model that is made up of arbitrary non-overlapping 3D solids. After definition of the target voxel array geometry, 3D solids are scanned at grid positions and properties are streamed to an ASCII output file. Acad2Vox has a novel voxelization strategy that combines a hierarchical reduction of sampling dimensionality with an innovative use of AutoCAD-specific methods for a fast and memory-saving operation. Acad2Vox provides georeferenced, voxelized analogs of 3D design data that can act as regions-of-interest in later geostatistical modeling and simulation. The Supplement includes sample geological solid models with instructions for practical work with Acad2Vox.
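The Visual LISP source is not part of the abstract; a rough Common Lisp sketch of the scan-and-stream step (walk a regular grid over the model, test each voxel centre, write one ASCII value per voxel) is shown below. SOLID-AT is a hypothetical predicate standing in for the AutoCAD point-in-solid queries the real program performs, and the hierarchical dimensionality reduction described above is not reproduced.

(defun voxelize (solid-at xmin ymin zmin dx dy dz nx ny nz pathname)
  "Write an ASCII voxel array: one line per voxel, 0 or a solid property id.
SOLID-AT takes x y z and returns a property id or NIL outside all solids."
  (with-open-file (out pathname :direction :output :if-exists :supersede)
    (dotimes (k nz)
      (dotimes (j ny)
        (dotimes (i nx)
          (let ((x (+ xmin (* (+ i 0.5) dx)))
                (y (+ ymin (* (+ j 0.5) dy)))
                (z (+ zmin (* (+ k 0.5) dz))))
            (format out "~a~%" (or (funcall solid-at x y z) 0))))))))

;; Example with an analytic stand-in for the CAD query: a unit sphere.
;; (voxelize (lambda (x y z) (when (< (+ (* x x) (* y y) (* z z)) 1.0) 1))
;;           -1 -1 -1 0.1 0.1 0.1 20 20 20 "sphere.vox")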
Applications of artificial intelligence to mission planning
NASA Technical Reports Server (NTRS)
Ford, Donnie R.; Rogers, John S.; Floyd, Stephen A.
1990-01-01
The scheduling problem facing NASA-Marshall mission planning is extremely difficult for several reasons. The most critical factor is the computational complexity involved in developing a schedule. The size of the search space is large along some dimensions and infinite along others. It is because of this and other difficulties that many of the conventional operations research techniques are infeasible or inadequate for solving the problems by themselves. Therefore, the purpose is to examine various artificial intelligence (AI) techniques to assist conventional techniques or to replace them. The specific tasks performed were as follows: (1) to identify mission planning applications for object-oriented and rule-based programming; (2) to investigate interfacing AI-dedicated hardware (Lisp machines) to VAX hardware; (3) to demonstrate how Lisp may be called from within FORTRAN programs; (4) to investigate and report on programming techniques used in some commercial AI shells, such as Knowledge Engineering Environment (KEE); and (5) to study and report on algorithmic methods to reduce complexity as related to AI techniques.
ERIC Educational Resources Information Center
Linn, Marcia C.
1995-01-01
Describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering: the LISP Knowledge Integration Environment and the spatial reasoning environment. (101 references) (Author/MKR)
Lisp as an Alternative to Java
NASA Technical Reports Server (NTRS)
Gat, E.
2000-01-01
In a recent study, Prechelt compared the relative performance of Java and C++ in terms of execution time and memory utilization. Unlike many benchmark studies, Prechelt compared multiple implementations of the same task by multiple programmers in order to control for the effects of differences in programmer skill.
Image Algebra Matlab language version 2.3 for image processing and compression research
NASA Astrophysics Data System (ADS)
Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric
2010-08-01
Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at the University of Florida for over 15 years beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with the development of image algebra implementations in FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of IAC++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1. In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
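IAM's Matlab interface is not shown in the abstract; the following Common Lisp sketch illustrates the generalised image-template product at the heart of image algebra, in which the caller supplies the combining and reducing operations (ordinary convolution uses + and *, grey-scale dilation uses max and +). The array layout and names are assumptions, not IAM's API.

(defun image-template-product (image template plus-fn times-fn init)
  "Apply TEMPLATE (a small 2D array centred on its middle cell) to IMAGE,
combining with TIMES-FN and reducing with PLUS-FN starting from INIT."
  (let* ((rows (array-dimension image 0))
         (cols (array-dimension image 1))
         (trows (array-dimension template 0))
         (tcols (array-dimension template 1))
         (r0 (floor trows 2))
         (c0 (floor tcols 2))
         (result (make-array (list rows cols))))
    (dotimes (r rows result)
      (dotimes (c cols)
        (let ((acc init))
          (dotimes (tr trows)
            (dotimes (tc tcols)
              (let ((ir (+ r (- tr r0)))
                    (ic (+ c (- tc c0))))
                ;; Pixels falling outside the image are simply skipped.
                (when (and (< -1 ir rows) (< -1 ic cols))
                  (setf acc (funcall plus-fn acc
                                     (funcall times-fn
                                              (aref image ir ic)
                                              (aref template tr tc))))))))
          (setf (aref result r c) acc))))))

;; Linear convolution: (image-template-product img tmpl #'+ #'* 0)
;; Grey-scale dilation: (image-template-product img tmpl #'max #'+ most-negative-fixnum)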
Chips: A Tool for Developing Software Interfaces Interactively.
ERIC Educational Resources Information Center
Cunningham, Robert E.; And Others
This report provides a detailed description of Chips, an interactive tool for developing software employing graphical/computer interfaces on Xerox Lisp machines. It is noted that Chips, which is implemented as a collection of customizable classes, provides the programmer with a rich graphical interface for the creation of rich graphical…
OASIS: Prototyping Graphical Interfaces to Networked Information.
ERIC Educational Resources Information Center
Buckland, Michael K.; And Others
1993-01-01
Describes the latest modifications being made to OASIS, a front-end enhancement to the University of California's MELVYL online union catalog. Highlights include the X Windows interface; multiple database searching to act as an information network; Lisp implementation for flexible data representation; and OASIS commands and features to help…
Expert System Detects Power-Distribution Faults
NASA Technical Reports Server (NTRS)
Walters, Jerry L.; Quinn, Todd M.
1994-01-01
Autonomous Power Expert (APEX) computer program is prototype expert-system program detecting faults in electrical-power-distribution system. Assists human operators in diagnosing faults and deciding what adjustments or repairs needed for immediate recovery from faults or for maintenance to correct initially nonthreatening conditions that could develop into faults. Written in Lisp.
ERIC Educational Resources Information Center
Johnson, W. Lewis; Soloway, Elliot
This detailed description of a microcomputer version of PROUST (Program Understander for Students), a knowledge-based system that finds nonsyntactic bugs in Pascal programs written by novice programmers, presents the inner workings of Micro-PROUST, which was written in Golden LISP for the IBM-PC (512K). The contents include: (1) a reprint of an…
Research and applications: Artificial intelligence
NASA Technical Reports Server (NTRS)
Chaitin, L. J.; Duda, R. O.; Johanson, P. A.; Raphael, B.; Rosen, C. A.; Yates, R. A.
1970-01-01
The program is reported for developing techniques in artificial intelligence and their application to the control of mobile automatons for carrying out tasks autonomously. Visual scene analysis, short-term problem solving, and long-term problem solving are discussed along with the PDP-15 simulator, LISP-FORTRAN-MACRO interface, resolution strategies, and cost effectiveness.
NASA Technical Reports Server (NTRS)
Borchardt, G. C.
1994-01-01
The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
Applications of artificial intelligence to mission planning
NASA Technical Reports Server (NTRS)
Ford, Donnie R.; Floyd, Stephen A.; Rogers, John S.
1990-01-01
The following subject areas are covered: object-oriented programming task; rule-based programming task; algorithms for resource allocation; connecting a Symbolics to a VAX; FORTRAN from Lisp; trees and forest task; software data structure conversion; software functionality modifications and enhancements; portability of resource allocation to a TI MicroExplorer; frontier of feasibility software system; and conclusions.
Dynamically Alterable Arrays of Polymorphic Data Types
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
An application library package was developed that represents Deep Space Network (DSN) message packets as dynamically alterable arrays composed of arbitrary polymorphic data types. The software addresses a limitation of the present state of the practice, in which an array must be directly composed of a single monomorphic data type. This is a severe limitation when dealing with science data, in which the types of objects are typically not known in advance and, therefore, are dynamic in nature. The unique feature of this approach is that it enables one to define at run-time the dynamic shape of the matrix with the ability to store polymorphic data types in each of its indices. Existing languages such as C and C++ have the restriction that the shape of the array must be known in advance and each of its elements must be of a monomorphic data type that is strictly defined at compile-time. This program can be executed on a variety of platforms. It can be distributed in either source code or binary code form. It must be run in conjunction with any one of a number of Lisp compilers that are available commercially or as shareware.
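A minimal Common Lisp illustration of the contrast the abstract draws with C and C++ follows; the packet contents shown are made up, not DSN formats.

;; An adjustable vector with element-type T can grow at run time and hold
;; values of arbitrary (polymorphic) types in different indices.
(let ((packet (make-array 0 :adjustable t :fill-pointer 0)))
  (vector-push-extend 42 packet)                            ; an integer
  (vector-push-extend "spectrometer frame" packet)          ; a string
  (vector-push-extend #(1.0d0 2.0d0 3.0d0) packet)          ; an array of doubles
  (vector-push-extend '(:station "DSS-14" :pass 7) packet)  ; a property list
  (values packet (length packet)))
;; => #(42 "spectrometer frame" #(1.0d0 2.0d0 3.0d0) (:STATION "DSS-14" :PASS 7)), 4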
Comparison of Ontology Reasoners: Racer, Pellet, Fact++
NASA Astrophysics Data System (ADS)
Huang, T.; Li, W.; Yang, C.
2008-12-01
In this paper, we examine some key aspects of three of the most popular and effective semantic reasoning engines that have been developed: Pellet, RACER, and Fact++. While these reasonably advanced reasoners share some notable similarities, it is ultimately the creativity and unique nature of these reasoning engines that have resulted in the successes of each of these reasoners. Of the numerous dissimilarities, the most obvious example might be that while Pellet is written in Java, RACER employs the Lisp programming language and Fact++ was developed using C++. From this and many other distinctions in the system architecture, we can understand the benefits of each reasoner and potentially discover certain properties that may contribute to the development of an optimal reasoner in the future. The objective of this paper is to establish a solid comparison of the reasoning engines based on their system architectures, features, and overall performances in real-world applications. In the end, we expect to produce a valid conclusion about the advantages and problems in each reasoner. While there may not be a decisive first place among the three reasoners, the evaluation will also provide some answers as to which of these current reasoning tools will be most effective in common, practical situations.
Rose garden promises of intelligent tutoring systems: Blossom or thorn
NASA Technical Reports Server (NTRS)
Shute, Valerie J.
1991-01-01
Intelligent tutoring systems (ITS) have been in existence for over a decade. However, few controlled evaluation studies have been conducted comparing the effectiveness of these systems to more traditional instruction methods. Two main promises of ITSs are examined: (1) engendering more effective and efficient learning in relation to traditional formats; and (2) reducing the range of learning outcome measures so that a majority of individuals are elevated to high performance levels. Bloom (1984) has referred to these as the two sigma problem: achieving two-standard-deviation improvements with tutoring over traditional instruction methods. Four ITSs are discussed in relation to the two promises. These tutors have undergone systematic, controlled evaluations: (1) The LISP tutor (Anderson, Farrell, and Sauers, 1984); (2) Smithtown (Shute and Glaser, in press); (3) Sherlock (Lesgold, Lajoie, Bunzo and Eggan, 1990); and (4) The Pascal ITS (Bonar, Cunningham, Beatty and Well, 1988). Results show that these four tutors do accelerate learning with no degradation in final outcome. Suggestions for improvements to the design and evaluation of ITSs are discussed.
Control of Randomly Sampled Robotic Systems
1989-05-01
task is so cumbersome and complicated that we would not be able to do it without lots of mistakes. To avoid this formidable business, a Lisp program is...Artificial Intelligence Laboratory, 1972.
Pupils' Ideas about Flowering Plants. Learning in Science Project (Primary). Working Paper No. 125.
ERIC Educational Resources Information Center
Biddulph, Fred
The Learning in Science Project (Primary)--LISP(P)--investigated the ideas and interests children have about flowering plants (in particular whether these plants have a life cycle). Data were obtained from: individual interviews with children aged 7 to 14 years (10 students at each age level), using the "interview-about-instances"…
Using Maxima in the Mathematics Classroom
ERIC Educational Resources Information Center
Fedriani, Eugenio M.; Moyano, Rafael
2011-01-01
Coming from the Macsyma system and adapted to the Common Lisp standard, Maxima can be regarded as a tool for frequent use in the mathematics classroom. The main aim of this work is to show some possibilities of Maxima and its graphical interface through our experience as Mathematics teachers in Business degrees, although it can be easily spread…
1985 Annual Technical Report: A Research Program in Computer Technology. July 1984--June 1985.
ERIC Educational Resources Information Center
University of Southern California, Marina del Rey. Information Sciences Inst.
Summaries of research performed by the Information Sciences Institute at the University of Southern California for the U.S. Department of Defense Advanced Research Projects Agency in 17 areas are provided in this report: (1) Common LISP framework, an exportable version of the Formalized Software Development (FSD) testbed; (2) Explainable Expert…
Effects of a Format-based Second Language Teaching Method in Kindergarten.
ERIC Educational Resources Information Center
Uilenburg, Noelle; Plooij, Frans X.; de Glopper, Kees; Damhuis, Resi
2001-01-01
Focuses on second language teaching with a format-based method. The differences between a format-based teaching method and a standard approach used as treatments in a quasi-experimental, non-equivalent control group are described in detail. Examines whether the effects of a format-based teaching method and a standard foreign language method differ…
Language competition in a population of migrating agents.
Lipowska, Dorota; Lipowski, Adam
2017-05-01
Influencing various aspects of human activity, migration is associated also with language formation. To examine the mutual interaction of these processes, we study a Naming Game with migrating agents. The dynamics of the model leads to formation of low-mobility clusters, which turns out to break the symmetry of the model: although the Naming Game remains symmetric, low-mobility languages are favored. High-mobility languages are gradually eliminated from the system, and the dynamics of language formation considerably slows down. Our model is too simple to explain in detail language competition of migrating human communities, but it certainly shows that languages of settlers are favored over nomadic ones.
Language competition in a population of migrating agents
NASA Astrophysics Data System (ADS)
Lipowska, Dorota; Lipowski, Adam
2017-05-01
Influencing various aspects of human activity, migration is associated also with language formation. To examine the mutual interaction of these processes, we study a Naming Game with migrating agents. The dynamics of the model leads to formation of low-mobility clusters, which turns out to break the symmetry of the model: although the Naming Game remains symmetric, low-mobility languages are favored. High-mobility languages are gradually eliminated from the system, and the dynamics of language formation considerably slows down. Our model is too simple to explain in detail language competition of migrating human communities, but it certainly shows that languages of settlers are favored over nomadic ones.
ERIC Educational Resources Information Center
Recker, Margaret M.; Pirolli, Peter
Students learning to program recursive LISP functions in a typical school-like lesson on recursion were observed. The typical lesson contains text and examples and involves solving a series of programming problems. The focus of this study is on students' learning strategies in new domains. In this light, a Soar computational model of…
Pupils' Views about Spiders. Learning in Science Project (Primary). Working Paper No. 123.
ERIC Educational Resources Information Center
Hawe, Eleanor
The Learning in Science Project (Primary)--LISP(P)--investigated the ideas and interests about spiders held by 8- to 10-year-old children. Data included 303 questions--and answers to some of the questions--about spiders obtained from children in four classes and from responses obtained during individual interviews with 10 children from each age…
Children's Ideas about "Metals." Learning in Science Project (Primary). Working Paper No. 112.
ERIC Educational Resources Information Center
Biddulph, Fred; Osborne, Roger
The topic of metals is frequently taught in primary schools. However, when metals are suggested as one of a series of topics for study, students often initially show little enthusiasm for the topic. To determine the ideas that children have about metals the Learning in Science Project (Primary)--LISP(P)--interviewed thirty-eight 9- to 10-year-old…
ERIC Educational Resources Information Center
Biddulph, Fred; McMinn, Bill
An alternative approach for teaching primary school science has been proposed by the Learning in Science Project (Primary)--LISP(P). This study investigated the use of the approach during three series of lessons on the topic "metals." Each series followed the same general pattern: (1) an introductory session to stimulate children to ask…
Object-oriented fault tree evaluation program for quantitative analyses
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1988-01-01
Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
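The program uses the Explorer's Flavors system, which is not shown in the abstract; the sketch below uses portable CLOS instead to illustrate the same idea of fault-tree events as objects that carry structure, reliability data, and cached intermediate results. Slot names, the gate set, and the independence assumption are illustrative.

(defclass ft-event ()
  ((name          :initarg :name                          :reader event-name)
   (gate          :initarg :gate        :initform :basic  :reader event-gate) ; :basic, :and, :or
   (children      :initarg :children    :initform nil     :reader event-children)
   (probability   :initarg :probability :initform nil     :accessor event-probability)
   (cached-result :initform nil                           :accessor event-cached-result)))

(defgeneric failure-probability (event)
  (:documentation "Evaluate EVENT, caching the intermediate result on the object."))

(defmethod failure-probability ((e ft-event))
  (or (event-cached-result e)
      (setf (event-cached-result e)
            (ecase (event-gate e)
              (:basic (event-probability e))
              ;; Independent basic events assumed, as in a simple quantitative pass.
              (:and (reduce #'* (event-children e) :key #'failure-probability))
              (:or  (- 1 (reduce #'* (event-children e)
                                 :key (lambda (c) (- 1 (failure-probability c))))))))))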
Oral breathing and speech disorders in children.
Hitos, Silvia F; Arakaki, Renata; Solé, Dirceu; Weckx, Luc L M
2013-01-01
To assess speech alterations in mouth-breathing children, and to correlate them with the respiratory type, etiology, gender, and age. A total of 439 mouth-breathers were evaluated, aged between 4 and 12 years. The presence of speech alterations in children older than 5 years was considered delayed speech development. The observed alterations were tongue interposition (TI), frontal lisp (FL), articulatory disorders (AD), sound omissions (SO), and lateral lisp (LL). The etiology of mouth breathing, gender, age, respiratory type, and speech disorders were correlated. Speech alterations were diagnosed in 31.2% of patients, unrelated to the respiratory type: oral or mixed. Increased frequency of articulatory disorders and more than one speech disorder were observed in males. TI was observed in 53.3% patients, followed by AD in 26.3%, and by FL in 21.9%. The co-occurrence of two or more speech alterations was observed in 24.8% of the children. Mouth breathing can affect speech development, socialization, and school performance. Early detection of mouth breathing is essential to prevent and minimize its negative effects on the overall development of individuals. Copyright © 2013 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
An Embedded Rule-Based Diagnostic Expert System in Ada
NASA Technical Reports Server (NTRS)
Jones, Robert E.; Liberman, Eugene M.
1992-01-01
Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components (the rule-based expert system, a graphics user interface, and communications software) make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rules to determine a fault, if one exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.
ERIC Educational Resources Information Center
Ponomarenko, Larisa N.; Zlobina, Irina S.; Galitskih, Elena O.; Rublyova, Olga S.
2017-01-01
The article presents the main ideas of concept of foreign language discursive competence formation among university and secondary school students by means of intercultural dialogue. The concept includes fundamental principles, activity stages of educational process, and criteria of foreign language discursive competence formation. Innovation of…
Tools and technologies for expert systems: A human factors perspective
NASA Technical Reports Server (NTRS)
Rajaram, Navaratna S.
1987-01-01
It is widely recognized that technologies based on artificial intelligence (AI), especially expert systems, can make significant contributions to the productivity and effectiveness of operations of information- and knowledge-intensive organizations such as NASA. At the same time, these being relatively new technologies, there is the problem of transferring technology to key personnel of such organizations. The problems of examining the potential of expert systems and of technology transfer are addressed in the context of human factors applications. One of the topics of interest was the investigation of the potential use of expert system building tools, particularly NEXPERT, as a technology transfer medium. Two basic conclusions were reached in this regard. First, NEXPERT is an excellent tool for rapid prototyping of experimental expert systems, but not ideal as a delivery vehicle. Therefore, it is not a substitute for general purpose system implementation languages such as LISP or C. This assertion probably holds for nearly all such tools on the market today. Second, an effective technology transfer mechanism is to formulate and implement expert systems for problems which members of the organization in question can relate to. For this purpose, the LIghting EnGineering Expert (LIEGE) was implemented using NEXPERT as the tool for technology transfer and to illustrate the value of expert systems to the activities of the Man-System Division.
LISP on a Reduced-Instruction-Set-Processor,
1986-01-01
Digital Press, 1984. 19. Steele, G. L. Jr., and Sussman, G. J. LAMBDA: The Ultimate Imperative. AI Memo 353, MIT Artificial Intelligence Laboratory... AI Memo No. 444, MIT Artificial Intelligence Laboratory, August 1977... Artificial Intelligence Programming. Lawrence Erlbaum Associates, Hillsdale, New... procedure B is recursive if procedure A can be reexecuted before the call returns. This... the programs use apply and eval, and of these three only FRL uses eval
Artificial Intelligence Project
1990-01-01
Artificial Intelligence Project at The University of Texas at Austin, University of Texas at Austin, Artificial Intelligence Laboratory AITR84-01. Novak... Texas at Austin, Artificial Intelligence Laboratory AI87-52, April 1987. Novak, G. "GLISP: A Lisp-Based Programming System with Data Abstraction... of Texas at Austin, Artificial Intelligence Laboratory AITR85-14.) Rim, Hae-Chang, and Simmons, R. F. "Extracting Data Base Knowledge from Medical
Concept Formation and the Development of Language. Theoretical Paper No. 37.
ERIC Educational Resources Information Center
Nelson, Gordon K.
This paper examines possible interchanges between cognitive and language processes with particular attention given to concept formation and semantic language development. Aspects of psychological and contemporary linguistic theories are discussed as a way to interrelate the functions of thought and language. The author concludes that while…
ERIC Educational Resources Information Center
Castro-Peet, Alma Sandra
2017-01-01
Purpose: This study explored a technological contribution to education made by the Defense Language Institute Foreign Language Center (DLIFLC) in the formative assessment field. The purpose of this quantitative correlational study was to identify the relationship between online formative (Online Diagnostic Assessment; ODA) and summative (Defense…
Rodríguez, Cathi Draper; Cumming, Therese M
2017-01-01
This exploratory study investigated the effects of a language building iPad application on the language skills (i.e., receptive vocabulary, expressive vocabulary, and sentence formation) of young students with language-based disabilities. The study utilized a pre-test-post-test control group design. Students in the treatment group used the iPad language building application, Language Builder, for 30 minutes a day. Participants were 31 first-grade to third-grade students with identified language-based disabilities. Students were assigned to two groups for the 8-week intervention. Data indicated that students in the treatment group made significantly greater gains in the area of sentence formation than the control group. Results revealed no significant difference between the two groups in the areas of expressive and receptive vocabulary. A short intervention of using Language Builder via the iPad may increase the sentence formation skills of young students with language delays. Additionally, discussion regarding the usefulness of iPad applications in education is presented.
Computer-Aided Fabrication of Integrated Circuits
1989-09-30
baseline CMOS process. One result of this effort was the identification of several residual bugs in the PATRAN graphics processor. The vendor promises...virtual memory. The internal Nubus architecture uses a 32-bit LISP processor running at 10 megahertz (100 ns clock period). An Ethernet controller is...For different patterns, we need different masks for the photo step, and for different micro-structures of the wafers, we need different etching
Information Processing Research
1988-01-01
the Hitech chess machine, which achieves its success from parallelism in the right places. Hitech has now reached a National rating of 2359, making it...outset that success depended on building real systems and subjecting them to use by a large number of faculty and students within the Department. We...central server workstations each acting as a host for a Warp machine, and a few Warp multiprocessors. The command interpreter is executed in Lisp on
Software For Nearly Optimal Packing Of Cargo
NASA Technical Reports Server (NTRS)
Fennel, Theron R.; Daughtrey, Rodney S.; Schwaab, Doug G.
1994-01-01
PACKMAN computer program used to find nearly optimal arrangements of cargo items in storage containers, subject to such multiple packing objectives as utilization of volumes of containers, utilization of containers up to limits on weights, and other considerations. Automatic packing algorithm employed attempts to find best positioning of cargo items in container, such that volume and weight capacity of container are both utilized to maximum extent possible. Written in Common LISP.
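PACKMAN's algorithm is not described beyond its objectives, so the following is only a toy Common Lisp sketch of packing against both a volume and a weight limit (first-fit decreasing by volume); all names are illustrative.

(defstruct container capacity-volume capacity-weight (items nil))
(defstruct item name volume weight)

(defun fits-p (container item)
  "True when ITEM can be added without exceeding either container limit."
  (and (<= (+ (item-volume item)
              (reduce #'+ (container-items container) :key #'item-volume))
           (container-capacity-volume container))
       (<= (+ (item-weight item)
              (reduce #'+ (container-items container) :key #'item-weight))
           (container-capacity-weight container))))

(defun pack (items containers)
  "Place each item in the first container where it fits; return leftovers."
  (loop for item in (sort (copy-list items) #'> :key #'item-volume)
        for target = (find-if (lambda (c) (fits-p c item)) containers)
        if target do (push item (container-items target))
        else collect item))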
Communications Patterns in a Symbolic Multiprocessor.
1987-06-01
instruction references that Multilisp programs make. The cache hit ratio is greatest when instruction references have a high degree of locality. Another...future touches hit an undetermined future. The only exception is Consim, in which one third of future touches hit undetermined futures. Task...Cambridge, MA, June 1985. [52] S. Sugimoto, K. Agusa, K. Tabata, and Y. Ohno. A multi-microprocessor system for concurrent Lisp. In Proceedings of
ERIC Educational Resources Information Center
Thomas, Crystal Ann
2012-01-01
The purpose of this dissertation was to investigate whether powerful language training affected student participation, impression formation, and gender communication style in online discussions. Powerful language was defined as a lack of the use of powerless language. Participants in this study were 507 freshmen taking a first-year college…
ERIC Educational Resources Information Center
Hallerberg, Mark; Cothran, Bettina
1999-01-01
Explores how language and political science professors can co-teach a course using the Language Across the Curriculum format to increase student understanding of a country's language and politics. Describes a Georgia Tech course taught in German on post-war German politics. Addresses the elements of a successful course and student and course…
Automatic mathematical modeling for real time simulation system
NASA Technical Reports Server (NTRS)
Wang, Caroline; Purinton, Steve
1988-01-01
A methodology for automatic mathematical modeling and generation of simulation models is described. The models are verified by running them in a test environment using standard profiles and comparing the output against known results. The major objective is to create a user-friendly environment in which engineers can design, maintain, and verify their models and automatically convert the mathematical model into conventional code for numerical computation. A demonstration program was designed for modeling the Space Shuttle Main Engine (SSME) simulation. It is written in LISP and MACSYMA and runs on a Symbolics 3670 Lisp Machine. The program provides a friendly, well-organized environment in which engineers can build a knowledge base of base equations and general information. It contains an initial set of component process elements for the SSME simulation and a questionnaire that lets the engineer answer a set of questions to specify a particular model. The system is then able to generate the model and FORTRAN code automatically. A future goal, currently under development, is to download the FORTRAN code to a VAX/VMS system for conventional computation. The SSME mathematical model will be verified in a test environment and the solution compared with the real data profile. The use of artificial intelligence techniques has shown that the process of simulation modeling can be simplified.
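A minimal sketch of the code-generation idea described above, assuming a toy infix printer for binary operators only; this is an illustration, not the SSME modeling tool.

    ;; A model equation kept as a Lisp expression is printed as a
    ;; Fortran-style assignment statement.
    (defun infix (expr)
      "Render a binary prefix expression such as (/ (* A B) C) in infix form."
      (if (atom expr)
          (princ-to-string expr)
          (destructuring-bind (op left right) expr
            (format nil "(~A ~A ~A)" (infix left) op (infix right)))))

    (defun emit-fortran (variable expr)
      "Return one line of Fortran assigning EXPR to VARIABLE."
      (format nil "      ~A = ~A" variable (infix expr)))

    ;; (emit-fortran "PC" '(/ (* MDOT RSPEC) VOL))
    ;; => "      PC = ((MDOT * RSPEC) / VOL)"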
The mGA1.0: A common LISP implementation of a messy genetic algorithm
NASA Technical Reports Server (NTRS)
Goldberg, David E.; Kerzic, Travis
1990-01-01
Genetic algorithms (GAs) are finding increased application in difficult search, optimization, and machine learning problems in science and engineering. Increasing demands are being placed on algorithm performance, and the remaining challenges of genetic algorithm theory and practice are becoming increasingly unavoidable. Perhaps the most difficult of these challenges is the so-called linkage problem. Messy GAs were created to overcome the linkage problem of simple genetic algorithms by combining variable-length strings, gene expression, messy operators, and a nonhomogeneous phasing of evolutionary processing. Results on a number of difficult deceptive test functions are encouraging, with the mGA always finding global optima in a polynomial number of function evaluations. Theoretical and empirical studies are continuing, and a first version of a messy GA is ready for testing by others. A Common LISP implementation called mGA1.0 is documented and related to the basic principles and operators developed by Goldberg et al. (1989, 1990). Although the code was prepared with care, it is not a general-purpose code, only a research version. Important data structures and global variables are described. Thereafter, brief function descriptions are given, and sample input data are presented together with sample program output. A source listing with comments is also included.
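To make the variable-length encoding concrete, a minimal Common Lisp sketch of a messy chromosome and a cut-and-splice recombination operator; the representation is an assumption for illustration, not the mGA1.0 source.

    ;; A messy chromosome is a variable-length list of (locus . allele)
    ;; pairs, so it may over- or under-specify the underlying string.
    (defun random-cut (chromosome)
      "Split CHROMOSOME at a random point; return the two pieces."
      (let ((point (random (1+ (length chromosome)))))
        (values (subseq chromosome 0 point) (subseq chromosome point))))

    (defun cut-and-splice (parent-a parent-b)
      "Messy recombination: exchange tails cut at independent random points."
      (multiple-value-bind (a-head a-tail) (random-cut parent-a)
        (multiple-value-bind (b-head b-tail) (random-cut parent-b)
          (values (append a-head b-tail) (append b-head a-tail)))))

    ;; Example chromosome over a 4-bit problem: '((3 . 1) (1 . 0) (3 . 0))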
Encoding Standards for Linguistic Corpora.
ERIC Educational Resources Information Center
Ide, Nancy
The demand for extensive reusability of large language text collections for natural language processing research requires development of standardized encoding formats. Such formats must be capable of representing different kinds of information across the spectrum of text types and languages, capable of representing different levels of…
Formative Assessment of Writing in English as a Foreign Language
ERIC Educational Resources Information Center
Burner, Tony
2016-01-01
Recognizing the importance of formative assessment, this mixed-methods study investigates how four teachers and 100 students respond to the new emphasis on formative assessment in English as a foreign language (EFL) writing classes in Norway. While previous studies have examined formative assessment in oral classroom interactions and focused on…
ERIC Educational Resources Information Center
Biddulph, Fred; Osborne, Roger
Two booklets were developed by the Learning in Science Project (Primary)--LISP(P)--to help teachers adopt an approach to primary science teaching which would enhance children's understanding of floating and sinking. Both booklets were designed to enable teachers to reconceptualize their teaching task from activity-driven, didactic teaching to…
Mask Matching for Linear Feature Detection.
1987-01-01
decide which matched masks are part of a linear feature by simple thresholding of the confidence measures. However, it is shown in a companion report... Laboratory, Center for Automation Research, University of Maryland, January 1987. 3. E.M. Allen, R.H. Trigg, and R.J. Wood, The Maryland Artificial Intelligence Group Franz Lisp Environment, Variation 3.5, TR-1226, Department of Computer Science, University of Maryland, December 1984. 4. D.E. Knuth, The
Energy Supply Options for Modernizing Army Heating Systems
1999-01-01
Army Regulation (AR) 420-49, Heating, Energy Selection and Fuel Storage, Distribution, and Dispensing Systems and Technical Manual (TM) 5-650... analysis. HEATMAP uses the AutoLISP program in AutoCAD to take the graphical input to populate a Microsoft® Access database in... of 1992, Subtitle F, Federal Agency Energy Management. Technical Manual (TM) 5-650, Repairs and Utilities: Central Boiler Plants (HQDA, 13 October
JPRS Report, Science & Technology. China.
1989-03-29
Commun., Vol COM-29, No 6, pp 895-901, June 1981. [4] R.C. Titsworth, "A Boolean-Function-Multiplexed Telemetry System," IEEE Trans. on SET, pp 42... Reagents 39 Gene-Engineered Human Epithelium Growth Factor (hEGF) 39 Superfine Snake Venom 39 COMPUTERS AI Computer System LISP-MI [Zheng Shouqi, et... XUEBAO, No 3, Jun 88] 134 Coordinated Development of Microwave, Optical Communications [Zhang Xu; DIANXIN KUAIBAO, No 11, Nov 88] 143 Error
Deductive Synthesis of the Unification Algorithm,
1981-06-01
DEDUCTIVE SYNTHESIS OF THE UNIFICATION ALGORITHM, Zohar Manna, Richard Waldinger, Computer Science Department, Artificial Intelligence Center... theorem proving," Artificial Intelligence Journal, Vol. 9, No. 1, pp. 1-35. Boyer, R. S. and J. S. Moore [Jan. 1975], "Proving theorems about LISP... d'Intelligence Artificielle, U.E.R. de Luminy, Université d'Aix-Marseille II. Green, C. C. [May 1969], "Application of theorem proving to problem
NASA Technical Reports Server (NTRS)
Harrison, P. Ann
1993-01-01
All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. The design and implementation of an interface that allows the scientist to add new techniques to VEG without assistance from the developer were completed. This interface does not require the scientist to have a thorough knowledge of the Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique: the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule, together with detailed instructions about the arguments of the functions, the values they should return, and the format of the rule, are displayed. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the VEG knowledge base is loaded. The interface gives the scientist the option of adding all the previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base, the methods file containing the functions required by the additional techniques is loaded, and new rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed: the interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.
Concept Formation Skills in Long-Term Cochlear Implant Users
Castellanos, Irina; Kronenberger, William G.; Beer, Jessica; Colson, Bethany G.; Henning, Shirley C.; Ditmars, Allison; Pisoni, David B.
2015-01-01
This study investigated if a period of auditory sensory deprivation followed by degraded auditory input and related language delays affects visual concept formation skills in long-term prelingually deaf cochlear implant (CI) users. We also examined if concept formation skills are mediated or moderated by other neurocognitive domains (i.e., language, working memory, and executive control). Relative to normally hearing (NH) peers, CI users displayed significantly poorer performance in several specific areas of concept formation, especially when multiple comparisons and relational concepts were components of the task. Differences in concept formation between CI users and NH peers were fully explained by differences in language and inhibition–concentration skills. Language skills were also found to be more strongly related to concept formation in CI users than in NH peers. The present findings suggest that complex relational concepts may be adversely affected by a period of early prelingual deafness followed by access to underspecified and degraded sound patterns and spoken language transmitted by a CI. Investigating a unique clinical population such as early-implanted prelingually deaf children with CIs can provide new insights into foundational brain–behavior relations and developmental processes. PMID:25583706
Success in tutoring electronic troubleshooting
NASA Technical Reports Server (NTRS)
Parker, Ellen M.
1990-01-01
Two years ago Dr. Sherrie Gott of the Air Force Human Resources Laboratory described an avionics troubleshooting tutor being developed under the Basic Job Skills Research Program. The tutor, known as Sherlock, is directed at teaching the diagnostic procedures necessary to investigate complex test equipment used to maintain F-15 fighter aircraft. Since Dr. Gott's presentation in 1987, the tutor has undergone field testing at two Air Force F-15 flying wings. The results of the field test showed that after an average of 20 hours on the tutor, the 16 airmen in the experimental group (who averaged 28 months of experience) showed significant performance gains when compared to a control group (having a mean experience level of 37 months) who continued participating in the existing on-the-job training program. Troubleshooting performance of the tutored group approached the level of proficiency of highly experienced airmen (averaging approximately 114 months of experience), and these performance gains were confirmed in delayed testing six months following the intervention. The tutor is currently undergoing a hardware and software conversion from a Xerox Lisp environment to a PC-based environment using an object-oriented programming language. Summarized here are the results of the successful field test. The focus is on: (1) the instructional features that contributed to Sherlock's success; and (2) the implementation of these features in the PC-based version of the avionics troubleshooting tutor.
2012-01-01
Background A crucial issue for the sustainability of societies is how to maintain health and functioning in older people. With increasing age, losses in vision, hearing, balance, mobility and cognitive capacity render older people particularly exposed to environmental barriers. A central building block of human functioning is walking. Walking difficulties may start to develop in midlife and become increasingly prevalent with age. Life-space mobility reflects actual mobility performance by taking into account the balance between older adults' internal physiologic capacity and the external challenges they encounter in daily life. The aim of the Life-Space Mobility in Old Age (LISPE) project is to examine how home and neighborhood characteristics influence people's health, functioning, disability, quality of life and life-space mobility in the context of aging. In addition, we examine whether a person's health and function influence life-space mobility. Design This paper describes the study protocol of the LISPE project, which is a 2-year prospective cohort study of community-dwelling older people aged 75 to 90 (n = 848). The data consists of a baseline survey including face-to-face interviews, objective observation of the home environment and a physical performance test in the participant's home. All the baseline participants will be interviewed over the phone one and two years after baseline to collect data on life-space mobility, disability and participation restriction. Additional home interviews and environmental evaluations will be conducted for those who relocate during the study period. Data on mortality and health service use will be collected from national registers. In a substudy on walking activity and life space, 358 participants kept a 7-day diary and, in addition, 176 participants also wore an accelerometer. Discussion Our study, which includes extensive data collection with a large sample, provides a unique opportunity to study topics of importance for aging societies. A novel approach is employed which enables us to study the interactions of environmental features and individual characteristics underlying the life-space of older people. Potentially, the results of this study will contribute to improvements in strategies to postpone or prevent progression to disability and loss of independence. PMID:23170987
Transcending Tradition: Situated Activity, Discourse, and Identity in Language Teacher Education
ERIC Educational Resources Information Center
Mantero, Miguel
2004-01-01
This article explores the concept of tradition within language teacher education (LTE) and extends our understanding of the elements involved in the formation of identity in preservice, second language teachers. After reviewing various perspectives of identity formation, a discursive model is offered as an approach to individual development within…
The multilingual matrix test: Principles, applications, and comparison across languages: A review.
Kollmeier, Birger; Warzybok, Anna; Hochmuth, Sabine; Zokoll, Melanie A; Uslar, Verena; Brand, Thomas; Wagener, Kirsten C
2015-01-01
A review of the development, evaluation, and application of the so-called 'matrix sentence test' for speech intelligibility testing in a multilingual society is provided. The format allows for repeated use with the same patient in her or his native language even if the experimenter does not understand the language. Using a closed-set format, the syntactically fixed, semantically unpredictable sentences (e.g. 'Peter bought eight white ships') provide a vocabulary of 50 words (10 alternatives for each position in the sentence). The principles (i.e. construction, optimization, evaluation, and validation) for 14 different languages are reviewed. Studies of the influence of talker, language, noise, the training effect, open vs. closed conduct of the test, and the subjects' language proficiency are reported and application examples are discussed. The optimization principles result in a steep intelligibility function and a high homogeneity of the speech materials presented and test lists employed, yielding a high efficiency and excellent comparability across languages. The characteristics of speakers generally dominate the differences across languages. The matrix test format with the principles outlined here is recommended for producing efficient, reliable, and comparable speech reception thresholds across different languages.
The Formation of Students' Creative Independence at the English Language Classes
ERIC Educational Resources Information Center
Shangaraeva, Liya F.; Yarkhamova, Alfiya A.; Biktagirova, Zubayda A.; Agol, Dorice
2016-01-01
The article is devoted to the formation of students' creative independence. The aim of the article is to identify and test pedagogical conditions of formation students' creative independence studying the English language. The leading methods are analyses of scientific works and practice, empirical and experimental data, method of involved…
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY.
The format of this report is similar to that of other reports on The Language Development Project. See AL 002 352 and 354 for descriptions of the project and the format of the reports. [Not available in hard copy due to marginal legibility of the original document.] (DO)
Language Policy and Planning in South America.
ERIC Educational Resources Information Center
Hornberger, Nancy H.
1994-01-01
A discussion of language policy formation and planning in South America focuses on the highland indigenous sectors and covers the following: colonial languages; immigrant languages; and indigenous languages, including planning, acquisition planning, and corpus planning. (Contains 83 references.) (LB)
1985-06-01
M382 FACOM computer, and is written in UTILISP (University of Tokyo version of interactive LISP). There are three "levels" of kanji vocabulary, with... decade, biotechnology might impact on several domains: the pharmaceutical industry by the production of new drugs, vaccines, and diagnostic... competition was fierce and only four of the 70 companies survived. They launched the antibiotic production and drug industry of Japan which is now
Modeling of flow systems for implementation under KATE
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
1990-01-01
The modeling of flow systems is a task currently being investigated at Kennedy Space Center in parallel with the development of the KATE artificial intelligence system used for monitoring, diagnosis, and control. Various aspects of the modeling issues are discussed, with particular emphasis on a water system scheduled for demonstration within the KATE environment in September of this year. LISP procedures were written to solve the continuity equations for three internal pressure nodes using Newton's method for simultaneous nonlinear equations.
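For concreteness, a minimal Common Lisp sketch of the numerical core described above: Newton's method for a small nonlinear system, with a finite-difference Jacobian and naive Gaussian elimination. The residual functions for the three pressure nodes would come from the continuity equations; everything here, including the hypothetical NODE-1, NODE-2, and NODE-3 names, is an illustration rather than the KATE code.

    (defun jacobian (fns x &optional (h 1e-6))
      "Finite-difference Jacobian of the residual functions FNS at the vector X."
      (let* ((n (length x))
             (j (make-array (list n n))))
        (dotimes (row n j)
          (dotimes (col n)
            (let ((x+h (copy-seq x)))
              (incf (aref x+h col) h)
              (setf (aref j row col)
                    (/ (- (funcall (elt fns row) x+h)
                          (funcall (elt fns row) x))
                       h)))))))

    (defun solve-linear (a b)
      "Solve A y = B by Gauss-Jordan elimination (no pivoting, small systems only)."
      (let ((n (length b)))
        (dotimes (k n)
          (dotimes (i n)
            (unless (= i k)
              (let ((m (/ (aref a i k) (aref a k k))))
                (dotimes (c n) (decf (aref a i c) (* m (aref a k c))))
                (decf (aref b i) (* m (aref b k)))))))
        (let ((y (make-array n)))
          (dotimes (i n y) (setf (aref y i) (/ (aref b i) (aref a i i)))))))

    (defun newton (fns x &key (iterations 20))
      "Iterate x <- x + step, where (jacobian fns x) . step = -F(x)."
      (dotimes (pass iterations x)
        (let* ((f    (map 'vector (lambda (fn) (funcall fn x)) fns))
               (step (solve-linear (jacobian fns x) (map 'vector #'- f))))
          (dotimes (i (length x)) (incf (aref x i) (aref step i))))))

    ;; With three residual closures (hypothetical names):
    ;; (newton (list #'node-1 #'node-2 #'node-3) (vector 14.7d0 14.7d0 14.7d0))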
1997-04-01
implied, with respect to the accuracy, completeness or usefulness of the information contained in this report/manual, or that the use of any information... shipyards throughout the world have introduced various aspects of CAD/CAM piecemeal as substitutes for manual processes, the greatest improvement in... possibility of multiple models of the molded geometry being developed, which would cause the loss of geometry control. Numerous AutoLisp routines were used
Steamer Training System and Graphics Editor, 1987 Version
1987-09-01
NIIHAU: >simenv>documentation>simenv-readme.text.24 7/30/87 18:29:51 Page 1 -- Mode: Text -- Herewith are instructions for installing the Genera 7.0 (should... lowercase: t; package: file-system; -- (set-logical-pathname-host "simenv" :physical-host "niihau" :translations (("simenv;" ">simenv>") steamer-system... translations ;; -- mode: lisp; base: 10; lowercase: t; package: file-system; -- (fs:set-logical-pathname-host "steamer-system" :physical-host "niihau
Five Tips to Help Prevent Infections
Improving Foreign Language Speaking through Formative Assessment
ERIC Educational Resources Information Center
Tuttle, Harry Grover; Tuttle, Alan Robert
2012-01-01
Want a quick way to get your students happily conversing more in the target language? This practical book shows you how to use formative assessments to gain immediate and lasting improvement in your students' fluency. You'll learn how to: (1) Imbed the 3-minute formative assessment into every lesson with ease; (2) Engage students in peer formative…
Mission and science activity scheduling language
NASA Technical Reports Server (NTRS)
Hull, Larry G.
1993-01-01
To support the distributed and complex operational scheduling required for future National Aeronautics and Space Administration (NASA) missions, a formal, textual language, the Scheduling Applications Interface Language (SAIL), has been developed. Increased geographic dispersion of investigators is leading to distributed mission and science activity planning, scheduling, and operations. SAIL is an innovation which supports the effective and efficient communication of scheduling information among physically dispersed applications in distributed scheduling environments. SAIL offers a clear, concise, unambiguous expression of scheduling information in a readable, hardware independent format. The language concept, syntax, and semantics incorporate language features found useful during five years of research and prototyping with scheduling languages in physically distributed environments. SAIL allows concise specification of mission and science activity plans in a format which promotes repetition and reuse.
Progress in recognizing typeset mathematics
NASA Astrophysics Data System (ADS)
Fateman, Richard J.; Tokuyasu, Taku A.
1996-03-01
Printed mathematics has a number of features which distinguish it from conventional text. These include structure in two dimensions (fractions, exponents, limits), frequent font changes, symbols with variable shape (quotient bars), and notational conventions that differ substantially from source to source. When compounded with more generic problems such as noise and merged or broken characters, printed mathematics offers a challenging arena for recognition. Our project was initially driven by the goal of scanning and parsing some 5,000 pages of elaborate mathematics (tables of definite integrals). While our prototype system demonstrates success in translating noise-free typeset equations into Lisp expressions appropriate for further processing, a more semantic, top-down approach appears necessary for higher levels of performance. Such an approach may also ease the incorporation of these programs into a more general document-processing framework. We intend to release to the public our somewhat refined prototypes as utility programs in the hope that they will be of general use in the construction of custom OCR packages. These utilities are quite fast even as originally prototyped in Lisp, where they may be of particular interest to those working on 'intelligent' optical processing. Some routines have been rewritten in C++ as well. Additional programs providing formula recognition and parsing also form a part of this system. It is important, however, to realize that distinct, conflicting grammars are needed to cover variations in contemporary and historical typesetting, and thus a single simple solution is not possible.
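A minimal sketch of the final translation step mentioned above, assuming a recognized layout tree with invented markers such as :FRAC and :SUP; it is an illustration only, not the project's recognizer.

    (defun layout->lisp (node)
      "Rewrite a recognized layout tree as a Lisp prefix expression."
      (cond ((atom node) node)
            ((eq (first node) :frac)        ; horizontal quotient bar
             (list '/ (layout->lisp (second node)) (layout->lisp (third node))))
            ((eq (first node) :sup)         ; raised exponent
             (list 'expt (layout->lisp (second node)) (layout->lisp (third node))))
            (t                              ; infix row such as (A + B)
             (destructuring-bind (left op right) node
               (list op (layout->lisp left) (layout->lisp right))))))

    ;; (layout->lisp '((:frac x 2) + (:sup y 3)))  =>  (+ (/ X 2) (EXPT Y 3))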
Language![R]. What Works Clearinghouse Intervention Report
ERIC Educational Resources Information Center
What Works Clearinghouse, 2013
2013-01-01
"LANGUAGE!"[R] is a language arts intervention designed for struggling learners in grades 3-12 who score below the 40th percentile on standardized literacy tests. The curriculum integrates English literacy acquisition skills into a six-step lesson format. During a daily lesson, students work on six key literacy strands (which the…
Formal Analysis of the Remote Agent Before and After Flight
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.
2000-01-01
This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
Integrating medical imaging analyses through a high-throughput bundled resource imaging system
NASA Astrophysics Data System (ADS)
Covington, Kelsie; Welch, E. Brian; Jeong, Ha-Kyu; Landman, Bennett A.
2011-03-01
Exploitation of advanced, PACS-centric image analysis and interpretation pipelines provides well-developed storage, retrieval, and archival capabilities along with state-of-the-art data providence, visualization, and clinical collaboration technologies. However, pursuit of integrated medical imaging analysis through a PACS environment can be limiting in terms of the overhead required to validate, evaluate and integrate emerging research technologies. Herein, we address this challenge through presentation of a high-throughput bundled resource imaging system (HUBRIS) as an extension to the Philips Research Imaging Development Environment (PRIDE). HUBRIS enables PACS-connected medical imaging equipment to invoke tools provided by the Java Imaging Science Toolkit (JIST) so that a medical imaging platform (e.g., a magnetic resonance imaging scanner) can pass images and parameters to a server, which communicates with a grid computing facility to invoke the selected algorithms. Generated images are passed back to the server and subsequently to the imaging platform from which the images can be sent to a PACS. JIST makes use of an open application program interface layer so that research technologies can be implemented in any language capable of communicating through a system shell environment (e.g., Matlab, Java, C/C++, Perl, LISP, etc.). As demonstrated in this proof-of-concept approach, HUBRIS enables evaluation and analysis of emerging technologies within well-developed PACS systems with minimal adaptation of research software, which simplifies evaluation of new technologies in clinical research and provides a more convenient use of PACS technology by imaging scientists.
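To illustrate the shell-level protocol described above, a minimal Common Lisp sketch of a module that takes an input path and an output path from its command line; the argument convention and the identity "processing" step are assumptions, not JIST or HUBRIS code.

    (require :asdf)   ; UIOP, bundled with ASDF, gives portable shell/command-line access

    (defun module-main ()
      "Entry point for a hypothetical shell-invoked analysis module."
      (destructuring-bind (input-path output-path) (uiop:command-line-arguments)
        (let ((data (uiop:read-file-string input-path)))   ; stand-in for real image I/O
          (with-open-file (out output-path :direction :output :if-exists :supersede)
            (write-string data out)))))                    ; identity "processing"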
Situation assessment in the Paladin tactical decision generation system
NASA Technical Reports Server (NTRS)
Mcmanus, John W.; Chappell, Alan R.; Arbuckle, P. Douglas
1992-01-01
Paladin is a real-time tactical decision generator for air combat engagements. Paladin uses specialized knowledge-based systems and other Artificial Intelligence (AI) programming techniques to address the modern air combat environment and agile aircraft in a clear and concise manner. Paladin is designed to provide insight into both the tactical benefits and the costs of enhanced agility. The system was developed using the Lisp programming language on a specialized AI workstation. Paladin utilizes a set of air combat rules, an active throttle controller, and a situation assessment module that have been implemented as a set of highly specialized knowledge-based systems. The situation assessment module was developed to determine the tactical mode of operation (aggressive, defensive, neutral, evasive, or disengagement) used by Paladin at each decision point in the air combat engagement. Paladin uses the situation assessment module and the situationally dependent modes of operation to more accurately represent the complex decision-making process of human pilots. This allows Paladin to adapt its tactics to the current situation and improves system performance. Discussed here are the details of Paladin's situation assessment and modes of operation. The results of simulation testing showing the error introduced into the situation assessment module due to estimation errors in positional and geometric data for the opponent aircraft are presented. Implementation issues for real-time performance are discussed and several solutions are presented, including Paladin's use of an inference engine designed for real-time execution.
Gasquoine, Philip Gerard; Gonzalez, Cassandra Dayanira
2012-05-01
Conventional neuropsychological norms developed for monolinguals likely overestimate normal performance in bilinguals on language but not visual-perceptual format tests. This was studied by comparing neuropsychological false-positive rates using the 50th percentile of conventional norms and individual comparison standards (Picture Vocabulary or Matrix Reasoning scores) as estimates of preexisting neuropsychological skill level against the number expected from the normal distribution for a consecutive sample of 56 neurologically intact, bilingual, Hispanic Americans. Participants were tested in separate sessions in Spanish and English in the counterbalanced order on La Bateria Neuropsicologica and the original English language tests on which this battery was based. For language format measures, repeated-measures multivariate analysis of variance showed that individual estimates of preexisting skill level in English generated the mean number of false positives most approximate to that expected from the normal distribution, whereas the 50th percentile of conventional English language norms did the same for visual-perceptual format measures. When using conventional Spanish or English monolingual norms for language format neuropsychological measures with bilingual Hispanic Americans, individual estimates of preexisting skill level are recommended over the 50th percentile.
ERIC Educational Resources Information Center
Lewis, John D.
1998-01-01
Describes XML (extensible markup language), a new language classification submitted to the World Wide Web Consortium that is defined in terms of both SGML (Standard Generalized Markup Language) and HTML (Hypertext Markup Language), specifically designed for the Internet. Limitations of PDF (Portable Document Format) files for electronic journals…
Ground Operations Aerospace Language (GOAL)
NASA Technical Reports Server (NTRS)
1973-01-01
GOAL is a test-engineer-oriented language designed to standardize procedure terminology and to serve as the test programming language for ground checkout operations in a space vehicle launch environment. The material presented concerning GOAL includes: (1) a historical review, (2) development objectives and requirements, (3) language scope and format, and (4) language capabilities.
ERIC Educational Resources Information Center
Bulut, Mesut
2016-01-01
The aim of this study is to find out Anadolu University Open Education Faculty Turkish Language and Literature graduated students' views towards Pedagogical Formation Training certificate and their opinions about special teaching methods. This study has been done in one of the universities of East Karadeniz in Turkey in which the 20 Turkish…
Expert systems applied to fault isolation and energy storage management, phase 2
NASA Technical Reports Server (NTRS)
1987-01-01
A user's guide for the Fault Isolation and Energy Storage (FIES) II system is provided. Included are a brief discussion of the background and scope of this project, a discussion of basic and advanced operating installation and problem determination procedures for the FIES II system and information on hardware and software design and implementation. A number of appendices are provided including a detailed specification for the microprocessor software, a detailed description of the expert system rule base and a description and listings of the LISP interface software.
1988-04-30
spots and it gives milk and it chews cud and it has a long neck and it has long legs, then it is a giraffe. These rules are translated into a Lisp... implies tiger) ((ungulate and (long legs) and (long neck) and tawny and (dark spots)) implies giraffe) ((ungulate and white and (black stripes)) implies... it is a tiger. 11) If the animal is an ungulate and it has long legs and it has a long neck and it has a tawny color and it has dark spots then it is a
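A minimal sketch of how rules in the form quoted above can be evaluated once translated into Lisp; the matcher and the zebra rule's conclusion are assumptions for illustration, not the report's code.

    (defparameter *rules*
      '(((ungulate and (long legs) and (long neck) and tawny and (dark spots))
         implies giraffe)
        ((ungulate and white and (black stripes)) implies zebra)))

    (defun conditions (rule)
      "Strip the AND connectives from the left-hand side of RULE."
      (remove 'and (first rule)))

    (defun rule-fires-p (rule facts)
      (every (lambda (condition) (member condition facts :test #'equal))
             (conditions rule)))

    (defun conclude (facts)
      "Return the conclusion of every rule whose conditions all hold in FACTS."
      (loop for rule in *rules*
            when (rule-fires-p rule facts)
              collect (third rule)))

    ;; (conclude '(ungulate (long legs) (long neck) tawny (dark spots)))  =>  (GIRAFFE)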
Expert systems tools for Hubble Space Telescope observation scheduling
NASA Technical Reports Server (NTRS)
Miller, Glenn; Rosenthal, Don; Cohen, William; Johnston, Mark
1987-01-01
The utility of expert systems techniques for the Hubble Space Telescope (HST) planning and scheduling is discussed and a plan for development of expert system tools which will augment the existing ground system is described. Additional capabilities provided by these tools will include graphics-oriented plan evaluation, long-range analysis of the observation pool, analysis of optimal scheduling time intervals, constructing sequences of spacecraft activities which minimize operational overhead, and optimization of linkages between observations. Initial prototyping of a scheduler used the Automated Reasoning Tool running on a LISP workstation.
A user interface for a knowledge-based planning and scheduling system
NASA Technical Reports Server (NTRS)
Mulvehill, Alice M.
1988-01-01
The objective of EMPRESS (Expert Mission Planning and Replanning Scheduling System) is to support the planning and scheduling required to prepare science and application payloads for flight aboard the US Space Shuttle. EMPRESS was designed and implemented in Zetalisp on a 3600 series Symbolics Lisp machine. Initially, EMPRESS was built as a concept demonstration system. The system has since been modified and expanded to ensure that the data have integrity. Issues underlying the design and development of the EMPRESS-I interface, results from a system usability assessment, and consequent modifications are described.
An Expert-System Engine With Operative Probabilities
NASA Technical Reports Server (NTRS)
Orlando, N. E.; Palmer, M. T.; Wallace, R. S.
1986-01-01
The program enables proof-of-concept tests of expert systems under development. AESOP is a rule-based inference engine for an expert system which makes decisions about a particular situation given user-supplied hypotheses, rules, and answers to questions drawn from the rules. If a knowledge base containing hypotheses and rules governing an environment is available to AESOP, almost any situation within that environment can be resolved by answering the questions asked by AESOP. Questions are answered with YES, NO, MAYBE, DON'T KNOW, DON'T CARE, or with a probability factor ranging from 0 to 10. AESOP is written in Franz LISP for interactive execution.
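A minimal sketch of the question-and-answer style described above; the scoring rule that maps answers to a belief value is an assumption, not the AESOP engine.

    (defun answer->belief (answer)
      "Map YES/NO/MAYBE/DONT-KNOW/DONT-CARE or a 0-10 factor to a number in [0,1]."
      (case answer
        ((yes) 1.0)
        ((no) 0.0)
        ((maybe dont-know dont-care) 0.5)
        (otherwise (/ answer 10.0))))

    (defun evaluate-hypothesis (questions)
      "Ask each question, read the user's answer, and return the mean belief."
      (let ((scores (loop for question in questions
                          do (format t "~&~A? " question)
                          collect (answer->belief (read)))))
        (/ (reduce #'+ scores) (length scores))))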
1976-03-01
RESEARCH IN FUNCTIONALLY DISTRIBUTED COMPUTER SYSTEMS DEVELOPMENT, MAR 76, P. S. Fisher, F. Maryanski, DAA629-76-6-0108, UNCLASSIFIED, CS-76-08... RESEARCH IN FUNCTIONALLY DISTRIBUTED COMPUTER SYSTEMS DEVELOPMENT, Kansas State University, Virgil Wallentine, Principal Investigator. Approved for public release; distribution unlimited... U.S. ARMY COMPUTER SYSTEMS COMMAND, FT BELVOIR, VA
The Role of Discourse in Teaching Intercultural Professional Communication
ERIC Educational Resources Information Center
Kartabayeva, Ayana A.; Zhaitapova, Altynai A.
2016-01-01
With Kazakhstan's accession to the Bologna Process, particular importance is attached to the professionally-oriented approach of teaching foreign languages to students, which facilitates formation of their foreign language communicative ability. The article deals with the problem of teaching English to students for the purpose of formation of…
Genotype Specification Language.
Wilson, Erin H; Sagawa, Shiori; Weis, James W; Schubert, Max G; Bissell, Michael; Hawthorne, Brian; Reeves, Christopher D; Dean, Jed; Platt, Darren
2016-06-17
We describe here the Genotype Specification Language (GSL), a language that facilitates the rapid design of large and complex DNA constructs used to engineer genomes. The GSL compiler implements a high-level language based on traditional genetic notation, as well as a set of low-level DNA manipulation primitives. The language allows facile incorporation of parts from a library of cloned DNA constructs and from the "natural" library of parts in fully sequenced and annotated genomes. GSL was designed to engage genetic engineers in their native language while providing a framework for higher level abstract tooling. To this end we define four language levels, Level 0 (literal DNA sequence) through Level 3, with increasing abstraction of part selection and construction paths. GSL targets an intermediate language based on DNA slices that translates efficiently into a wide range of final output formats, such as FASTA and GenBank, and includes formats that specify instructions and materials such as oligonucleotide primers to allow the physical construction of the GSL designs by individual strain engineers or an automated DNA assembly core facility.
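As an illustration of the slice-to-output idea described above, a minimal sketch in Common Lisp (chosen to match the Lisp examples elsewhere in this collection; it is not the GSL compiler) that writes a list of DNA slices as a single FASTA record.

    (defun write-fasta (name slices &optional (stream *standard-output*) (width 60))
      "Write SLICES (strings over A/C/G/T) as one FASTA record called NAME."
      (let ((sequence (apply #'concatenate 'string slices)))
        (format stream ">~A~%" name)
        (loop for start from 0 below (length sequence) by width
              do (write-line (subseq sequence start
                                     (min (length sequence) (+ start width)))
                             stream))))

    ;; (write-fasta "construct-1" '("ATGGCT" "GGTACC" "TTAA"))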
ERIC Educational Resources Information Center
Chen, I-Jung; Yen, Jung-Chuan
2013-01-01
This study extends current knowledge by exploring the effect of different annotation formats, namely in-text annotation, glossary annotation, and pop-up annotation, on hypertext reading comprehension in a foreign language and vocabulary acquisition across student proficiencies. User attitudes toward the annotation presentation were also…
Processing of Formational, Semantic, and Iconic Information in American Sign Language.
ERIC Educational Resources Information Center
Poizner, Howard; And Others
1981-01-01
Three experiments examined short-term encoding processes of deaf signers for different aspects of signs from American Sign Language. Results indicated that deaf signers code signs at one level in terms of linguistically significant formational parameters. The semantic and iconic information of signs, however, has little effect on short-term…
Exploring Language Teacher Identity Work as Ethical Self-Formation
ERIC Educational Resources Information Center
Miller, Elizabeth R.; Morgan, Brian; Medina, Adriana L.
2017-01-01
In this article, we treat language teacher identity as foundational to educational practice and see Foucault's (1983, 1997) notion of ethical self-formation, and its adoption in teacher education research by Clarke (2008, 2009, 2010), as providing a potential vehicle for understanding the development of teacher agency and critical identity work.…
Reflections on Multiliterate Lives. Bilingual Education and Bilingualism 26.
ERIC Educational Resources Information Center
Belcher, Diane, Ed.; Connor, Ulla, Ed.
This edited volume is a collection of personal accounts, in narrative and interview format, of the formative literacy experiences of highly successful second language users, all of whom are professional academics. Representing 14 countries of origin, the contributors, who are well known specialists in language teaching as well as a variety of…
ERIC Educational Resources Information Center
Yumarnamto, Mateus
2016-01-01
This study explores the factors that contribute to Indonesian English language teachers' (ELTs) professional growth and identity formation. The main question investigated in this study is: What are the factors and challenges that contribute to Indonesian ELTs' professional growth and identity formation as reflected in their life histories,…
Developing Oral Language Skills in Middle School English Learners
ERIC Educational Resources Information Center
Fisher, Douglas; Frey, Nancy
2018-01-01
Oral language development can help English learners develop academic proficiency with the English language. In this investigation, at one middle school, teachers focused on improving oral language skills. Using a formative experiment process, the teachers developed an intervention to accomplish their pedagogical goal and then tracked data to see…
Creating Culturally Relevant Instructional Materials: A Swaziland Case Study
ERIC Educational Resources Information Center
Titone, Connie; Plummer, Emily C.; Kielar, Melissa A.
2012-01-01
In the field of English language learning, research proves that culturally relevant reading materials improve students' language acquisition, learning motivation, self-esteem, and identity formation. Since English is the language of instruction in many distant countries, such as Swaziland, even when English is not the native language of those…
NASA Technical Reports Server (NTRS)
Mclean, David R.; Tuchman, Alan; Potter, William J.
1991-01-01
Recently, many expert systems were developed in a LISP environment and then ported to the real-world C environment before the final system was delivered. This situation may require that the entire system be completely rewritten in C, and may actually result in a system which is put together as quickly as possible with little regard for maintainability and further evolution. With the introduction of high-performance UNIX and X-Windows-based workstations, many of the advantages of developing a first system in the LISP environment have become questionable. A C-based AI development effort is described which is based on a software tools approach with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to the implementation of frames and objects which use dynamic memory allocation. The implementation of procedures which use depth-first search, constraint propagation, context switching, and a blackboard-like simulation environment is described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are put together by describing the components of planning software called the Planning And Resource Reasoning (PARR) shell. This shell has been successfully utilized for scheduling services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite since May 1987 and will be used for operations scheduling of the Explorer Platform in November 1991.
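For comparison with the C port described above, a minimal Common Lisp sketch of two of the ideas being carried over: a frame as a table of slots, and a depth-first search over a successor function. This illustrates the concepts only; it is not the PARR code, which implements them in C.

    (defun make-frame (&rest slot-value-pairs)
      "Build a frame as a hash table of slot names and values."
      (let ((frame (make-hash-table)))
        (loop for (slot value) on slot-value-pairs by #'cddr
              do (setf (gethash slot frame) value))
        frame))

    (defun slot-value-of (frame slot)
      "Look up SLOT in FRAME."
      (gethash slot frame))

    (defun depth-first-search (start goal-p successors)
      "Return the first node satisfying GOAL-P reachable from START."
      (labels ((walk (node visited)
                 (cond ((funcall goal-p node) node)
                       ((member node visited) nil)
                       (t (some (lambda (next) (walk next (cons node visited)))
                                (funcall successors node))))))
        (walk start nil)))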
ISLE (Image and Signal Processing LISP Environment) reference manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherwood, R.J.; Searfus, R.M.
1990-01-01
ISLE is a rapid prototyping system for performing image and signal processing. It is designed to meet the needs of a person doing development of image and signal processing algorithms in a research environment. The image and signal processing modules in ISLE form a very capable package in themselves. They also provide a rich environment for quickly and easily integrating user-written software modules into the package. ISLE is well suited to applications in which there is a need to develop a processing algorithm in an interactive manner. It is straightforward to develop the algorithm, load it into ISLE, apply the algorithm to an image or signal, display the results, then modify the algorithm and repeat the develop-load-apply-display cycle. ISLE consists of a collection of image and signal processing modules integrated into a cohesive package through a standard command interpreter. The ISLE developers elected to concentrate their effort on developing image and signal processing software rather than developing a command interpreter. A COMMON LISP interpreter was selected for the command interpreter because it already has the features desired in a command interpreter, it supports dynamic loading of modules for customization purposes, it supports run-time parameter and argument type checking, it is very well documented, and it is a commercially supported product. This manual is intended to be a reference manual for the ISLE functions. The functions are grouped into a number of categories and briefly discussed in the Function Summary chapter. The full descriptions of the functions and all their arguments are given in the Function Descriptions chapter. 6 refs.
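A minimal sketch of the develop-load-apply-display cycle described above; the SMOOTH module and the *CURRENT-SIGNAL* variable are assumptions for illustration, not part of the ISLE package.

    (defun smooth (signal &optional (window 3))
      "Moving-average smoothing of SIGNAL, a vector of samples."
      (let* ((n (length signal))
             (out (make-array n :initial-element 0.0)))
        (dotimes (i n out)
          (let* ((lo (max 0 (- i (floor window 2))))
                 (hi (min n (+ lo window))))
            (setf (aref out i)
                  (/ (reduce #'+ signal :start lo :end hi) (- hi lo)))))))

    ;; At an ISLE-style prompt one would (load "smooth.lisp"), apply it with
    ;; (smooth *current-signal*), display the result, edit the file, and repeat.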
Development of Markup Language for Medical Record Charting: A Charting Language.
Jung, Won-Mo; Chae, Younbyoung; Jang, Bo-Hyoung
2015-01-01
Many efforts to collect electronic medical records (EMRs) are currently under way. However, structuring the data format for an EMR is an especially labour-intensive task for practitioners. Here we propose a new mark-up language for medical record charting (called Charting Language), which borrows useful properties from programming languages. Thus, with Charting Language, text data describing a dynamic situation can easily be used to extract information.
Development of a test and flight engineering oriented language. Phase 3: Presentation
NASA Technical Reports Server (NTRS)
Kamsler, W. F.; Case, C. W.; Kinney, E. L.; Gyure, J.
1970-01-01
The material used in an oral presentation of the phase 3 study effort is given in presentation format. The material includes a description of the language ALOFT and a terminology comparison with other test languages.
NASA Technical Reports Server (NTRS)
Raible, E.
1994-01-01
The Panel Library and Editor is a graphical user interface (GUI) builder for the Silicon Graphics IRIS workstation family. The toolkit creates "widgets" which can be manipulated by the user. Its appearance is similar to that of the X-Windows System. The Panel Library is written in C and is used by programmers writing user-friendly mouse-driven applications for the IRIS. GUIs built using the Panel Library consist of "actuators" and "panels." Actuators are buttons, dials, sliders, or other mouse-driven symbols. Panels are groups of actuators that occupy separate windows on the IRIS workstation. The application user can alter variables in the graphics program, or fire off functions with a click on a button. The evolution of data values can be tracked with meters and strip charts, and dialog boxes with text processing can be built. Panels can be stored as icons when not in use. The Panel Editor is a program used to interactively create and test panel library interfaces in a simple and efficient way. The Panel Editor itself uses a panel library interface, so all actions are mouse driven. Extensive context-sensitive on-line help is provided. Programmers can graphically create and test the user interface without writing a single line of code. Once an interface is judged satisfactory, the Panel Editor will dump it out as a file of C code that can be used in an application. The Panel Library (v9.8) and Editor (v1.1) are written in C-Language (63%) and Scheme, a dialect of LISP, (37%) for Silicon Graphics 4D series workstations running IRIX 3.2 or higher. Approximately 10Mb of disk space is required once compiled. 1.5Mb of main memory is required to execute the panel editor. This program is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format for an IRIS, and includes a copy of XScheme, the public-domain Scheme interpreter used by the Panel Editor. The Panel Library Programmer's Manual is included on the distribution media. The Panel Library and Editor were released to COSMIC in 1991. Silicon Graphics, IRIS, and IRIX are trademarks of Silicon Graphics, Inc. X-Window System is a trademark of Massachusetts Institute of Technology.
The Role of Irish Language Teaching: Cultural Identity Formation or Language Revitalization?
ERIC Educational Resources Information Center
Slatinská, Anna; Pecníková, Jana
2017-01-01
The focal point of the article is Irish language teaching in the Republic of Ireland. Firstly, we deal with the most significant documents where the status of the Irish language is being defined. In this respect, for the purposes of analysis, we have chosen the document titled "20 Year Strategy for the Irish language" which plays a…
Teaching English as a Language Not Subject by Employing Formative Assessment
ERIC Educational Resources Information Center
Chandio, Muhammad Tufail; Jafferi, Saima
2015-01-01
English is a second language (L2) in Sindh, Pakistan. Most of the public sector schools in Sindh teach English as a subject rather than a language. Besides, they do not distinguish between generic pedagogy and distinctive approaches used for teaching English as a first language (L1) and second language (L2). In addition, the erroneous traditional…
The Effect of Formative Assessments on Language Performance
ERIC Educational Resources Information Center
Radford, Brian W.
2014-01-01
This study sought to improve the language learning outcomes at the Missionary Training Center in Provo, Utah. Young men and women between the ages of 19-24 are taught a foreign language in an accelerated environment. In an effort to improve learning outcomes, computer-based practice and teaching of language performance criteria were provided to…
Handbook for Classroom Testing in Peace Corps Language Programs. Manual T0068.
ERIC Educational Resources Information Center
Anderson, Neil J.
This manual provides instructors in Peace Corps language training programs with information about two kinds of classroom testing: formative, ongoing testing and summative testing that occurs at the end of an instructional period. The first of the manual's four chapters on the purposes of language testing, discusses language testing within a…
Oral Language Proficiency Testing at the Foreign Service Institute. An Update--1983.
ERIC Educational Resources Information Center
Crawford, Gary D.; And Others
The Foreign Service Institute (FSI) has been engaged in oral language proficiency testing theory and practice for more than 20 years. The FSI test has been consistent during this time in format, evaluation criteria, performance standards, and level definitions. Current concerns about the degree of standardization of the format and the strength of…
The Effects of Syntactically Parsed Text Formats on Intensive Reading in EFL
ERIC Educational Resources Information Center
Herbert, John C.
2014-01-01
Separating text into meaningful language chunks, as with visual-syntactic text formatting, helps readers to process text more easily and language learners to recognize grammar and syntax patterns more quickly. Evidence of this exists in studies on native and non-native English speakers. However, recent studies question the role of VSTF in certain…
The Effect of the Multiple-Choice Item Format on the Measurement of Knowledge of Language Structure
ERIC Educational Resources Information Center
Currie, Michael; Chiramanee, Thanyapa
2010-01-01
Noting the widespread use of multiple-choice items in tests in English language education in Thailand, this study compared their effect against that of constructed-response items. One hundred and fifty-two university undergraduates took a test of English structure first in constructed-response format, and later in three, stem-equivalent…
ERIC Educational Resources Information Center
Källkvist, Marie; Hult, Francis M.
2016-01-01
In the wake of the enactment of Sweden's Language Act in 2009 and in the face of the growing presence of English, Swedish universities have been called upon by the Swedish Higher Education Authority to craft their own language policy documents. This study focuses on the discursive negotiation of institutional bilingualism by a language policy…
1978-10-01
DAM . * MA 00573 PHASE 1 INSPECTION REPORT NATIONAL DAM INSPECTION PROGRAM 0 V DEPARTMENT OF THE ARMY_ NEW ENGLAND DIVISION, CORPS OF ENGINEERS WALTHAM...ArIhtEN T (c- the 6b4trectenrto I lck 20, It dit.,.et i am Report) !6 5IPPLEMENTARY NOTES (,o ,er program reads: Phase I Inspection Report, National Darn...Inspection Program ; hov.ever, the official title of the program is: National Program for Inspection of lon-Federal Dans:. LISP cover date for date of
NASA TileWorld manual (system version 2.2)
NASA Technical Reports Server (NTRS)
Philips, Andrew B.; Bresina, John L.
1991-01-01
The commands of the NASA TileWorld simulator are documented, along with information about how to run it and extend it. The simulator, implemented in Common Lisp with Common Windows, encodes a particular range in a spectrum of domains for controllable research experiments. TileWorld consists of a two-dimensional grid of cells, a set of polygonal tiles, and a single agent which can grasp and move tiles. In addition to agent-executable actions, there is an external event over which the agent has no control; this event corresponds to a 'gust of wind'.
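A minimal Common Lisp sketch of the kind of world state described above; the slot names and the move rule are assumptions for illustration, not the NASA TileWorld source.

    (defstruct agent (row 0) (col 0) (grasped-tile nil))

    (defun make-grid (rows cols)
      "A two-dimensional grid of cells, initially all empty."
      (make-array (list rows cols) :initial-element :empty))

    (defun move-agent (agent grid direction)
      "Move AGENT one cell north, south, east, or west if that cell is free."
      (let ((row (agent-row agent)) (col (agent-col agent)))
        (ecase direction
          (:north (decf row)) (:south (incf row))
          (:east  (incf col)) (:west  (decf col)))
        (when (and (array-in-bounds-p grid row col)
                   (eq (aref grid row col) :empty))
          (setf (agent-row agent) row
                (agent-col agent) col))
        agent))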
1983-07-18
architecture. The design, performance, and cost of BRISC are presented. Performance is shown to be better than high-end mainframes such as the IBM 3081 and Amdahl 470V/8 on integer benchmarks written in C, Pascal and LISP. The cost, conservatively estimated to be $132,400, is about the same as a high-end minicomputer such as the VAX-11/780. BRISC has a CPU cycle time of 46 ns, providing a RISC I instruction execution rate of greater than 15 MIPS. BRISC is designed with a Structured Computer Aided Logic Design System (SCALD) by Valid Logic Systems. An evaluation of the utility of
ERIC Educational Resources Information Center
Singh, Sukhdev
2006-01-01
The issue of language attitudes has become important in view of the regular formation and growth of multi-lingual societies. The individuals are under constant pressure to learn more than one language because of pragmatic/cultural/political reasons. The languages in such situations compete and often generate linguistic controversies about the…
Job Language Performance Requirements for MOS 72E, Telecommunications Center Operator.
1982-10-01
Process Outgoing Message to be Retransmitted in Finished Card Format 113-572-MW0. Process Outgoing Message to be Transmitted to Magnetic Tape Format 113-5724006... LANGUAGE. The product of the entire data gathering and organization is the JLPR. These are relevant to all common and duty tasks
ERIC Educational Resources Information Center
Richards, Brian J.
2008-01-01
This article describes the development and validation of a diagnostic test of German and its integration in a programme of formative assessment during a one-year initial teacher-training course. The test focuses on linguistic aspects that cause difficulty for trainee teachers of German as a foreign language and assesses implicit and explicit…
ERIC Educational Resources Information Center
Tse, Lucy
2000-01-01
Examines one stage of ethnic identity formation and its affects on attitudes toward the heritage language among a group of Americans of Asian descent in the United States. Studied published narratives to discover whether feelings of ambivalence and evasion experienced by this population toward their ethnicity extended to the heritage language, and…
Gasquoine, Philip G; Weimer, Amy A; Amador, Arnoldo
2017-04-01
To measure specificity as failure rates for non-clinical, bilingual, Mexican Americans on three popular performance validity measures: (a) the language format Reliable Digit Span; (b) visual-perceptual format Test of Memory Malingering; and (c) visual-perceptual format Dot Counting, using optimal/suboptimal effort cut scores developed for monolingual English speakers. Participants were 61 consecutive referrals, aged between 18 and 65 years, with <16 years of education, who were subjectively bilingual (confirmed via formal assessment) and chose the language of assessment, Spanish or English, for the performance validity tests. Failure rates were 38% for Reliable Digit Span, 3% for the Test of Memory Malingering, and 7% for Dot Counting. For Reliable Digit Span, the failure rates for Spanish (46%) and English (31%) languages of administration did not differ significantly. Optimal/suboptimal effort cut scores derived for monolingual English speakers can be used with Spanish/English bilinguals when using the visual-perceptual format Test of Memory Malingering and Dot Counting. The high failure rate for Reliable Digit Span suggests it should not be used as a performance validity measure with Spanish/English bilinguals, irrespective of the language of test administration, Spanish or English.
Education within Sustainable Development: Critical Thinking Formation on ESL Class
NASA Astrophysics Data System (ADS)
Pevneva, Inna; Gavrishina, Olga; Smirnova, Anna; Rozhneva, Elena; Yakimova, Nataliya
2017-11-01
The article considers the formation of critical thinking in foreign language teaching within education for sustainable development, treating critical thinking as a crucial skill for prospective employees and future leaders in the Russian labour market. The necessity of including problem-based education and a critical thinking methodology in the foreign language class is justified, along with an analysis of the basic principles of critical thinking and of specific strategies that can be applied in class. This model targets students' communicative language competences as well as critical thinking, owing to the interconnection of various types of cognitive activity in class. The role of the formation and enhancement of critical thinking skills in students' personality development is also considered within the modern personality-oriented approach.
Collaborative Planning of Robotic Exploration
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Backes, Paul; Powell, Mark; Vona, Marsette; Steinke, Robert
2004-01-01
The Science Activity Planner (SAP) software system includes an uplink-planning component, which enables collaborative planning of activities to be undertaken by an exploratory robot on a remote planet or on Earth. Included in the uplink-planning component is the SAP-Uplink Browser, which enables users to load multiple spacecraft activity plans into a single window, compare them, and merge them. The uplink-planning component includes a subcomponent that implements the Rover Markup Language Activity Planning (RML-AP) format, based on the Extensible Markup Language (XML), which enables the representation, within a single document, of planned spacecraft and robotic activities together with the scientific reasons for the activities. Each such document is highly parseable and can be validated easily. Another subcomponent of the uplink-planning component is the Activity Dictionary Markup Language (ADML), which eliminates the need for two mission activity dictionaries - one in a human-readable format and one in a machine-readable format. Style sheets that have been developed along with the ADML format enable users to edit one dictionary in a user-friendly environment without compromising
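The RML-AP schema itself is not given in this abstract. As an illustration only, the Python sketch below parses a hypothetical XML activity plan in which each activity carries both command parameters and its scientific rationale, the pairing the abstract attributes to RML-AP; the element and attribute names are invented, not the actual RML-AP vocabulary.

    import xml.etree.ElementTree as ET

    # Hypothetical activity plan pairing commands with rationale; tag names are
    # illustrative only, not the actual RML-AP vocabulary.
    plan_xml = """
    <activityPlan sol="42">
      <activity id="IMG-001" instrument="pancam">
        <parameter name="azimuth_deg">135.0</parameter>
        <parameter name="exposure_ms">250</parameter>
        <rationale>Image layered outcrop to constrain deposition history.</rationale>
      </activity>
    </activityPlan>
    """

    root = ET.fromstring(plan_xml)
    for activity in root.iter("activity"):
        params = {p.get("name"): p.text for p in activity.iter("parameter")}
        rationale = activity.findtext("rationale")
        print(activity.get("id"), activity.get("instrument"), params, rationale)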
ERIC Educational Resources Information Center
Duda, R.; And Others
In second language learning, it is often the case that a discrepancy exists between the language of the pedagogical materials and that of the media or of the native speaker. This article discusses the advantages and problems involved in using authentic, non-didactic materials in post-introductory second language instruction, as found in an…
ERIC Educational Resources Information Center
Taherbhai, Husein; Seo, Daeryong; O'Malley, Kimberly
2014-01-01
English language learners (ELLs) are the fastest growing subgroup in American schools. These students, by a provision in the reauthorization of the Elementary and Secondary Education Act, are to be supported in their quest for language proficiency through the creation of systems that more effectively measure ELLs' progress across years. In…
Our health language and data collections.
Hovenga, Evelyn J S; Grain, Heather
2013-01-01
All communication within the health industry is dependent upon the use of our health language consisting of a very extensive and complex vocabulary. Converting this language into computable formats is necessary in a digital environment with a strong reliance on data, information and knowledge sharing. This chapter describes our health language, what terminologies and ontologies are, their use and relationships with natural language, indexing, data standards, data collections and the need for data governance.
ERIC Educational Resources Information Center
Afitska, Oksana
2014-01-01
A considerable number of studies on formative teacher assessment and feedback, learner self- and peer-assessment have been carried out in the field of Language Testing and Assessment (LTA) research over the last two decades. These studies investigated the above mentioned concepts from different perspectives (impact of assessment on learning,…
ERIC Educational Resources Information Center
Bell, Athene Cooper
2012-01-01
A formative design experiment methodology was employed to investigate the acquisition of early reading skills for high school English language learners (ELLs) beginning to read English. A fundamental challenge facing high school ELLs entering schools in the United States for the first time is learning how to read. While there is considerable…
ERIC Educational Resources Information Center
Akhmadullina, Rimma M.; Abdrafikova, Albina R.; Vanyukhina, Nadezhda V.
2016-01-01
The relevance of the topic is specified by the necessity of improving the quality of students' training in foreign languages for their mobility in the context of Russia's entry into the Bologna process. This article is intended to support the effective use of instructional techniques involving music and musical information with the aim of the formation of…
Electronic Means of Foreign Language Learning in the System of Higher Education
ERIC Educational Resources Information Center
Frolova, Natalia
2017-01-01
The integration of information and communication technologies into higher education, which enhances students' motivation and supports personalized learning, is challenging but beneficial. The challenge is particularly acute in the field of foreign language learning, which requires the formation of language competence along with knowledge of grammar patterns, vocabulary…
ERIC Educational Resources Information Center
Intermountain School, Brigham City, UT.
Based on a coordinated aural-oral approach, this language arts curriculum guide was developed to teach Navajo students English as a second language. The design of the curriculum provides for longitudinal and horizontal movements to favor concept formation by inductive experience. The plan gears instruction to three instructional levels: low (the…
Telemetry Attributes Transfer Standard (TMATS) Handbook
2015-07-01
Front matter and acronym list. Appendix A covers Extensible Markup Language (XML) TMATS differences. Acronyms defined include NRZ-L (non-return-to-zero level), TG (Telemetry Group), TM (telemetry), TMATS (Telemetry Attributes Transfer Standard), and XML (eXtensible Markup Language). TMATS attributes can be exchanged in XML format; the standard is maintained by the Range Commanders Council Telemetry Group.
XML Content Finally Arrives on the Web!
ERIC Educational Resources Information Center
Funke, Susan
1998-01-01
Explains extensible markup language (XML) and how it differs from hypertext markup language (HTML) and standard generalized markup language (SGML). Highlights include features of XML such as better formatting of documents, better searching capabilities, multiple uses for hyperlinking, and an increase in Web applications; Web browsers; and what…
Language-Rich Discussions for English Language Learners
ERIC Educational Resources Information Center
Zhang, Jie; Anderson, Richard C.; Nguyen-Jahiel, Kim
2013-01-01
A study involving 75 Spanish-speaking fifth graders from a school in the Chicago area investigated whether a peer-led, open-format discussion approach, known as Collaborative Reasoning, would accelerate the students' English language development. Results showed that, after participating in eight discussions over a four-week period, the CR group…
Literature in Language Lessons
ERIC Educational Resources Information Center
Zaiser, Richard
2018-01-01
Teaching modern foreign languages is not all about communicative skills. It is also about testing functional abilities. While we still pay lip service to the creed of communicative language teaching, we have adopted test formats and teaching styles that follow a hidden agenda: the production of human capital. The main objective of teaching is…
XML Schema Languages: Beyond DTD.
ERIC Educational Resources Information Center
Ioannides, Demetrios
2000-01-01
Discussion of XML (extensible markup language) and the traditional DTD (document type definition) format focuses on efforts of the World Wide Web Consortium's XML schema working group to develop a schema language to replace DTD that will be capable of defining the set of constraints of any possible data resource. (Contains 14 references.) (LRW)
Where Is Logo Taking Our Kids?
ERIC Educational Resources Information Center
Mace, Scott
1984-01-01
Discusses various aspects, features, and uses of the Logo programming language. A comparison (in chart format) of several Logo languages is also included, providing comments on the language as well as producer, current price, number of sprites and turtles, computer needed, and whether debugging aids and list operations are included. (JN)
The Integration of English Language Development and Science Instruction in Elementary Classrooms
NASA Astrophysics Data System (ADS)
Zwiep, Susan Gomez; Straits, William J.; Stone, Kristin R.; Beltran, Dolores D.; Furtado, Leena
2011-12-01
This paper explores one district's attempt to implement a blended science and English Language Development (ELD) elementary program, designed to provide English language learners opportunities to develop proficiency in English through participation in inquiry-based science. This process resulted in a blended program that utilized a combined science/ELD lesson plan format to structure and guide teachers' efforts to use science as the context for language development. Data, collected throughout the first 2 years of the program, include teacher-generated lesson plans, observation notes, and interviews with teachers and principals. The process by which the blended program was developed, the initial implementation of the program, the resulting science/ELD lesson plan format, and teachers' perceptions about the program and its impact on their students are described.
ERIC Educational Resources Information Center
Richmond, Edmun B.
The findings of a study of language and language education policy in each of the three independent nations of Comoros, Mauritius, and the Seychelles are reported in this book. Each country is discussed separately, focusing on the linguistic and educational history, the existing educational system, and current language policies and programs.…
ERIC Educational Resources Information Center
Jones, Rebecca A.
The social and psychological factors which affect one person's acquisition of a second language are described in journal format. The psychological factors discussed are: (1) language shock, (2) culture shock, and (3) culture stress. The two social factors examined are both grouped under the term "social distance" but include (1) types of…
ERIC Educational Resources Information Center
Trent, John; Gao, Xuesong
2009-01-01
A teacher shortage in Hong Kong in core subjects, such as English, has led to interest in the recruitment and retention of second-career teachers. Drawing upon Wenger's (1998) theory of identity formation and using data from interviews with eight second-career English language teachers in Hong Kong, this paper explores how second-career teachers…
Simple proteomics data analysis in the object-oriented PowerShell.
Mohammed, Yassene; Palmblad, Magnus
2013-01-01
Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."
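The chapter's examples are written in PowerShell and are not reproduced here. Purely as a language-neutral analogue of the tasks it describes, working with structured records parsed from XML and presenting them as formatted text, the Python sketch below reads a hypothetical mzML-like fragment and prints a small summary table; the element and attribute names are invented and real mzML is considerably richer.

    import xml.etree.ElementTree as ET

    # Hypothetical mzML-like fragment; illustrative only.
    doc = """
    <run>
      <spectrum id="scan=1" msLevel="1" basePeakMz="445.12" totalIonCurrent="1.8e6"/>
      <spectrum id="scan=2" msLevel="2" basePeakMz="512.30" totalIonCurrent="3.2e5"/>
    </run>
    """

    spectra = [s.attrib for s in ET.fromstring(doc).iter("spectrum")]

    # Structured records in, formatted text out.
    print(f"{'id':<10}{'msLevel':>8}{'basePeakMz':>12}")
    for s in spectra:
        print(f"{s['id']:<10}{s['msLevel']:>8}{float(s['basePeakMz']):>12.2f}")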
Temporal and contextual knowledge in model-based expert systems
NASA Technical Reports Server (NTRS)
Toth-Fejel, Tihamer; Heher, Dennis
1987-01-01
A basic paradigm that allows representation of physical systems with a focus on context and time is presented. Paragon provides the capability to quickly capture an expert's knowledge in a cognitively resonant manner. From that description, Paragon creates a simulation model in LISP, which, when executed, verifies that the domain expert did not make any mistakes. The Achilles' heel of rule-based systems has been the lack of a systematic methodology for testing, and Paragon's developers are certain that the model-based approach overcomes that problem. The reason this testing is now possible is that software, which is very difficult to test, has in essence been transformed into hardware.
Portal: Your Door to World Languages and Cultures
ERIC Educational Resources Information Center
Elliott, Don; Lawton, Rachele
2009-01-01
Portal: Your Door to World Languages and Cultures was a series of public cultural events, in a variety of formats, created through a new partnership between the credit and continuing education (noncredit) foreign language programs at the Community College of Baltimore County (CCBC). Portal was designed to cultivate interest in foreign languages…
Language and the Formation of Masculine Potency in Peruvian Aymara Boyhood
ERIC Educational Resources Information Center
Smith, Benjamin
2011-01-01
The dissertation project is a study of how children's increasingly skillful use of certain language forms ("stance forms") helps them to more effectively perform culturally salient social identities in discourse. Although scholars have long claimed that language use helps to position speakers in terms of socially recognizable identities,…
ERIC Educational Resources Information Center
Guardado, Martin
2010-01-01
This article, part of a larger study, examines three middle-class, Hispanic Canadian families' conceptualizations of language, culture, and identity. Via an analysis of interview data, the findings indicate that the parents assigned diverse meanings to heritage language development (HLD) and held high expectations for their children's formation of…
Awareness of Deaf Sign Language and Gang Signs.
ERIC Educational Resources Information Center
Smith, Cynthia; Morgan, Robert L.
There have been increasing incidents of innocent people who use American Sign Language (ASL) or another form of sign language being victimized by gang violence due to misinterpretation of ASL hand formations. ASL is familiar to learners with a variety of disabilities, particularly those in the deaf community. The problem is that gang members have…
Retrieval Performance and Indexing Differences in ABELL and MLAIB
ERIC Educational Resources Information Center
Graziano, Vince
2012-01-01
Searches for 117 British authors are compared in the Annual Bibliography of English Language and Literature (ABELL) and the Modern Language Association International Bibliography (MLAIB). Authors are organized by period and genre within the early modern era. The number of records for each author was subdivided by format, language of publication,…
Artificial Intelligence and Language Development and Language Usage with the Deaf.
ERIC Educational Resources Information Center
Leach, John Mark
The paper reviews research on the application of artificial intelligence (AI) in language development and/or instruction with the deaf. Contributions of computer assisted instruction are noted, as are the problems resulting from over-dependence on a drill and practice format and from deaf students' difficulties in receiving and understanding new…
Literary Translation as a Tool for Critical Language Planning
ERIC Educational Resources Information Center
Mooneeram, Roshni
2013-01-01
This paper argues that Dev Virahsawmy, an author who manipulates literary translation for the purposes of linguistic prestige formation and re-negotiation, is a critical language-policy practitioner, as his work fills an important gap in language planning scholarship. A micro-analysis of the translation of a Shakespearean sonnet into Mauritian…
Lenguaje y Ciencias (Language and Sciences), Vol. 17, No. 2.
ERIC Educational Resources Information Center
Zierer, Ernesto, Ed.
This issue contains three articles in Spanish, with abstracts in English, dealing with the following topics: (1) technical and scientific language; (2) some types of misrenderings by students in translating from English to Spanish and implications for language instruction; and (3) some theoretical aspects of the formation of technical terms in…
ERIC Educational Resources Information Center
Fredriksson, Christine
2015-01-01
Synchronous written chat and instant messaging are tools which have been used and explored in online language learning settings for at least two decades. Research literature has shown that such tools give second language (L2) learners opportunities for language learning, e.g., the interaction in real time with peers and native speakers, the…
ERIC Educational Resources Information Center
Silver, Steven S.
FMS/3 is a system for producing hard copy documentation at high speed from free format text and command input. The system was originally written in assembler language for a 12K IBM 360 model 20 using a high speed 1403 printer with the UCS-TN chain option (upper and lower case). Input was from an IBM 2560 Multi-function Card Machine. The model 20…
Personality, Category, and Cross-Linguistic Speech Sound Processing: A Connectivistic View
Li, Will X. Y.
2014-01-01
Category formation is a vital part of human perceptual and cognitive ability. The disciplines of neuroscience and linguistics, however, seldom mention it when the two are brought together. The present study reviews the neurological view of language acquisition as normalization of the incoming speech signal and attempts to suggest how speech sound category formation may connect personality with second language speech perception. Through a questionnaire, ego boundary (thick or thin), a correlate found to be related to category formation, was shown to be a positive indicator of personality type. Following this qualitative study, thick-boundary and thin-boundary English learners who are native speakers of Cantonese were given a speech-signal perception test using an ABX discrimination task protocol. Results showed that thick-boundary learners performed with significantly lower accuracy than thin-boundary learners. The implication is that differences in personality do have an impact on language learning. PMID:24757425
Evaluation of criteria for developing traffic safety materials for Latinos.
Streit-Kaplan, Erica L; Miara, Christine; Formica, Scott W; Gallagher, Susan Scavo
2011-03-01
This quantitative study assessed the validity of guidelines that identified four key characteristics of culturally appropriate Spanish-language traffic safety materials: language, translation, formative evaluation, and credible source material. From a sample of 190, the authors randomly selected 12 Spanish-language educational materials for analysis by 15 experts. Hypotheses included that the experts would rate materials with more of the key characteristics as more effective (likely to affect behavioral change) and rate materials originally developed in Spanish and those that utilized formative evaluation (e.g., pilot tests, focus groups) as more culturally appropriate. Although results revealed a weak association between the number of key characteristics in a material and the rating of its effectiveness, reviewers rated materials originally created in Spanish and those utilizing formative evaluation as significantly more culturally appropriate. The findings and methodology demonstrated important implications for developers and evaluators of any health-related materials for Spanish speakers and other population groups.
Comeau, Donald C.; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W. John
2014-01-01
BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net PMID:24935050
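The BioC C++ and Java pipeline code is available from the project URL above and is not reproduced here. Solely to make the listed pipeline stages concrete, the Python sketch below performs naive sentence segmentation and tokenization with regular expressions; it is not the BioC API and omits part-of-speech tagging, lemmatization, and parsing.

    import re

    def sentence_split(text):
        """Naive sentence segmentation on terminal punctuation."""
        return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

    def tokenize(sentence):
        """Naive tokenization: words and standalone punctuation."""
        return re.findall(r"\w+|[^\w\s]", sentence)

    text = "BRCA1 mutations increase cancer risk. They are heritable."
    for sent in sentence_split(text):
        print(tokenize(sent))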
Furley, Philip; Dicks, Matt; Memmert, Daniel
2012-02-01
In the present article, we investigate the effects of specific nonverbal behaviors signaling dominance and submissiveness on impression formation and outcome expectation in the soccer penalty kick situation. In Experiment 1, results indicated that penalty takers with dominant body language are perceived more positively by soccer goalkeepers and players and are expected to perform better than players with a submissive body language. This effect was similar for both video and point-light displays. Moreover, in contrast to previous studies, we found no effect of clothing (red vs. white) in the video condition. In Experiment 2, we used the implicit association test to demonstrate that dominant body language is implicitly associated with a positive soccer player schema whereas submissive body language is implicitly associated with a negative soccer player schema. The implications of our findings are discussed with reference to future implications for theory and research in the study of person perception in sport.
Widening the lens: what the manual modality reveals about language, learning and cognition.
Goldin-Meadow, Susan
2014-09-19
The goal of this paper is to widen the lens on language to include the manual modality. We look first at hearing children who are acquiring language from a spoken language model and find that even before they use speech to communicate, they use gesture. Moreover, those gestures precede, and predict, the acquisition of structures in speech. We look next at deaf children whose hearing losses prevent them from using the oral modality, and whose hearing parents have not presented them with a language model in the manual modality. These children fall back on the manual modality to communicate and use gestures, which take on many of the forms and functions of natural language. These homemade gesture systems constitute the first step in the emergence of manual sign systems that are shared within deaf communities and are full-fledged languages. We end by widening the lens on sign language to include gesture and find that signers not only gesture, but they also use gesture in learning contexts just as speakers do. These findings suggest that what is key in gesture's ability to predict learning is its ability to add a second representational format to communication, rather than a second modality. Gesture can thus be language, assuming linguistic forms and functions, when other vehicles are not available; but when speech or sign is possible, gesture works along with language, providing an additional representational format that can promote learning. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Group B Strep Infection in Newborns
CDC web page on group B Streptococcus (group B strep) disease in newborns; related resources include Active Bacterial Core surveillance (ABCs), the CDC Streptococcus Laboratory, and sepsis information. Available in English (US) and Spanish (Español).
Ewe (for Togo). Special Skills Handbook. Peace Corps Language Handbook Series.
ERIC Educational Resources Information Center
Kozelka, Paul R., Comp.; Agbovi, Yao Ete, Comp.
A book of language and cultural material for teachers and students of Ewe presents vocabulary lists and samples of Ewe language in various contexts, including letters, essays, and newspaper articles. Although not presented in lesson format, the material can be adapted by teachers or used by students for independent study. It is divided into two…
I Am My Language: Discourses of Women & Children in the Borderlands.
ERIC Educational Resources Information Center
Gonzalez, Norma
This book looks at language practices in Mexican American households in Tucson (Arizona), using language as a window to peer into the complexities of women's and children's lives in the borderlands. The notion is presented that the complexity inherent in the borderlands in general and in Tucson in particular is a formative factor in the language…
ERIC Educational Resources Information Center
Granados, Nadia Regina
2015-01-01
Through a Communities of Practice Network Analysis, this research illustrates the ways in which dual language graduates participate in multiple, varied, and overlapping communities of practice across time. Findings highlight that the dual language school as a shared community of practice represents a critical and formative part of participants'…
On Parle Francais Ici: The People of the St. John Valley Have a Tremendous Advantage.
ERIC Educational Resources Information Center
Banville, Beurmond J.
1995-01-01
A change in philosophy concerning the maintenance of native languages has led to local efforts to revive the French language in the St. John Valley (Maine), including the formation of a community organization and implementation of language programs in which children in all grades receive daily instruction in French. (LP)
Education Course Syllabus Development, Thai Language Major According to Buddhism Way of Thailand
ERIC Educational Resources Information Center
Waree, Chaiwat
2016-01-01
This research aims to develop an education course syllabus for the Thai language major according to the Buddhist way of Thailand, using Taba's approach, and to evaluate the efficiency of that syllabus. The research was conducted according to a research and development format and its…
Le Tutorat: Structure et Outil de Formation (Tutoring: Structure and Instruments of Development).
ERIC Educational Resources Information Center
Berthet, A.; And Others
1996-01-01
Focuses on how the Alliance Francaise of Paris ensures the professional development of future teachers of foreign languages. The article describes the members of the Alliance as professors at the Foreign Language School of Paris with university training in such diverse fields as linguistics, modern languages, education, and philosophy. Their…
Modern Languages and Distance Education: Thirteen Days in the Cloud
ERIC Educational Resources Information Center
Dona, Elfe; Stover, Sheri; Broughton, Nancy
2014-01-01
This research study documents the journey of two modern language faculty (Spanish and German) from their original beliefs that teaching foreign languages can only be conducted in a face-to-face format to their eventual development of an online class using Web 2.0 technologies to encourage their students' active skills of reading and speaking in…
A Manual for Assessing Language Growth in Instructional Settings.
ERIC Educational Resources Information Center
Swinton, Spencer S.
This manual is designed to assist administrators of English-as-a-second-language programs in assessing students' language growth. It begins by reviewing some of the concepts and terminology to be used. It then goes on to suggest and illustrate data-recording formats and methods of summarizing raw gains. This is followed by an example based on…
Language Policy in British Colonial Education: Evidence from Nineteenth-Century Hong Kong
ERIC Educational Resources Information Center
Evans, Stephen
2006-01-01
This article examines the evolution of language-in-education policy in Hong Kong during the first six decades of British rule (1842-1902). In particular, it analyses the changing roles and status of the English and Chinese languages during this formative period in the development of the colony's education system. The textual and statistical data…
ERIC Educational Resources Information Center
Lee, Boh Young
2013-01-01
This study explores the beliefs and attitudes that Korean immigrant parents and their children in the USA hold about their heritage language. Data were collected through interviews. This study addresses how parents' perspectives and their actual heritage language practices with their children influence their children's cultural identity and…
An Overview of Genomic Sequence Variation Markup Language (GSVML)
Nakaya, Jun; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Kimura, Michio
2006-01-01
Internationally accumulated data on human genomic sequence variation require an interoperable data exchange format. We developed GSVML as such a data exchange format. GSVML is oriented toward human health and has three categories. Analyses of use cases in the human health domain and an investigation of existing databases and markup languages were conducted, and the ability to interface with the Health Level Seven Genotype Model was examined. GSVML provides a sharable platform for both clinical and research applications.
Conversion of Radiology Reporting Templates to the MRRT Standard.
Kahn, Charles E; Genereaux, Brad; Langlotz, Curtis P
2015-10-01
In 2013, the Integrating the Healthcare Enterprise (IHE) Radiology workgroup developed the Management of Radiology Report Templates (MRRT) profile, which defines both the format of radiology reporting templates using an extension of Hypertext Markup Language version 5 (HTML5), and the transportation mechanism to query, retrieve, and store these templates. Of 200 English-language report templates published by the Radiological Society of North America (RSNA), initially encoded as text and in an XML schema language, 168 have been converted successfully into MRRT using a combination of automated processes and manual editing; conversion of the remaining 32 templates is in progress. The automated conversion process applied Extensible Stylesheet Language Transformation (XSLT) scripts, an XML parsing engine, and a Java servlet. The templates were validated for proper HTML5 and MRRT syntax using web-based services. The MRRT templates allow radiologists to share best-practice templates across organizations and have been uploaded to the template library to supersede the prior XML-format templates. By using MRRT transactions and MRRT-format templates, radiologists will be able to directly import and apply templates from the RSNA Report Template Library in their own MRRT-compatible vendor systems. The availability of MRRT-format reporting templates will stimulate adoption of the MRRT standard and is expected to advance the sharing and use of templates to improve the quality of radiology reports.
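The RSNA conversion pipeline combined XSLT scripts, an XML parsing engine, and a Java servlet; those scripts are not reproduced in this abstract. As a generic sketch of the XSLT step only, the Python snippet below applies a stylesheet to an XML template using the third-party lxml library; the file names are placeholders, not the actual RSNA artifacts.

    from lxml import etree  # third-party: pip install lxml

    # Placeholder file names standing in for a stylesheet and a source template.
    stylesheet = etree.XSLT(etree.parse("template_to_mrrt.xsl"))
    source = etree.parse("chest_xray_template.xml")

    result = stylesheet(source)  # apply the transformation
    with open("chest_xray_template.html", "wb") as out:
        out.write(etree.tostring(result, pretty_print=True))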
Speech evaluation in children with temporomandibular disorders.
Pizolato, Raquel Aparecida; Fernandes, Frederico Silva de Freitas; Gavião, Maria Beatriz Duarte
2011-10-01
The aims of this study were to evaluate the influence of temporomandibular disorders (TMD) on speech in children, and to verify the influence of occlusal characteristics. Speech and dental occlusal characteristics were assessed in 152 Brazilian children (78 boys and 74 girls), aged 8 to 12 (mean age 10.05 ± 1.39 years), with or without TMD signs and symptoms. The clinical signs were evaluated using the Research Diagnostic Criteria for TMD (RDC/TMD) (axis I) and the symptoms were evaluated using a questionnaire. The following groups were formed: Group TMD (n=40), TMD signs and symptoms (Group S and S, n=68), TMD signs or symptoms (Group S or S, n=33), and without signs and symptoms (Group N, n=11). Articulatory speech disorders were diagnosed during spontaneous speech and repetition of words using the "Phonological Assessment of Child Speech" for the Portuguese language. A list of 40 phonologically balanced words, read by the speech pathologist and repeated by the children, was also applied. Data were analyzed by descriptive statistics and Fisher's exact or Chi-square tests (α=0.05). A slight prevalence of articulatory disturbances, such as substitutions, omissions and distortions of the sibilants /s/ and /z/, and no deviations in jaw lateral movements were observed. Reduction of vertical amplitude was found in 10 children, the prevalence being greater in children with TMD signs and symptoms than in normal children. Tongue protrusion in the phonemes /t/, /d/, /n/ and /l/ and frontal lip positioning in the phonemes /s/ and /z/ were the most prevalent visual alterations. There was a high percentage of dental occlusal alterations. There was no association between TMD and speech disorders. Occlusal alterations may be factors of influence, allowing distortions and frontal lisp in the phonemes /s/ and /z/ and inadequate tongue position in the phonemes /t/, /d/, /n/ and /l/.
Transformation reborn: A new generation expert system for planning HST operations
NASA Technical Reports Server (NTRS)
Gerb, Andrew
1991-01-01
The Transformation expert system (TRANS) converts proposals for astronomical observations with the Hubble Space Telescope (HST) into detailed observing plans. It encodes expert knowledge to solve problems faced in planning and commanding HST observations to enable their processing by the Science Operations Ground System (SOGS). Among these problems are determining an acceptable order of executing observations, grouping of observations to enhance efficiency and schedulability, inserting extra observations when necessary, and providing parameters for commanding HST instruments. TRANS is currently an operational system and plays a critical role in the HST ground system. It was originally designed using forward-chaining provided by the OPS5 expert system language, but has been reimplemented using a procedural knowledge base. This reimplementation was forced by the explosion in the amount of OPS5 code required to specify the increasingly complicated situations requiring expert-level intervention by the TRANS knowledge base. This problem was compounded by the difficulty of avoiding unintended interaction between rules. To support the TRANS knowledge base, XCL, a small but powerful extension to Common Lisp, was implemented. XCL allows a compact syntax for specifying assignments and references to object attributes. XCL also provides the capability to iterate over objects and perform keyed lookup. The reimplementation of TRANS has greatly diminished the effort needed to maintain and enhance it. As a result of this, its functions have been expanded to include warnings about observations that are difficult or impossible to schedule or command, providing data to aid SPIKE, an intelligent planning system used for HST long-term scheduling, and providing information to the Guide Star Selection System (GSSS) to aid in determination of the long-range availability of guide stars.
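XCL is an in-house extension to Common Lisp and its syntax is not shown in the abstract. As a loose, language-shifted analogue only, the Python sketch below illustrates the three capabilities the abstract attributes to XCL: compact assignment and reference of object attributes, iteration over objects, and keyed lookup. The class and field names are invented and none of this is XCL itself.

    from dataclasses import dataclass

    @dataclass
    class Observation:
        obs_id: str
        instrument: str
        exposure_s: float

    # Iteration over objects plus keyed lookup by observation id.
    observations = [
        Observation("O1", "WFPC", 120.0),
        Observation("O2", "FOS", 300.0),
    ]
    by_id = {obs.obs_id: obs for obs in observations}

    by_id["O2"].exposure_s = 450.0            # compact attribute assignment
    long_ones = [o.obs_id for o in observations if o.exposure_s > 200]
    print(long_ones)                          # ['O2']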
NASA Technical Reports Server (NTRS)
Hockaday, Stephen; Kuhlenschmidt, Sharon (Editor)
1991-01-01
The objective of the workshop was to explore the role of human factors in facilitating the introduction of artificial intelligence (AI) to advanced air traffic control (ATC) automation concepts. AI is an umbrella term which is continually expanding to cover a variety of techniques where machines are performing actions taken based upon dynamic, external stimuli. AI methods can be implemented using more traditional programming languages such as LISP or PROLOG, or they can be implemented using state-of-the-art techniques such as object-oriented programming, neural nets (hardware or software), and knowledge based expert systems. As this technology advances and as increasingly powerful computing platforms become available, the use of AI to enhance ATC systems can be realized. Substantial efforts along these lines are already being undertaken at the FAA Technical Center, NASA Ames Research Center, academic institutions, industry, and elsewhere. Although it is clear that the technology is ripe for bringing computer automation to ATC systems, the proper scope and role of automation are not at all apparent. The major concern is how to combine human controllers with computer technology. A wide spectrum of options exists, ranging from using automation only to provide extra tools to augment decision making by human controllers to turning over moment-by-moment control to automated systems and using humans as supervisors and system managers. Across this spectrum, it is now obvious that the difficulties that occur when tying human and automated systems together must be resolved so that automation can be introduced safely and effectively. The focus of the workshop was to further explore the role of injecting AI into ATC systems and to identify the human factors that need to be considered for successful application of the technology to present and future ATC systems.
Comeau, Donald C; Liu, Haibin; Islamaj Doğan, Rezarta; Wilbur, W John
2014-01-01
BioC is a new format and associated code libraries for sharing text and annotations. We have implemented BioC natural language preprocessing pipelines in two popular programming languages: C++ and Java. The current implementations interface with the well-known MedPost and Stanford natural language processing tool sets. The pipeline functionality includes sentence segmentation, tokenization, part-of-speech tagging, lemmatization and sentence parsing. These pipelines can be easily integrated along with other BioC programs into any BioC compliant text mining systems. As an application, we converted the NCBI disease corpus to BioC format, and the pipelines have successfully run on this corpus to demonstrate their functionality. Code and data can be downloaded from http://bioc.sourceforge.net. Database URL: http://bioc.sourceforge.net. © The Author(s) 2014. Published by Oxford University Press.
Good-enough linguistic representations and online cognitive equilibrium in language processing.
Karimi, Hossein; Ferreira, Fernanda
2016-01-01
We review previous research showing that representations formed during language processing are sometimes just "good enough" for the task at hand and propose the "online cognitive equilibrium" hypothesis as the driving force behind the formation of good-enough representations in language processing. Based on this view, we assume that the language comprehension system by default prefers to achieve as early as possible and remain as long as possible in a state of cognitive equilibrium where linguistic representations are successfully incorporated with existing knowledge structures (i.e., schemata) so that a meaningful and coherent overall representation is formed, and uncertainty is resolved or at least minimized. We also argue that the online equilibrium hypothesis is consistent with current theories of language processing, which maintain that linguistic representations are formed through a complex interplay between simple heuristics and deep syntactic algorithms and also theories that hold that linguistic representations are often incomplete and lacking in detail. We also propose a model of language processing that makes use of both heuristic and algorithmic processing, is sensitive to online cognitive equilibrium, and, we argue, is capable of explaining the formation of underspecified representations. We review previous findings providing evidence for underspecification in relation to this hypothesis and the associated language processing model and argue that most of these findings are compatible with them.
OIL—Output input language for data connectivity between geoscientific software applications
NASA Astrophysics Data System (ADS)
Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar
2010-05-01
Geoscientific computing has become so complex that no single software application can perform all the processing steps required to get the desired results. Thus, for a given set of analyses, several specialized software applications are required, which must be interconnected for electronic flow of data. In this network of applications the outputs of one application become inputs of other applications. Each of these applications usually involves more than one data type and may have its own data formats, making it incompatible with other applications in terms of data connectivity. Consequently, several data format conversion utilities are developed in-house to provide data connectivity between applications. Practically there is no end to this problem, as each time a new application is added to the system, a set of new data conversion utilities needs to be developed. This paper presents a flexible data format engine, programmable through a platform-independent, interpreted language named Output Input Language (OIL). Its unique architecture allows input and output formats to be defined independently of each other by two separate programs. Thus, read and write for each format is coded only once, and a data connectivity link between two formats is established by a combination of their read and write programs. This results in fewer programs with no redundancy and maximum reuse, enabling rapid application development and easy maintenance of data connectivity links.
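OIL's own syntax is not given in the abstract, but the architecture it describes, coding each format's read and write logic once and forming a converter by composing a reader with a writer, can be sketched in a few lines of Python. The format names and record structure below are invented for illustration.

    import csv, json, io

    # Each format contributes one reader and one writer; a converter is simply
    # the reader of one format composed with the writer of another.
    def read_csv(text):
        return list(csv.DictReader(io.StringIO(text)))

    def write_csv(records):
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()

    def read_json(text):
        return json.loads(text)

    def write_json(records):
        return json.dumps(records, indent=2)

    def convert(text, reader, writer):
        return writer(reader(text))

    csv_text = "station,depth_m\nA1,120\nA2,85\n"
    print(convert(csv_text, read_csv, write_json))  # CSV in, JSON out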
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodall, John; Iannacone, Mike; Athalye, Anish
2013-08-01
Morph is a framework and domain-specific language (DSL) that helps parse and transform structured documents. It currently supports several file formats including XML, JSON, and CSV, and custom formats are usable as well.
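Morph's DSL syntax is not given in this record. As an illustration of the general idea of a declarative transform over a structured document, the Python sketch below applies a tiny mapping of output fields to dotted paths in a JSON document; the mapping format, field names, and data are invented and are not Morph's actual syntax.

    import json

    # Tiny declarative mapping, loosely in the spirit of a transform DSL
    # (not Morph's syntax): output field -> dotted path into the source.
    MAPPING = {"id": "record.accession", "organism": "record.source.organism"}

    def get_path(doc, dotted):
        for key in dotted.split("."):
            doc = doc[key]
        return doc

    def transform(doc, mapping):
        return {out: get_path(doc, path) for out, path in mapping.items()}

    source = json.loads('{"record": {"accession": "X17276", "source": {"organism": "H. sapiens"}}}')
    print(transform(source, MAPPING))  # {'id': 'X17276', 'organism': 'H. sapiens'}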
ERIC Educational Resources Information Center
Russell, Joan
This paper argues that verbal language plays a mediating role in the development of musical thinking. Two excerpts from the transcripts of conducting students' practica are interpreted. The verbal language that student conductors use during their rehearsals is a path to their musical thinking. Lev Vygotsky's social psychological theory of language…
ERIC Educational Resources Information Center
Pekel, Haci Ahmet
2014-01-01
Foreign languages, computer programs, and social network applications or websites are widely used by many people nowadays for various purposes. In the literature, the number of studies investigating university departments of physical education, or more specifically sports students' and teachers' foreign language and social networking…
ERIC Educational Resources Information Center
Remine, Maria D.; Care, Esther; Brown, P. Margaret
2008-01-01
The internal use of language during problem solving is considered to play a key role in executive functioning. This role provides a means for self-reflection and self-questioning during the formation of rules and plans and a capacity to control and monitor behavior during problem-solving activity. Given that increasingly sophisticated language is…
ERIC Educational Resources Information Center
Petrov, Lisa Amor
2013-01-01
This article presents research findings from a pilot study of the use of service-learning in an intermediate-high class ("Spanish Language and Culture for Heritage Speakers") in the fall semesters of 2010 and 2011. Students reported gains in the areas of communication skills, dispositional learning, language, identity formation, and…
ERIC Educational Resources Information Center
Granger, Sylviane; Kraif, Olivier; Ponton, Claude; Antoniadis, Georges; Zampa, Virginie
2007-01-01
Learner corpora, electronic collections of spoken or written data from foreign language learners, offer unparalleled access to many hitherto uncovered aspects of learner language, particularly in their error-tagged format. This article aims to demonstrate the role that the learner corpus can play in CALL, particularly when used in conjunction with…
ERIC Educational Resources Information Center
Gonzalez, Virginia; And Others
A study investigated the interaction of cognitive, cultural, and linguistic factors in second-language concept formation in adults. Specifically, it examined how seven college students in a lower-division intensive Spanish class developed new gender concepts when learning a second language. Course instruction focused on concept construction at…
ERIC Educational Resources Information Center
Isonio, Steven
During spring 1992, the Combined English Language Skills Assessment (CELSA) test was piloted with a sample of English-as-a-Second-Language (ESL) classes at Golden West College (GWC) in Huntington Beach, California. The CELSA, which utilizes a cloze format including parts of conversations and short dialogues, combines items from beginning,…
Francis, Wendy S; Taylor, Randolph S; Gutiérrez, Marisela; Liaño, Mary K; Manzanera, Diana G; Penalver, Renee M
2018-05-19
Two experiments investigated how well bilinguals utilise long-standing semantic associations to encode and retrieve semantic clusters in verbal episodic memory. In Experiment 1, Spanish-English bilinguals (N = 128) studied and recalled word and picture sets. Word recall was equivalent in L1 and L2, picture recall was better in L1 than in L2, and the picture superiority effect was stronger in L1 than in L2. Semantic clustering in word and picture recall was equivalent in L1 and L2. In Experiment 2, Spanish-English bilinguals (N = 128) and English-speaking monolinguals (N = 128) studied and recalled word sequences that contained semantically related pairs. Data were analyzed using a multinomial processing tree approach, the pair-clustering model. Cluster formation was more likely for semantically organised than for randomly ordered word sequences. Probabilities of cluster formation, cluster retrieval, and retrieval of unclustered items did not differ across languages or language groups. Language proficiency has little if any impact on the utilisation of long-standing semantic associations, which are language-general.
Importing MAGE-ML format microarray data into BioConductor.
Durinck, Steffen; Allemeersch, Joke; Carey, Vincent J; Moreau, Yves; De Moor, Bart
2004-12-12
The microarray gene expression markup language (MAGE-ML) is a widely used XML (eXtensible Markup Language) standard for describing and exchanging information about microarray experiments. It can describe microarray designs, microarray experiment designs, gene expression data and data analysis results. We describe RMAGEML, a new Bioconductor package that provides a link between cDNA microarray data stored in MAGE-ML format and the Bioconductor framework for preprocessing, visualization and analysis of microarray experiments. http://www.bioconductor.org. Open Source.
Adaptive Language Games with Robots
NASA Astrophysics Data System (ADS)
Steels, Luc
2010-11-01
This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade, going from simple simulations of lexicon formation with animal-like cybernetic robots to sophisticated grammatical experiments with humanoid robots.
Molinaro, Nicola; Giannelli, Francesco; Caffarra, Sendy; Martin, Clara
2017-07-01
Language comprehension is largely supported by predictive mechanisms that account for the ease and speed with which communication unfolds. Both native and proficient non-native speakers can efficiently handle contextual cues to generate reliable linguistic expectations. However, the link between the variability of the linguistic background of the speaker and the hierarchical format of the representations predicted is still not clear. We here investigate whether native language exposure to typologically highly diverse languages (Spanish and Basque) affects the way early balanced bilingual speakers carry out language predictions. During Spanish sentence comprehension, participants developed predictions of words the form of which (noun ending) could be either diagnostic of grammatical gender values (transparent) or totally ambiguous (opaque). We measured electrophysiological prediction effects time-locked both to the target word and to its determiner, with the former being expected or unexpected. Event-related (N200-N400) and oscillatory activity in the low beta-band (15-17Hz) frequency channel showed that both Spanish and Basque natives optimally carry out lexical predictions independently of word transparency. Crucially, in contrast to Spanish natives, Basque natives displayed visual word form predictions for transparent words, in consistency with the relevance that noun endings (post-nominal suffixes) play in their native language. We conclude that early language exposure largely shapes prediction mechanisms, so that bilinguals reading in their second language rely on the distributional regularities that are highly relevant in their first language. More importantly, we show that individual linguistic experience hierarchically modulates the format of the predicted representation. Copyright © 2017 Elsevier B.V. All rights reserved.
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Phillips, T. A.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
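NETS itself is written in ANSI C and its sources are not included here. To make the training loop described above concrete (an input layer, a hidden layer, an output layer, and back propagation on input/output pairs until the error is acceptable), here is a minimal NumPy sketch of a one-hidden-layer network trained on XOR; it illustrates the method only and is not NETS code.

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # input patterns
    Y = np.array([[0], [1], [1], [0]], dtype=float)              # target outputs (XOR)

    W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden layer
    W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output layer
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    lr = 0.5

    for epoch in range(20000):
        H = sigmoid(X @ W1 + b1)            # forward pass: hidden activations
        O = sigmoid(H @ W2 + b2)            # forward pass: network output
        err = O - Y
        if np.mean(err ** 2) < 1e-3:        # train until the error is acceptable
            break
        dO = err * O * (1 - O)              # back propagation through the output layer
        dH = (dO @ W2.T) * H * (1 - H)      # ... and through the hidden layer
        W2 -= lr * H.T @ dO
        b2 -= lr * dO.sum(axis=0)
        W1 -= lr * X.T @ dH
        b1 -= lr * dH.sum(axis=0)

    print(np.round(O.ravel(), 2))           # typically approaches [0, 1, 1, 0]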
NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACHINE INDEPENDENT VERSION)
NASA Technical Reports Server (NTRS)
Baffes, P. T.
1994-01-01
NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
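To make the back-propagation learning method described above concrete, the following is a minimal sketch in Python of training a small network on input/output pairs until the error becomes acceptable. It is an illustration of the general algorithm only, not NETS itself (which is written in ANSI C); the layer sizes, learning rate, and XOR training pairs are arbitrary assumptions.

```python
import numpy as np

# Minimal back-propagation sketch (illustrative only; NETS itself is written in ANSI C).
# Network: 2 inputs -> 3 hidden -> 1 output, trained on input/output pairs (here, XOR).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # input-layer stimuli
Y = np.array([[0], [1], [1], [0]], dtype=float)               # desired output responses

W1 = rng.normal(scale=0.5, size=(2, 3))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(3, 1))  # hidden -> output weights
b1, b2 = np.zeros(3), np.zeros(1)
lr = 0.5                                 # learning rate (assumed value)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):               # train until the error is acceptable
    h = sigmoid(X @ W1 + b1)             # forward pass: hidden layer
    y = sigmoid(h @ W2 + b2)             # forward pass: output layer
    err = y - Y
    # Backward pass: propagate the error and update the weights.
    d_out = err * y * (1 - y)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0)

print("mean squared error:", float((err ** 2).mean()))
```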
Astronomical Instrumentation System Markup Language
NASA Astrophysics Data System (ADS)
Goldbaum, Jesse M.
2016-05-01
The Astronomical Instrumentation System Markup Language (AISML) is an Extensible Markup Language (XML) based file format for maintaining and exchanging information about astronomical instrumentation. The factors behind the need for an AISML are first discussed, followed by the reasons why XML was chosen as the format. Next, it is shown how XML also provides the framework for a more precise definition of an astronomical instrument and how these instruments can be combined to form an Astronomical Instrumentation System (AIS). AISML files for several instruments, as well as one for a sample AIS, are provided. The files demonstrate how AISML can be utilized for various tasks, from web page generation and programming interfaces to instrument maintenance and quality management. The advantages of widespread adoption of AISML are discussed.
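As a rough illustration of what an XML-based instrument description can look like, the sketch below builds and serializes a tiny instrument record in Python. The element and attribute names are invented for this example and do not reflect the actual AISML schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of an XML instrument description in the spirit of AISML;
# the element and attribute names below are invented for illustration and do not
# reflect the actual AISML schema.
instrument = ET.Element("Instrument", name="ExampleSpectrograph")
ET.SubElement(instrument, "Manufacturer").text = "Example Optics"
optics = ET.SubElement(instrument, "Optics")
ET.SubElement(optics, "FocalLength", unit="mm").text = "500"
ET.SubElement(optics, "Grating", linesPerMM="1200")

print(ET.tostring(instrument, encoding="unicode"))
```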
Benchmarking and performance analysis of the CM-2. [SIMD computer
NASA Technical Reports Server (NTRS)
Myers, David W.; Adams, George B., II
1988-01-01
A suite of benchmarking routines testing communication, basic arithmetic operations, and selected kernel algorithms written in LISP and PARIS was developed for the CM-2. Experiment runs are automated via a software framework that sequences individual tests, allowing for unattended overnight operation. Multiple measurements are made and treated statistically to generate well-characterized results from the noisy values given by cm:time. The results obtained provide a comparison with similar, but less extensive, testing done on a CM-1. Tests were chosen to aid the algorithmist in constructing fast, efficient, and correct code on the CM-2, as well as gain insight into what performance criteria are needed when evaluating parallel processing machines.
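The statistical treatment of repeated, noisy timing measurements can be sketched generically as follows (in Python, with a stand-in arithmetic kernel); this is not the original LISP/PARIS benchmark framework, only an illustration of the measure-repeat-summarize pattern it automates.

```python
import time
import statistics

def benchmark(fn, repeats=30):
    """Time fn() repeatedly and summarize the noisy measurements statistically,
    in the spirit of the CM-2 test framework (a generic sketch, not the
    original LISP/PARIS code)."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return {
        "mean": statistics.mean(samples),
        "stdev": statistics.stdev(samples),
        "min": min(samples),
        "max": max(samples),
    }

# Example: time a stand-in arithmetic kernel.
print(benchmark(lambda: sum(i * i for i in range(100_000))))
```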
Status of GDL - GNU Data Language
NASA Astrophysics Data System (ADS)
Coulais, A.; Schellens, M.; Gales, J.; Arabas, S.; Boquien, M.; Chanial, P.; Messmer, P.; Fillmore, D.; Poplawski, O.; Maret, S.; Marchal, G.; Galmiche, N.; Mermet, T.
2010-12-01
GNU Data Language (GDL) is an open-source interpreted language aimed at numerical data analysis and visualisation. It is a free implementation of the Interactive Data Language (IDL) widely used in astronomy. GDL has full syntax compatibility with IDL and includes a large set of library routines targeting advanced matrix manipulation, plotting, time-series and image analysis, mapping, and data input/output, including numerous scientific data formats. We will present the current status of the project, its key accomplishments, and its weaknesses - areas where contributions are welcome!
Bilingual Babel: Cuneiform Texts in Two or More Languages from Ancient Mesopotamia and Beyond.
ERIC Educational Resources Information Center
Cooper, Jerrold
1993-01-01
Discusses bilingualism in written cuneiform texts from ancient Babylonia and Sumeria. Describes the development of formats and techniques that enabled two or more languages on a single document to coexist harmoniously and productively. (SR)
2012-09-01
When creating version 1, it was necessary to enter raw Hypertext Markup Language (HTML) tags for some interface elements using a third-party commercial software component. Authors create procedures using the Procedure Editor, and users then run those procedures; each step presents its instructions to the user as formatted text and graphics specified using HTML.
"A Seed Blessed by the Lord": The Role of Religious References in the Creation of Modern Hebrew
ERIC Educational Resources Information Center
Or, Iair G.
2016-01-01
The nativization of Modern Hebrew at the end of the nineteenth century and the beginning of the twentieth is one of the most commonly cited examples of language planning and (possibly) revival. The Hebrew Language Committee, which was the main body responsible for Hebrew language planning in the formative years 1890-1953, held numerous discussions…
ERIC Educational Resources Information Center
Vaattovaara, Johanna
2017-01-01
This paper presents a case example of a University Pedagogy course module carried out in ALMS (Autonomous Learning Modules) format. The participants of the course were mainly in-service language teachers of the University of Helsinki Language Centre, and the author of this report is a module instructor and counsellor. The motivation for the ALMS…
ERIC Educational Resources Information Center
Montrul, Silvina; de la Fuente, Israel; Davidson, Justin; Foote, Rebecca
2013-01-01
This study examined whether type of early language experience provides advantages to heritage speakers over second language (L2) learners with morphology, and investigated knowledge of gender agreement and its interaction with diminutive formation. Diminutives are a hallmark of Child Directed Speech in early language development and a highly…
NASA Astrophysics Data System (ADS)
Aldalur, Itziar; Zhang, Heng; Piszcz, Michał; Oteo, Uxue; Rodriguez-Martinez, Lide M.; Shanmukaraj, Devaraj; Rojo, Teofilo; Armand, Michel
2017-04-01
We report a simple synthesis route towards a new type of comb polymer material based on polyether amine oligomer side chains (i.e., Jeffamine® compounds) and a poly(ethylene-alt-maleic anhydride) backbone. The reaction proceeds by imide ring formation through the NH2 group, allowing for attachment of the side chains. By taking advantage of the high configurational freedom and flexibility of propylene oxide/ethylene oxide units (PO/EO) in Jeffamine® compounds, novel polymer matrices were obtained with good elastomeric properties. Fully amorphous solid polymer electrolytes (SPEs) based on lithium bis(trifluoromethanesulfonyl)imide (LiTFSI) and Jeffamine®-based polymer matrices show low glass transition temperatures around -40 °C, high ionic conductivities and good electrochemical stabilities. The ionic conductivities of Jeffamine-based SPEs (5.3 × 10⁻⁴ S cm⁻¹ at 70 °C and 4.5 × 10⁻⁵ S cm⁻¹ at room temperature) are higher than those of conventional SPEs comprising LiTFSI and linear poly(ethylene oxide) (PEO), due to the amorphous nature and the high concentration of mobile end-groups of the Jeffamine-based polymer matrices rather than the semi-crystalline PEO. The feasibility of Jeffamine-based compounds in lithium metal batteries is further demonstrated by the implementation of a Jeffamine®-based polymer as a binder for cathode materials and the stable cycling of Li|SPE|LiFePO4 and Li|SPE|S cells using Jeffamine-based SPEs.
ERIC Educational Resources Information Center
Gazan, Rich
2000-01-01
Surveys the current state of Extensible Markup Language (XML), a metalanguage for creating structured documents that describe their own content, and its implications for information professionals. Predicts that XML will become the common language underlying Web, word processing, and database formats. Also discusses Extensible Stylesheet Language…
IOS: PDP 11/45 formatted input/output task stacker and processor [In MACRO-11]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koschik, J.
1974-07-08
IOS allows the programmer to perform formatted input/output at the assembly language level to/from any peripheral device. It runs under DOS versions V8-O8 or V9-19, reading and writing DOS-compatible files. Additionally, IOS will run, with total transparency, in an environment with memory management enabled. Minimum hardware required is a 16K PDP 11/45, Keyboard Device, DISK (DK, DF, or DC), and Line Frequency Clock. The source language is MACRO-11 (3.3K Decimal Words).
Pick_sw: a program for interactive picking of S-wave data, version 2.00
Ellefsen, Karl J.
2002-01-01
Program pick_sw is used to interactively pick travel times from S-wave data. It is assumed that the data are collected using 2 shots of opposite polarity at each shot location. The traces must be in either the SEG-2 format or the SU format. The program is written in the IDL and C programming languages, and the program is executed under the Windows operating system. (The program may also execute under other operating systems like UNIX if the C language functions are re-compiled).
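For illustration, the following Python sketch shows one crude, automatic way to pick a travel time from a single trace (the first sample exceeding an amplitude threshold), assuming the trace has already been read from its SEG-2 or SU file into an array. It is a hypothetical stand-in, not the interactive picking procedure implemented in pick_sw.

```python
import numpy as np

def pick_travel_time(trace, dt, threshold_ratio=0.2):
    """Return a crude S-wave travel-time pick: the time of the first sample whose
    absolute amplitude exceeds a fraction of the trace maximum. This is only an
    illustrative stand-in for interactive picking; pick_sw itself relies on the
    analyst's judgement and reads SEG-2/SU files directly."""
    trace = np.asarray(trace, dtype=float)
    threshold = threshold_ratio * np.abs(trace).max()
    idx = np.argmax(np.abs(trace) > threshold)   # first index above threshold
    return idx * dt                              # convert sample index to seconds

# Example with a synthetic trace sampled every 1 ms.
t = np.arange(0, 0.5, 0.001)
trace = np.where(t > 0.12, np.sin(2 * np.pi * 30 * (t - 0.12)), 0.0)
print(pick_travel_time(trace, dt=0.001))  # approximately 0.12 s
```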
Swat, M J; Moodie, S; Wimalaratne, S M; Kristensen, N R; Lavielle, M; Mari, A; Magni, P; Smith, M K; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, A C; Kaye, R; Keizer, R; Kloft, C; Kok, J N; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, H B; Parra-Guillen, Z P; Plan, E; Ribba, B; Smith, G; Trocóniz, I F; Yvon, F; Milligan, P A; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N
2015-06-01
The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps.
Swat, MJ; Moodie, S; Wimalaratne, SM; Kristensen, NR; Lavielle, M; Mari, A; Magni, P; Smith, MK; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, AC; Kaye, R; Keizer, R; Kloft, C; Kok, JN; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, HB; Parra-Guillen, ZP; Plan, E; Ribba, B; Smith, G; Trocóniz, IF; Yvon, F; Milligan, PA; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N
2015-01-01
The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps. PMID:26225259
NASA Astrophysics Data System (ADS)
Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.
2014-12-01
IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, live collaboration on notebooks via Google Docs, and better support for languages other than Python.
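The notebook document format described here can be produced programmatically; the sketch below uses the nbformat Python package to assemble a small notebook with a narrative cell and a code cell and write it to disk, after which a tool such as nbconvert can render it to HTML or PDF. The cell contents and file name are illustrative.

```python
import nbformat
from nbformat.v4 import new_notebook, new_markdown_cell, new_code_cell

# Build a minimal notebook document programmatically and write it to disk.
# (Illustrative sketch; the notebook format itself is JSON and language-agnostic.)
nb = new_notebook()
nb.cells = [
    new_markdown_cell("# Example analysis\nA narrative cell with text and math: $E = mc^2$."),
    new_code_cell("print('code and results live side by side')"),
]
nbformat.write(nb, "example.ipynb")

# The saved notebook can then be converted to other formats, e.g. with nbconvert:
#   jupyter nbconvert --to html example.ipynb
```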
Individual language experience modulates rapid formation of cortical memory circuits for novel words
Kimppa, Lilli; Kujala, Teija; Shtyrov, Yury
2016-01-01
Mastering multiple languages is an increasingly important ability in the modern world; furthermore, multilingualism may affect human learning abilities. Here, we test how the brain's capacity to rapidly form new representations for spoken words is affected by prior individual experience in non-native language acquisition. Formation of new word memory traces is reflected in a neurophysiological response increase during a short exposure to novel lexicon. Therefore, we recorded changes in electrophysiological responses to phonologically native and non-native novel word-forms during a perceptual learning session, in which novel stimuli were repetitively presented to healthy adults in either ignore or attend conditions. We found that a larger number of previously acquired languages and an earlier average age of acquisition (AoA) predicted a greater response increase to novel non-native word-forms. This suggests that early and extensive language experience is associated with greater neural flexibility for acquiring novel words with unfamiliar phonology. Conversely, later AoA was associated with a stronger response increase for phonologically native novel word-forms, indicating better tuning of neural linguistic circuits to native phonology. The results suggest that individual language experience has a strong effect on the neural mechanisms of word learning, and that it interacts with the phonological familiarity of the novel lexicon. PMID:27444206
NASA Technical Reports Server (NTRS)
2001-01-01
In this contract, which is a component of a larger contract that we plan to submit in the coming months, we plan to study the preprocessing issues which arise in applying natural language processing techniques to NASA-KSC problem reports. The goals of this work will be to deal with the issues of: a) automatically obtaining the problem reports from NASA-KSC databases, b) the format of these reports, and c) the conversion of these reports to a format that will be adequate for our natural language software. At the end of this contract, we expect that these problems will be solved and that we will be ready to apply our natural language software to a text database of over 1000 KSC problem reports.
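A preprocessing step of the kind described, converting raw problem-report text into a form suitable for natural language software, might look like the following Python sketch. The specific cleaning rules (header stripping, whitespace collapsing, lowercasing) and the sample report are assumptions for illustration, not the project's actual pipeline.

```python
import re

def normalize_report(raw_text):
    """Convert one raw problem report into a plain-text form suitable for
    downstream natural language processing. The cleaning steps here (whitespace
    collapsing, stripping a report-number header) are illustrative assumptions,
    not the project's actual preprocessing rules."""
    text = raw_text.replace("\r\n", "\n")
    text = re.sub(r"^\s*REPORT\s*NO[.:].*$", "", text, flags=re.MULTILINE)  # hypothetical header
    text = re.sub(r"\s+", " ", text)          # collapse runs of whitespace
    return text.strip().lower()

print(normalize_report("REPORT NO: 12345\nValve   leak observed\r\nduring  test."))
```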
Integrating Reading and the English-Language Arts in the Geography Curriculum.
ERIC Educational Resources Information Center
Rushdoony, Haig A.
Suggested activities for integrating language concepts and comprehension skills into elementary school geography instruction are presented. The activities focus on concept formation through semantic mapping and making analogies, and on comprehension through recalling, generalizing, interpreting, and making inferences. Semantic maps indicate spoke…
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
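To illustrate the rule-based paradigm that CLIPS supports, in which rules pattern-match against facts and their actions assert new facts, here is a toy forward-chaining loop in Python. It is a conceptual sketch only; it is not CLIPS syntax, and it does not reproduce CLIPS's actual pattern-matching engine.

```python
# Toy forward-chaining illustration of the rule-based paradigm CLIPS supports:
# rules pattern-match against a set of facts and assert new facts when they fire.
# This is a conceptual sketch in Python, not CLIPS syntax or its matching engine.

facts = {("duck", "daffy"), ("sound", "daffy", "quack")}

def rule_identify(facts):
    """IF an animal is a duck AND it quacks THEN conclude it is noisy."""
    new = set()
    for f in facts:
        if f[0] == "duck":
            animal = f[1]
            if ("sound", animal, "quack") in facts:
                new.add(("noisy", animal))
    return new

# Fire the rule repeatedly until no new facts are produced (a crude agenda loop).
changed = True
while changed:
    inferred = rule_identify(facts)
    changed = not inferred.issubset(facts)
    facts |= inferred

print(facts)
```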
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)
NASA Technical Reports Server (NTRS)
Riley, G.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)
NASA Technical Reports Server (NTRS)
Donnell, B.
1994-01-01
CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
Language Use and Coalition Formation in Multiparty Negotiations.
Sagi, Eyal; Diermeier, Daniel
2017-01-01
The alignment of bargaining positions is crucial to a successful negotiation. Prior research has shown that similarity in language use is indicative of the conceptual alignment of interlocutors. We use latent semantic analysis to explore how the similarity of language use between negotiating parties develops over the course of a three-party negotiation. Results show that parties that reach an agreement show a gradual increase in language similarity over the course of the negotiation. Furthermore, reaching the most financially efficient outcome is dependent on similarity in language use between the parties that have the most to gain from such an outcome. Copyright © 2015 Cognitive Science Society, Inc.
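The general latent semantic analysis pipeline used to compare language use, vectorizing text, reducing it to a latent semantic space, and measuring cosine similarity, can be sketched in Python as below. The utterances and the number of latent components are invented for illustration and do not reproduce the study's corpus or parameters.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Generic latent-semantic-analysis pipeline for comparing two parties' language use.
# The utterances and number of components are invented for illustration.
party_a = ["we should split the surplus according to contribution",
           "our contribution justifies a larger share of the surplus"]
party_b = ["a fair split should reflect what each side contributed",
           "let us divide the gains in proportion to contribution"]

docs = party_a + party_b
tfidf = TfidfVectorizer().fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Average each party's utterance vectors in the latent space and compare them.
sim = cosine_similarity(lsa[:2].mean(axis=0, keepdims=True),
                        lsa[2:].mean(axis=0, keepdims=True))
print("language similarity:", float(sim[0, 0]))
```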
SSL: A software specification language
NASA Technical Reports Server (NTRS)
Austin, S. L.; Buckles, B. P.; Ryan, J. P.
1976-01-01
SSL (Software Specification Language) is a new formalism for the definition of specifications for software systems. The language provides a linear format for the representation of the information normally displayed in a two-dimensional module inter-dependency diagram. In comparing SSL to FORTRAN or ALGOL, it is found to be largely complementary to the algorithmic (procedural) languages. SSL is capable of representing explicitly module interconnections and global data flow, information which is deeply imbedded in the algorithmic languages. On the other hand, SSL is not designed to depict the control flow within modules. The SSL level of software design explicitly depicts intermodule data flow as a functional specification.
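As a loose illustration of the kind of information SSL records, module interconnections and global data flow, the following Python sketch represents modules with the data items they read and write and derives the resulting inter-module data flow. The dictionary notation and module names are purely illustrative; this is not SSL syntax.

```python
# A linear (textual) representation of module interconnections and global data flow,
# in the spirit of what SSL captures; this dictionary notation is purely illustrative
# and is not SSL syntax.
modules = {
    "read_input":   {"calls": ["parse_record"], "reads": ["raw_file"], "writes": ["records"]},
    "parse_record": {"calls": [],               "reads": ["records"],  "writes": ["fields"]},
    "report":       {"calls": ["parse_record"], "reads": ["fields"],   "writes": ["summary"]},
}

# Derive which global data items flow from one module to another.
for name, spec in modules.items():
    for other, other_spec in modules.items():
        shared = set(spec["writes"]) & set(other_spec["reads"])
        if shared and name != other:
            print(f"{name} -> {other}: {sorted(shared)}")
```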
ERIC Educational Resources Information Center
VanLengen, Craig Alan
2010-01-01
The Securities and Exchange Commission (SEC) has recently announced a proposal that will require all public companies to report their financial data in Extensible Business Reporting Language (XBRL). XBRL is an extension of Extensible Markup Language (XML). Moving to a standard reporting format makes it easier for organizations to report the…
Teaching Square Roots: Conceptual Complexity in Mathematics Language
ERIC Educational Resources Information Center
Gough, John
2007-01-01
Mathematics is an "artificial" deliberately constructed language, supported crucially by: (1) special alpha-numeric characters and usages; (2) extra-special non-alphanumeric symbols; (3) special written formats within a single line, such as superscripts and subscripts; (4) grouping along a line, including bracketing using round brackets,…
24 CFR Appendix Ms-1 to Part 3500 - Appendix MS-1 to Part 3500
Code of Federal Regulations, 2010 CFR
2010-04-01
... Part 3500 [Sample language; use business stationery or similar heading] [Date] SERVICING DISCLOSURE... “Servicing Transfer Information.” The model format may be annotated with further information that clarifies or enhances the model language.] [73 FR 68259, Nov. 17, 2008] ...
Motivations and Concerns: Voices from Pre-Service Language Teachers
ERIC Educational Resources Information Center
Kavanoz, Suzan; Yüksel, Hatice G.
2017-01-01
Contemporary interactionist theories conceive identity formation as a dynamic process that is continuously co-constructed within a social context. For pre-service language teachers, teacher education programs constitute the context in which their professional identities are formed. This cross-sectional qualitative study aims at exploring…
WaterML: an XML Language for Communicating Water Observations Data
NASA Astrophysics Data System (ADS)
Maidment, D. R.; Zaslavsky, I.; Valentine, D.
2007-12-01
One of the great impediments to the synthesis of water information is the plethora of formats used to publish such data. Each water agency uses its own approach. XML (eXtensible Markup Language) languages are generalizations of Hypertext Markup Language that communicate specific kinds of information via the internet. WaterML is an XML language for water observations data - streamflow, water quality, groundwater levels, climate, precipitation and aquatic biology data, recorded at fixed point locations as a function of time. The Hydrologic Information System project of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has defined WaterML and prepared a set of web service functions called WaterOneFlow that use WaterML to provide information about observation sites, the variables measured there, and the values of those measurements. WaterML has been submitted to the Open GIS Consortium for harmonization with its standards for XML languages. Academic investigators at a number of testbed locations in the WATERS network are providing data in WaterML format using WaterOneFlow web services. The USGS and other federal agencies are also working with CUAHSI to similarly provide access to their data in WaterML through WaterOneFlow services.
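A toy example of publishing a small observations time series as XML is sketched below in Python. The element and attribute names are invented for illustration and are not the actual WaterML schema.

```python
import xml.etree.ElementTree as ET

# Sketch of publishing a small time series of water observations as XML.
# The element names below are illustrative only and do not reproduce the
# actual WaterML schema.
series = ET.Element("timeSeries", siteCode="EXAMPLE-01", variable="streamflow", unit="cfs")
for timestamp, value in [("2007-10-01T00:00", "12.4"), ("2007-10-01T01:00", "12.9")]:
    ET.SubElement(series, "value", dateTime=timestamp).text = value

print(ET.tostring(series, encoding="unicode"))
```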
OCCULT-ORSER complete conversational user-language translator
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.; Young, K.
1981-01-01
Translator program (OCCULT) assists non-computer-oriented users in setting up and submitting jobs for complex ORSER system. ORSER is collection of image processing programs for analyzing remotely sensed data. OCCULT is designed for those who would like to use ORSER but cannot justify acquiring and maintaining necessary proficiency in Remote Job Entry Language, Job Control Language, and control-card formats. OCCULT is written in FORTRAN IV and OS Assembler for interactive execution.
Valero, Germán; Cárdenas, Paula
The Faculty of Veterinary Medicine and Animal Science of the National Autonomous University of Mexico (UNAM) uses the Moodle learning management system for formative and summative computer assessment. The authors of this article-the teacher primarily responsible for Moodle implementation and a researcher who is a recent Moodle adopter-describe and discuss the students' and teachers' attitudes to summative and formative computer assessment in Moodle. Item analysis of quiz results helped us to identify and fix poorly performing questions, which greatly reduced student complaints and improved objective assessment. The use of certainty-based marking (CBM) in formative assessment in veterinary pathology was well received by the students and should be extended to more courses. The importance of having proficient computer support personnel should not be underestimated. A properly translated language pack is essential for the use of Moodle in a language other than English.
Concept Formation Skills in Long-Term Cochlear Implant Users
ERIC Educational Resources Information Center
Castellanos, Irina; Kronenberger, William G.; Beer, Jessica; Colson, Bethany G.; Henning, Shirley C.; Ditmars, Allison; Pisoni, David B.
2015-01-01
This study investigated if a period of auditory sensory deprivation followed by degraded auditory input and related language delays affects visual concept formation skills in long-term prelingually deaf cochlear implant (CI) users. We also examined if concept formation skills are mediated or moderated by other neurocognitive domains (i.e.,…
ERIC Educational Resources Information Center
Hsiao, Cheng-hua
2018-01-01
Teacher identity has been an important issue in teacher education because teacher identity influences teachers' professional development. However, little has been explored in preservice teachers' identity formation within the EFL context of language teaching. In this study, the early influence on EFL student teachers' identity formation in…
Why is number word learning hard? Evidence from bilingual learners.
Wagner, Katie; Kimura, Katherine; Cheung, Pierina; Barner, David
2015-12-01
Young children typically take between 18 months and 2 years to learn the meanings of number words. In the present study, we investigated this developmental trajectory in bilingual preschoolers to examine the relative contributions of two factors in number word learning: (1) the construction of numerical concepts, and (2) the mapping of language specific words onto these concepts. We found that children learn the meanings of small number words (i.e., one, two, and three) independently in each language, indicating that observed delays in learning these words are attributable to difficulties in mapping words to concepts. In contrast, children generally learned to accurately count larger sets (i.e., five or greater) simultaneously in their two languages, suggesting that the difficulty in learning to count is not tied to a specific language. We also replicated previous studies that found that children learn the counting procedure before they learn its logic - i.e., that for any natural number, n, the successor of n in the count list denotes the cardinality n+1. Consistent with past studies, we found that children's knowledge of successors is first acquired incrementally. In bilinguals, we found that this knowledge exhibits item-specific transfer between languages, suggesting that the logic of the positive integers may not be stored in a language-specific format. We conclude that delays in learning the meanings of small number words are mainly due to language-specific processes of mapping words to concepts, whereas the logic and procedures of counting appear to be learned in a format that is independent of a particular language and thus transfers rapidly from one language to the other in development. Copyright © 2015. Published by Elsevier Inc.
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi
2015-04-01
Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
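To make the idea of machine-readable dynamics data concrete, the following is a minimal sketch of how one might read object coordinates from a BDML-style XML file with Python's standard library. The element and attribute names used here are illustrative placeholders, not the official BDML 0.2 schema; the actual specification and schema file are available at http://ssbd.qbic.riken.jp/bdml/.

```python
# Hypothetical sketch: reading object coordinates from a BDML-style XML file.
# Element and attribute names below are illustrative placeholders, not the
# official BDML 0.2 schema; consult http://ssbd.qbic.riken.jp/bdml/ for the
# actual specification and schema file.
import xml.etree.ElementTree as ET

def load_objects(bdml_path):
    """Return a list of (object_id, t, x, y, z) tuples from a BDML-like file."""
    tree = ET.parse(bdml_path)
    root = tree.getroot()
    records = []
    for obj in root.iter("object"):           # placeholder element name
        oid = obj.get("id", "unknown")
        for coord in obj.iter("coordinate"):  # placeholder element name
            records.append((
                oid,
                float(coord.get("t", "0")),
                float(coord.get("x", "0")),
                float(coord.get("y", "0")),
                float(coord.get("z", "0")),
            ))
    return records

if __name__ == "__main__":
    for row in load_objects("example.bdml")[:5]:
        print(row)
```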
Past participle formation in specific language impairment.
Kauschke, Christina; Renner, Lena F; Domahs, Ulrike
2017-03-01
German participles are formed by a co-occurrence of prefixation and suffixation. While the acquisition of regular and irregular suffixation has been investigated exhaustively, it is still unclear how German children master the prosodically determined prefixation rule (prefix ge-). Findings reported in the literature are inconsistent on this point. In particular, it is unclear whether participle formation is vulnerable in German children with specific language impairment (SLI). To compare children with and without SLI in their abilities to form German participles correctly, and to determine their relative sensitivities to the morphophonological regularities of prefixation. The performance of 14 German-speaking children with SLI (mean age = 7;5) in a participle formation task was compared with that of age-matched and younger typically developing controls. The materials included 60 regular verbs and 20 pseudo-verbs, half of them requiring the prefix ge-. Overall, children with SLI performed poorly compared with both groups of typically developing children. Children with SLI tended either to avoid participle markings or choose inappropriate affixes. However, while such children showed marked impairment at the morphological level, they were generally successful in applying the morphoprosodic rules governing prefixation. In contrast to earlier findings, the present results demonstrate that regular participle formation is problematic for German children with SLI. © 2016 Royal College of Speech and Language Therapists.
Robot computer problem solving system
NASA Technical Reports Server (NTRS)
Merriam, E. W.; Becker, J. D.
1973-01-01
A robot computer problem solving system which represents a robot exploration vehicle in a simulated Mars environment is described. The model exhibits changes and improvements made on a previously designed robot in a city environment. The Martian environment is modeled in Cartesian coordinates; objects are scattered about a plane; arbitrary restrictions on the robot's vision have been removed; and the robot's path contains arbitrary curves. New environmental features, particularly the visual occlusion of objects by other objects, were added to the model. Two different algorithms were developed for computing occlusion. Movement and vision capabilities of the robot were established in the Mars environment, using a LISP/FORTRAN interface for computational efficiency. The graphical display program was redesigned to reflect the change to the Mars-like environment.
An Examination of Target Tracking in the Antisubmarine Warfare Systems Evaluation Tool (ASSET)
1991-09-01
Kate's Model Verification Tools
NASA Technical Reports Server (NTRS)
Morgan, Steve
1991-01-01
Kennedy Space Center's Knowledge-based Autonomous Test Engineer (KATE) is capable of monitoring electromechanical systems, diagnosing their errors, and even repairing them when they crash. A survey of KATE's developer/modelers revealed that they were already using a sophisticated set of productivity enhancing tools. They did request five more, however, and those make up the body of the information presented here: (1) a transfer function code fitter; (2) a FORTRAN-Lisp translator; (3) three existing structural consistency checkers to aid in syntax checking their modeled device frames; (4) an automated procedure for calibrating knowledge base admittances to protect KATE's hardware mockups from inadvertent hand valve twiddling; and (5) three alternatives for the 'pseudo object', a programming patch that currently apprises KATE's modeling devices of their operational environments.
NASA Technical Reports Server (NTRS)
Sharma, Naveen
1992-01-01
In this paper we briefly describe a combined symbolic and numeric approach for solving mathematical models on parallel computers. An experimental software system, PIER, is being developed in Common Lisp to synthesize computationally intensive and domain formulation dependent phases of finite element analysis (FEA) solution methods. Quantities for domain formulation like shape functions, element stiffness matrices, etc., are automatically derived using symbolic mathematical computations. The problem specific information and derived formulae are then used to generate (parallel) numerical code for FEA solution steps. A constructive approach to specify a numerical program design is taken. The code generator compiles application oriented input specifications into (parallel) FORTRAN77 routines with the help of built-in knowledge of the particular problem, numerical solution methods and the target computer.
Effects of Acoustic Variability on Second Language Vocabulary Learning
ERIC Educational Resources Information Center
Barcroft, Joe; Sommers, Mitchell S.
2005-01-01
This study examined the effects of acoustic variability on second language vocabulary learning. English native speakers learned new words in Spanish. Exposure frequency to the words was constant. Dependent measures were accuracy and latency of picture-to-Spanish and Spanish-to-English recall. Experiment 1 compared presentation formats of neutral…
On Present State of Teaching Russian Language in Russia
ERIC Educational Resources Information Center
Tekucheva, Irina V.; Gromova, Liliya Y.
2016-01-01
The article discusses the current state of teaching Russian language, discovers the nature of philological education, outlines the main problems of the implementation of the standard in school practice, analyzes the problems of formation of universal educational actions within the context of the implementation of cognitive-communicative approach,…
Evaluating Rural Preschool Speech-Language Services: Consumer Satisfaction.
ERIC Educational Resources Information Center
Grela, Bernard G.; Illerbrun, David
1998-01-01
A survey evaluated the satisfaction of 79 parents with the delivery of preschool speech-language services in a rural region of Canada. While parents were generally supportive of the services, they were less supportive of service convenience, parent support, and overall parent satisfaction. Intervention format received the lowest parent rating.…
Test Specifications and Blueprints: Reality and Expectations
ERIC Educational Resources Information Center
AlFallay, Ibrahim S.
2018-01-01
This study investigates to what extent teachers of English as a school subject (ESS) in Saudi schools follow recommendations and guidelines suggested by language testing specialists in developing tables of specifications and preparing blueprints for their formative and summative language tests. To answer the study questions, a thirteen-statement…
ERIC Educational Resources Information Center
Magahay-Johnson, Wendy
1985-01-01
Describes procedures for designing trivia games to be used in teaching English as a second language. The students participate in designing the games, thereby gaining practice in the four basic language skills and the formation of yes-no questions, information questions, and statements. Provides examples for young intermediate ESL students. (SED)
Motivating Learners at South Korean Universities
ERIC Educational Resources Information Center
Niederhauser, Janet S.
2012-01-01
Students at many universities often fail to reach their full potential as English language learners due to low motivation. Some of the factors that affect their motivation relate to the country's education system in general. Others reflect institutional and cultural views of language learning in particular. Using a problem-solution format, this…
Limited Aspects of Reality: Frames of Reference in Language Assessment
ERIC Educational Resources Information Center
Fulcher, Glenn; Svalberg, Agneta
2013-01-01
Language testers operate within two frames of reference: norm-referenced (NRT) and criterion-referenced testing (CRT). The former underpins the world of large-scale standardized testing that prioritizes variability and comparison. The latter supports substantive score meaning in formative and domain specific assessment. Some claim that the…
A New KE-Free Online ICALL System Featuring Error Contingent Feedback
ERIC Educational Resources Information Center
Tokuda, Naoyuki; Chen, Liang
2004-01-01
As a first step towards implementing a human language teacher, we have developed a new template-based on-line ICALL (intelligent computer assisted language learning) system capable of automatically diagnosing learners' free-format translated inputs and returning error contingent feedback. The system architecture we have adopted allows language…
Academic Preparation for International Pre-MBA's in Marketing.
ERIC Educational Resources Information Center
Westerfield, Kay
Adjustments to the case study approach are recommended to address three major areas of difficulty for foreign students in master's-level marketing education programs: (1) language-related problems; (2) unfamiliar class format and methodology; and (3) lack of cultural background knowledge. For language-related problems, case studies are a good…
Net-centric ACT-R-Based Cognitive Architecture with DEVS Unified Process
2011-04-01
effort has been spent in analyzing various forms of requirement specifications, viz., state-based, Natural Language-based, UML-based, Rule-based, BPMN ... requirement specifications in one of the chosen formats such as BPMN, DoDAF, Natural Language Processing (NLP)-based, UML-based, DSL or simply
ERIC Educational Resources Information Center
Feuer, Avital
2011-01-01
This study examined the effects of a collaborative creative writing project on identity formation and overall language proficiency development among advanced Hebrew students. In an exercise called "The Zoning Committee", college students created the fictional Israeli-American town of Beit Shemesh, located in northern Michigan.…
Reconnecting Proficiency, Literacy, and Culture: From Theory to Practice
ERIC Educational Resources Information Center
Warford, Mark K.; White, William L.
2012-01-01
What does it mean to capably communicate across languages? This article introduces two theoretical models and a lesson plan format designed to facilitate the integration of proficiency, literacy, and culture teaching in foreign language teaching. The Second Symbolic Competencies Model configures proficiency and literacy as subordinate clusters of…
Open access for the non-English-speaking world: overcoming the language barrier
Fung, Isaac CH
2008-01-01
This editorial highlights the problem of the language barrier in scientific communication in spite of the recent success of the Open Access Movement. Four options for English-language journals to overcome the language barrier are suggested: 1) abstracts in alternative languages provided by authors, 2) Wiki open translation, 3) an international board of translator-editors, and 4) an alternative language version of the journal. The Emerging Themes in Epidemiology announces that, with immediate effect, it will accept translations of abstracts or full texts by authors as Additional files. Editorial note: In an effort towards overcoming the language barrier in scientific publication, ETE will accept translations of abstracts or the full text of published articles. Each translation should be submitted separately as an Additional File in PDF format. ETE will only peer review English-language versions. Therefore, translations will not be scrutinized in the review process and the responsibility for accurate translation rests with the authors. PMID:18173854
ERIC Educational Resources Information Center
Pablo, Irasema Mora; Rivas, Leonardo Arturo Rivas; Lengeling, M. Martha; Crawford, Troy
2015-01-01
The objective of this research was to explore the effects of language brokering upon identity formation within the family unit of students who have lived in the United States for a period of time and have come back to live in Mexico. The participants are six students that are currently undertaking a BA in TESOL (Teaching of English to Speakers of…
cluML: A markup language for clustering and cluster validity assessment of microarray data.
Bolshakova, Nadia; Cunningham, Pádraig
2005-01-01
cluML is a new markup language for microarray data clustering and cluster validity assessment. The XML-based format has been designed to address some of the limitations observed in traditional formats, such as the inability to store multiple clustering results (including biclustering) and validation results within a dataset. cluML is an effective tool to support biomedical knowledge representation in gene expression data analysis. Although cluML was developed for DNA microarray analysis applications, it can also be used effectively for representing clustering and validation results for other biomedical and physical data.
ArdenML: The Arden Syntax Markup Language (or Arden Syntax: It's Not Just Text Any More!)
Sailors, R. Matthew
2001-01-01
It is no longer necessary to think of Arden Syntax as simply a text-based knowledge base format. The development of ArdenML (the Arden Syntax Markup Language), an XML-based markup language, allows structured access to most of the maintenance and library categories without the need to write or buy a compiler, and may lead to the development of simple commercial and freeware tools for processing Arden Syntax Medical Logic Modules (MLMs).
ERIC Educational Resources Information Center
Chapman, Jean
The first of five handbooks developed by Project HAPI (Handicapped Achievement Program Improvement), a multimedia staff development program to help teachers and specialists write effective individualized education programs (IEPs), is in looseleaf workbook format and focuses on children with severe disorders of language, including aphasia and other…
KSC Space Station Operations Language (SSOL)
NASA Technical Reports Server (NTRS)
1985-01-01
The Space Station Operations Language (SSOL) will serve a large community of diverse users dealing with the integration and checkout of Space Station modules. Kennedy Space Center's plan to achieve Level A specification of the SSOL system, encompassing both its language and its automated support environment, is presented in the format of a briefing. The SSOL concept is a collection of fundamental elements that span languages, operating systems, software development, software tools and several user classes. The approach outlines a thorough process that combines the benefits of rapid prototyping with a coordinated requirements gathering effort, yielding a Level A specification of the SSOL requirements.
TransNewGuinea.org: An Online Database of New Guinea Languages.
Greenhill, Simon J
2015-01-01
The island of New Guinea has the world's highest linguistic diversity, with more than 900 languages divided into at least 23 distinct language families. This diversity includes the world's third largest language family: Trans-New Guinea. However, the region is one of the world's least well studied, and primary data is scattered across a wide range of publications and more often than not hidden in unpublished "gray" literature. The lack of primary research data on the New Guinea languages has been a major impediment to our understanding of these languages, and the history of the peoples in New Guinea. TransNewGuinea.org aims to collect data about these languages and place them online in a consistent format. This database will enable future research into the New Guinea languages with both traditional comparative linguistic methods and novel cutting-edge computational techniques. The long-term aim is to shed light into the prehistory of the peoples of New Guinea, and to understand why there is such major diversity in their languages.
Square One TV, Curriculum Connections Teacher's Guide.
ERIC Educational Resources Information Center
Children's Television Workshop, New York, NY.
This cross curriculum guide links mathematics, language arts, and social studies. The guide is divided into two sections. The first section provides a series of language arts activities and the second social studies activities. Within these two curriculum areas, the activities provided are based on three Square One TV formats: (1) Mathnet, the…
Effects of Bilingual Tact Instruction for a Child with Communication Disorder
ERIC Educational Resources Information Center
León, Alberto L.; Rosales, Rocío
2018-01-01
We evaluated the effects of tact training when instruction was presented in English only compared to tact training in a bilingual format (in English and the home language, Portuguese) for a participant diagnosed with a communication disorder. The participant's parents completed a questionnaire describing his exposure to both languages prior to the…
ERIC Educational Resources Information Center
Bergsland, Knut, Comp.
This comprehensive dictionary draws on ethnographic and linguistic work of the Aleut language and culture dating to 1745. An introductory section explains the dictionary's format, offers a brief historical survey, and contains notes on Aleut phonology and orthography, dialectal differences and developments, Eskimo-Aleut phonological…
Improving Formative Assessment in Language Classrooms Using "GradeCam Go!"
ERIC Educational Resources Information Center
Kiliçkaya, Ferit
2017-01-01
This study aimed to determine EFL (English as a Foreign Language) teachers' perceptions and experience regarding their use of "GradeCam Go!" to grade multiple choice tests. The results of the study indicated that the participants overwhelmingly valued "GradeCam Go!" due to its features such as grading printed forms for…
Automated Error Detection for Developing Grammar Proficiency of ESL Learners
ERIC Educational Resources Information Center
Feng, Hui-Hsien; Saricaoglu, Aysel; Chukharev-Hudilainen, Evgeny
2016-01-01
Thanks to natural language processing technologies, computer programs are actively being used not only for holistic scoring, but also for formative evaluation of writing. CyWrite is one such program that is under development. The program is built upon Second Language Acquisition theories and aims to assist ESL learners in higher education by…
Speaking of Sisterhood: A Sociolinguistic Study of an Asian American Sorority
ERIC Educational Resources Information Center
Bauman, Carina
2016-01-01
This dissertation explores language as a resource for the formation and expression of ethnic identity among the members of an Asian American college sorority. As a community of practice organized around ethnicity, the sorority provides an excellent site to examine the mutually constitutive relationship of language and ethnic identity. Two features…
Images, Words, and Narrative Epistemology.
ERIC Educational Resources Information Center
Fleckenstein, Kristie S.
1996-01-01
Reviews work suggesting that imagery and language function in tandem to constitute a sense of being, and that metaphors of sight hold as much formative power as metaphors of word. Describes the limitations of language and the ways in which imagery compensates for that limitation. Discusses narrative epistemology as a fusion of image and…
Raising Multilingual Children: Foreign Language Acquisition and Children.
ERIC Educational Resources Information Center
Tokuhama-Espinosa, Tracey
This book illustrates how children learn foreign languages and when they can do so with the best results. The most recent studies in linguistics, neurology, education, and psychology are evaluated, and the findings are presented in a recipe format. Parents are encouraged to evaluate the multilingual children in their lives with the use of tools…
ERIC Educational Resources Information Center
Lee, Jia-Ying
2015-01-01
This study compares the language learner and test-taking strategies used by Chinese-speaking graduate students when confronted with familiar versus unfamiliar topics in an English multiple-choice format reading comprehension test. Thirty-six participants at a large mid-western university performed three tasks: A content knowledge vocabulary…
Applications of Videodisc Technology to Language Arts, Grades K-12: A Review of the Literature.
ERIC Educational Resources Information Center
Lewis, Martina E.
This monograph traces the history of videodisc technology, describes the videodisc and its functions, reviews classroom applications and limitations, and discusses the future use of videodisc technology in elementary and secondary language arts classes. Two videodisc formats are discussed--constant linear velocity (CLV), and constant angular…
State Education Policy Formation: The Case of Arizona's English Language Learner Legislation
ERIC Educational Resources Information Center
Lawton, Stephen B.
2012-01-01
This historical case study focuses on policy making at the state level by analyzing the development of a new policy for English language learners (ELLs) in Arizona. "New institutionalism" is used as a framework, with political culture and educational regimes acting as environmental factors affecting state policy choices. Key events…
ERIC Educational Resources Information Center
Bruneau, Beverly J.
1997-01-01
Describes the Literacy Pyramid (based on the United States Department of Agriculture food pyramid), a classification of eight instructional events, which is intended as a framework for teachers to think about the purpose of various instructional formats and about organizing time for language arts instruction. (SR)
Affective Dynamics of Leadership: An Experimental Test of Affect Control Theory
ERIC Educational Resources Information Center
Schroder, Tobias; Scholl, Wolfgang
2009-01-01
Affect Control Theory (ACT; Heise 1979, 2007) states that people control social interactions by striving to maintain culturally shared feelings about the situation. The theory is based on mathematical models of language-based impression formation. In a laboratory experiment, we tested the predictive power of a new German-language ACT model with…
Prerequisites for Emotional Intelligence Formation in Second Language Learning and Career Choice
ERIC Educational Resources Information Center
Baklashova, Tatiana A.; Galishnikova, Elena M.; Khafizova, Liliya A.
2016-01-01
The relevance of the topic is due to the enhancing role of emotional intelligence in second language learning. The article aims to substantiate that emotional intelligence (EI) strengthens training quality of future professionals, gives it an emotional color, and thereby increases a variety of intellectual skills. The leading methodical approaches…
Implementing Environmental Culture in the Language Learning Laboratory
ERIC Educational Resources Information Center
Sadykova, Aida G.; Yashina, Marianna E.; Zakirova, Luiza R.
2014-01-01
The article describes experimental work aimed at testing the effectiveness of the pedagogical conditions used in the environmental education of senior pupils by means of a foreign language. It is worth mentioning that during the experiment, the detection of the formation of the environmental education of senior pupils…
A Role for Chunk Formation in Statistical Learning of Second Language Syntax
ERIC Educational Resources Information Center
Hamrick, Phillip
2014-01-01
Humans are remarkably sensitive to the statistical structure of language. However, different mechanisms have been proposed to account for such statistical sensitivities. The present study compared adult learning of syntax and the ability of two models of statistical learning to simulate human performance: Simple Recurrent Networks, which learn by…
ERIC Educational Resources Information Center
Wall, C. Edward; And Others
1995-01-01
Discusses the integration of Standard Generalized Markup Language, Hypertext Markup Language, and MARC format to parse classified analytical bibliographies. Use of the resulting electronic knowledge constructs in local library systems as maps of a specified subset of resources is discussed, and an example is included. (LRW)
Perceived In-Group and Out-Group Stereotypes among Brazilian Foreign Language Students.
ERIC Educational Resources Information Center
El-Dash, Linda Gentry; Busnardo, JoAnne
2001-01-01
Presents the results of a study of stereotypical perceptions of ten foreign populations by 164 Brazilian university students studying diverse foreign languages. Socio-cultural stereotypes were investigated using bipolar adjective scales paired in a Likert-type format. Factor analysis suggested a three-factor system is at work, consisting of…
ERIC Educational Resources Information Center
Vitanova, Gergana
2016-01-01
This article has several interconnected goals. First, it foregrounds the role of narratives and narrative inquiry in the research of second language teaching practices. It illustrates how multimodal narrativity could be used in analyzing the formation of personal and professional identities of several female teachers of English. Specifically, it…
Litterature: Retour au texte (Literature: Return to the Text).
ERIC Educational Resources Information Center
Noe, Alfred
1993-01-01
Choice of texts for use in French language instruction is discussed. It is argued that the text's format (e.g., advertising, figurative poetry, journal article, play, prose, etc.) is instrumental in bringing attention to the language in it, and this has implications for the best uses of different text types. (MSE)
Writing Conferences and Some Applications for the EFL Classroom.
ERIC Educational Resources Information Center
Renner, Christopher
1990-01-01
A teacher of English as a Foreign Language (EFL) to adults in a non-English-speaking country describes use of classroom writing conferences to improve student language use and introduce writing into the communicative syllabus. The approach is based on a conference format and focuses on self-directed inquiry. Students are provided with monolingual…
Knowledge Query Language (KQL)
2016-02-12
Currently, queries for data retrieval from non-Structured Query Language (NoSQL) data stores are tightly coupled to the specific implementation of the data store ... independent of the storage content and format for querying NoSQL or relational data stores. This approach uses address expressions (or A-Expressions
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Summary Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
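As a rough illustration of the declarative, XML-encoded structure described above, here is an abbreviated sketch of an SBML Level 3 Version 1 Core skeleton round-tripped with Python's standard library. The attribute set is intentionally reduced; the full specification defines additional required attributes and validation rules, and in practice a dedicated library such as libSBML is normally used.

```python
# Abbreviated sketch of an SBML Level 3 Version 1 Core skeleton, round-tripped
# with Python's standard library. Real SBML documents carry further mandatory
# attributes and are governed by validation rules defined in the specification.
import xml.etree.ElementTree as ET

SBML_NS = "http://www.sbml.org/sbml/level3/version1/core"
minimal_sbml = f"""<?xml version="1.0" encoding="UTF-8"?>
<sbml xmlns="{SBML_NS}" level="3" version="1">
  <model id="toy_model">
    <listOfCompartments>
      <compartment id="cell" constant="true"/>
    </listOfCompartments>
    <listOfSpecies>
      <species id="S1" compartment="cell" hasOnlySubstanceUnits="false"
               boundaryCondition="false" constant="false"/>
    </listOfSpecies>
  </model>
</sbml>"""

root = ET.fromstring(minimal_sbml)
model = root.find(f"{{{SBML_NS}}}model")
print("model id:", model.get("id"))
for sp in model.iter(f"{{{SBML_NS}}}species"):
    print("species:", sp.get("id"), "in compartment", sp.get("compartment"))
```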
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Plain Language Summary: Adult Sinusitis (Sinus Infection).
Caspersen, Leslie A; Walter, Lindsey M; Walsh, Sandra A; Rosenfeld, Richard M; Piccirillo, Jay F
2015-08-01
This plain language summary serves as an overview in explaining sinusitis (pronounced sign-you-side-tis). The purpose of this plain language summary is to provide patients with standard language explaining their condition in an easy-to-read format. This summary applies to those 18 years of age or older with sinusitis. The summary is presented in an FAQ (frequently asked questions) format. The summary addresses how to manage and treat sinusitis symptoms. Adult sinusitis is often called a sinus infection. A healthcare provider may refer to a sinus infection as rhinosinusitis (pronounced rhi-no-sign-you-side-tis). This includes the nose as well as the sinuses in the name. A sinus infection is the swelling of the sinuses and nasal cavity. The summary is based on the published 2015 "Clinical Practice Guideline: Adult Sinusitis." The evidence-based guideline includes research to support more effective diagnosis and treatment of adult sinus infections. The guideline was developed as a quality improvement opportunity for managing sinus infections by creating clear recommendations to use in medical practice. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
Guo, Jinqiu; Takada, Akira; Tanaka, Koji; Sato, Junzo; Suzuki, Muneou; Takahashi, Kiwamu; Daimon, Hiroyuki; Suzuki, Toshiaki; Nakashima, Yusei; Araki, Kenji; Yoshihara, Hiroyuki
2005-08-01
With the evolving and diverse electronic medical record (EMR) systems, there appears to be an ever greater need to link EMR systems and patient accounting systems with a standardized data exchange format. To this end, the CLinical Accounting InforMation (CLAIM) data exchange standard was developed. CLAIM is subordinate to the Medical Markup Language (MML) standard, which allows the exchange of medical data among different medical institutions. CLAIM uses eXtensible Markup Language (XML) as a meta-language. The current version, 2.1, inherited the basic structure of MML 2.x and contains two modules including information related to registration, appointment, procedure and charging. CLAIM 2.1 was implemented successfully in Japan in 2001. Consequently, it was confirmed that CLAIM could be used as an effective data exchange format between EMR systems and patient accounting systems.
Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis
NASA Technical Reports Server (NTRS)
Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.
2007-01-01
To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
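As an illustration of the tolerance notation described above (not the authors' implementation), the following Python sketch parses engineering-style annotations such as "5.25 +/- 0.01" from input-file fields and draws Monte Carlo samples for a sensitivity study; the uniform sampling distribution is a choice made here, not something stated in the abstract.

```python
# Illustrative sketch (not the authors' implementation): parse engineering-style
# tolerance annotations such as "5.25 +/- 0.01" and draw Monte Carlo samples.
import random
import re

TOL_PATTERN = re.compile(r"(-?\d+\.?\d*)\s*\+/-\s*(\d+\.?\d*)")

def parse_tolerance(field):
    """Return (nominal, tolerance) if the field carries a tolerance, else None."""
    m = TOL_PATTERN.fullmatch(field.strip())
    if m is None:
        return None
    return float(m.group(1)), float(m.group(2))

def sample(nominal, tol, n=1000, rng=random):
    """Draw n uniform samples within nominal +/- tol (the uniform distribution
    is an assumption made for this sketch)."""
    return [rng.uniform(nominal - tol, nominal + tol) for _ in range(n)]

value = parse_tolerance("5.25 +/- 0.01")
print(value)               # (5.25, 0.01)
print(sample(*value, n=3))
```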
NASA Technical Reports Server (NTRS)
Seidewitz, Edwin V.; Agresti, William; Ferry, Daniel; Lavallee, David; Maresca, Paul; Nelson, Robert; Quimby, Kelvin; Rosenberg, Jacob; Roy, Daniel; Shell, Allyn
1987-01-01
Ada is a programming language of considerable expressive power. The Ada Language Reference Manual provides a thorough definition of the language. However, it does not offer sufficient guidance on the appropriate use of Ada's powerful features. For this reason, the Goddard Space Flight Center Ada User's Group has produced this style guide which addresses such program style issues. The guide covers three areas of Ada program style: the structural decomposition of a program; the coding and the use of specific Ada features; and the textural formatting of a program.
ERIC Educational Resources Information Center
Nivette, Jos, Ed.
Selected papers that address theoretical and practical training of the modern language teacher and language teaching experiments in various countries are presented. Some of the articles included are the following: "Les problemes de la formation linguistique et pedagogique des professeurs de francais en Afrique Subsaharienne" (The…
NASA Technical Reports Server (NTRS)
Knight, J. C.; Hamm, R. W.
1984-01-01
PASCAL/48 is a programming language for the Intel MCS-48 series of microcomputers. In particular, it can be used with the Intel 8748. It is designed to allow the programmer to control most of the instructions being generated and the allocation of storage. The language can be used instead of ASSEMBLY language in most applications while allowing the user the necessary degree of control over hardware resources. Although it is called PASCAL/48, the language differs in many ways from PASCAL. The program structure and statements of the two languages are similar, but the expression mechanism and data types are different. The PASCAL/48 cross-compiler is written in PASCAL and runs on the CDC CYBER NOS system. It generates object code in Intel hexadecimal format that can be used to program the MCS-48 series of microcomputers. This reference manual defines the language, describes the predeclared procedures, lists error messages, illustrates use, and includes language syntax diagrams.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingargiola, A.; Laurence, T. A.; Boutelle, R.
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
PIML: the Pathogen Information Markup Language.
He, Yongqun; Vines, Richard R; Wattam, Alice R; Abramochkin, Georgiy V; Dickerman, Allan W; Eckart, J Dana; Sobral, Bruno W S
2005-01-01
A vast amount of information about human, animal and plant pathogens has been acquired, stored and displayed in varied formats through different resources, both electronically and otherwise. However, there is no community standard format for organizing this information or agreement on machine-readable format(s) for data exchange, thereby hampering interoperation efforts across information systems harboring such infectious disease data. The Pathogen Information Markup Language (PIML) is a free, open, XML-based format for representing pathogen information. XSLT-based visual presentations of valid PIML documents were developed and can be accessed through the PathInfo website or as part of the interoperable web services federation known as ToolBus/PathPort. Currently, detailed PIML documents are available for 21 pathogens deemed of high priority with regard to public health and national biological defense. A dynamic query system allows simple queries as well as comparisons among these pathogens. Continuing efforts are being taken to include other groups' supporting PIML and to develop more PIML documents. All the PIML-related information is accessible from http://www.vbi.vt.edu/pathport/pathinfo/
Huh, Sun
2013-01-01
ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, Extensible Markup Language files in any language can be presented, using Unicode Transformation Format 8-bit (UTF-8) encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step for free access or open access academic scientific journals in all languages to reach the world, including scientific journals from Croatia.
ERIC Educational Resources Information Center
Welch, Karen P.
2017-01-01
Formative assessment has been identified as an effective pedagogical practice in the field of education, where teachers and students engage daily in an interactive process to gather evidence of the students' proficiency of a specific learning goal. The evidence collected by the teacher and a student during the formative assessment process allows…
ERIC Educational Resources Information Center
Sturikova, Marina V.; Albrekht, Nina V.; Kondyurina, Irina M.; Rozhneva, Svetlana S.; Sankova, Larisa V.; Morozova, Elena S.
2016-01-01
The relevance of the research problem is driven by the necessity of forming future specialists' communicative competence as a component of professional competence, with the aim of enhancing graduates' further professional mobility. The purpose of the article is to justify the possibility and necessity of formation of the required competencies in language…
ERIC Educational Resources Information Center
Ahmadi, Alireza; Sadeghi, Elham
2016-01-01
In the present study we investigated the effect of test format on oral performance in terms of test scores and discourse features (accuracy, fluency, and complexity). Moreover, we explored how the scores obtained on different test formats relate to such features. To this end, 23 Iranian EFL learners participated in three test formats of monologue,…
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments.
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-05
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
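As a small, hedged example of how such a file might be read, the sketch below uses h5py to pull photon timestamps and detector channels out of a Photon-HDF5 file. The dataset paths ("photon_data/timestamps", "photon_data/detectors") reflect the commonly documented layout but should be checked against the official specification and the phconvert reference library before being relied upon.

```python
# Hedged sketch: reading photon timestamps from a Photon-HDF5 file with h5py.
# The dataset paths below follow the commonly documented layout; verify them
# against the official Photon-HDF5 specification before relying on them.
import h5py

def read_timestamps(path):
    with h5py.File(path, "r") as f:
        photon_data = f["photon_data"]
        timestamps = photon_data["timestamps"][:]   # raw clock ticks
        detectors = photon_data["detectors"][:]     # channel of each photon
    return timestamps, detectors

if __name__ == "__main__":
    ts, det = read_timestamps("experiment.hdf5")
    print(f"{len(ts)} photons on {len(set(det))} detector channel(s)")
```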
Photon-HDF5: An Open File Format for Timestamp-Based Single-Molecule Fluorescence Experiments
Ingargiola, Antonino; Laurence, Ted; Boutelle, Robert; Weiss, Shimon; Michalet, Xavier
2016-01-01
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode, photomultiplier tube, or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license. PMID:26745406
Ingargiola, A.; Laurence, T. A.; Boutelle, R.; ...
2015-12-23
We introduce Photon-HDF5, an open and efficient file format to simplify exchange and long-term accessibility of data from single-molecule fluorescence experiments based on photon-counting detectors such as single-photon avalanche diode (SPAD), photomultiplier tube (PMT) or arrays of such detectors. The format is based on HDF5, a widely used platform- and language-independent hierarchical file format for which user-friendly viewers are available. Photon-HDF5 can store raw photon data (timestamp, channel number, etc.) from any acquisition hardware, but also setup and sample description, information on provenance, authorship and other metadata, and is flexible enough to include any kind of custom data. The format specifications are hosted on a public website, which is open to contributions by the biophysics community. As an initial resource, the website provides code examples to read Photon-HDF5 files in several programming languages and a reference Python library (phconvert), to create new Photon-HDF5 files and convert several existing file formats into Photon-HDF5. To encourage adoption by the academic and commercial communities, all software is released under the MIT open source license.
Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José
2012-07-01
This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.
Using spoken words to guide open-ended category formation.
Chauhan, Aneesh; Seabra Lopes, Luís
2011-11-01
Naming is a powerful cognitive tool that facilitates categorization by forming an association between words and their referents. There is evidence in child development literature that strong links exist between early word-learning and conceptual development. A growing view is also emerging that language is a cultural product created and acquired through social interactions. Inspired by these studies, this paper presents a novel learning architecture for category formation and vocabulary acquisition in robots through active interaction with humans. This architecture is open-ended and is capable of acquiring new categories and category names incrementally. The process can be compared to language grounding in children at single-word stage. The robot is embodied with visual and auditory sensors for world perception. A human instructor uses speech to teach the robot the names of the objects present in a visually shared environment. The robot uses its perceptual input to ground these spoken words and dynamically form/organize category descriptions in order to achieve better categorization. To evaluate the learning system at word-learning and category formation tasks, two experiments were conducted using a simple language game involving naming and corrective feedback actions from the human user. The obtained results are presented and discussed in detail.
Hochman, Bernardo; Locali, Rafael Fagionato; Oliveira Filho, Renato Santos de; Oliveira, Ricardo Leão de; Goldenberg, Saul; Ferreira, Lydia Masako
2006-01-01
To suggest a standardization, in the English language, of the formatting of citations of research centers. From the three most recent publications of the first 20 journals available in the Brazilian Portal of Scientific Information - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) with the highest impact factor in 2004, according to the ISI Web of Knowledge Journal Citation Reports database for the biennium 2004-2005, the formats used to cite research centers were extracted. An analogy to the institutional hierarchy of the Federal University of Sao Paulo (UNIFESP) was drawn, and the most frequent formats, in the English language, were adopted as the suggested standard for citing research centers when submitting articles. For the citation "Departamento", the standard adopted was "Department of ..." (where "..." is the English name of the department); for "Programa de Pós-Graduação", "... Program"; for "Disciplina", "Division of ..."; for "Orgãos, Grupos e Associações", "... Group"; for "Setor", "Section of ..."; for "Centro", "Center for ..."; for "Unidade", "... Unit"; for "Instituto", "Institute of ..."; for "Laboratório", "Laboratory of ..."; and for "Grupo", "Group of ...".
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-09-04
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
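The sketch below is schematic only and is not valid SED-ML; it simply arranges the ingredients listed above (models and their pre-simulation modifications, simulation procedures, tasks, results to extract, and outputs) as a plain Python structure to show what a SED-ML Level 1 Version 2 file describes.

```python
# Schematic sketch only; not a valid SED-ML document. It organizes the
# ingredients a SED-ML file describes: models, modifications, simulations,
# tasks, data generators (results to extract), and outputs (presentation).
simulation_experiment = {
    "models": [
        {"id": "model1", "source": "model.xml",       # e.g. an SBML or CellML file
         "changes": [{"target": "parameter_k1", "new_value": 0.5}]},
    ],
    "simulations": [
        {"id": "sim1", "kind": "uniform_time_course",
         "start": 0.0, "end": 100.0, "points": 1000},
    ],
    "tasks": [
        {"id": "task1", "model": "model1", "simulation": "sim1"},
    ],
    "data_generators": [
        {"id": "dg_time", "task": "task1", "variable": "time"},
        {"id": "dg_S1", "task": "task1", "variable": "S1"},
    ],
    "outputs": [
        {"id": "plot1", "kind": "plot2D", "x": "dg_time", "y": "dg_S1"},
    ],
}

for task in simulation_experiment["tasks"]:
    print(f"run {task['simulation']} on {task['model']}")
```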
Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.
Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar
2015-06-01
The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
Shuttle-Data-Tape XML Translator
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
JSDTImport is a computer program for translating native Shuttle Data Tape (SDT) files from American Standard Code for Information Interchange (ASCII) format into databases in other formats. JSDTImport solves the problem of organizing the SDT content, affording flexibility to enable users to choose how to store the information in a database to better support client and server applications. JSDTImport can be dynamically configured by use of a simple Extensible Markup Language (XML) file. JSDTImport uses this XML file to define how each record and field will be parsed, its layout and definition, and how the resulting database will be structured. JSDTImport also includes a client application programming interface (API) layer that provides abstraction for the data-querying process. The API enables a user to specify the search criteria to apply in gathering all the data relevant to a query. The API can be used to organize the SDT content and translate into a native XML database. The XML format is structured into efficient sections, enabling excellent query performance by use of the XPath query language. Optionally, the content can be translated into a Structured Query Language (SQL) database for fast, reliable SQL queries on standard database server computers.
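To illustrate the kind of XPath-style lookup described above, the following sketch runs a query against a hypothetical XML database using Python's standard library. The element names ("record", "field") are placeholders invented here, not the actual SDT or JSDTImport schema.

```python
# Illustrative sketch of an XPath-style lookup over a hypothetical XML database.
# The element names ("record", "field") are placeholders, not the SDT schema.
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<database>
  <record id="1"><field name="MEASUREMENT">V_BUS</field></record>
  <record id="2"><field name="MEASUREMENT">TEMP_01</field></record>
</database>
""")

# ElementTree supports a limited XPath subset; full XPath engines offer more.
for rec in doc.findall(".//record[field='V_BUS']"):
    print("matched record id:", rec.get("id"))
```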
The gel electrophoresis markup language (GelML) from the Proteomics Standards Initiative.
Gibson, Frank; Hoogland, Christine; Martinez-Bartolomé, Salvador; Medina-Aunon, J Alberto; Albar, Juan Pablo; Babnigg, Gyorgy; Wipat, Anil; Hermjakob, Henning; Almeida, Jonas S; Stanislaus, Romesh; Paton, Norman W; Jones, Andrew R
2010-09-01
The Human Proteome Organisation's Proteomics Standards Initiative has developed the GelML (gel electrophoresis markup language) data exchange format for representing gel electrophoresis experiments performed in proteomics investigations. The format closely follows the reporting guidelines for gel electrophoresis, which are part of the Minimum Information About a Proteomics Experiment (MIAPE) set of modules. GelML supports the capture of metadata (such as experimental protocols) and data (such as gel images) resulting from gel electrophoresis so that laboratories can be compliant with the MIAPE Gel Electrophoresis guidelines, while allowing such data sets to be exchanged or downloaded from public repositories. The format is sufficiently flexible to capture data from a broad range of experimental processes, and complements other PSI formats for MS data and the results of protein and peptide identifications to capture entire gel-based proteome workflows. GelML has resulted from the open standardisation process of PSI consisting of both public consultation and anonymous review of the specifications.
Markó, K; Schulz, S; Hahn, U
2005-01-01
We propose an interlingua-based indexing approach to account for the particular challenges that arise in the design and implementation of cross-language document retrieval systems for the medical domain. Documents, as well as queries, are mapped to a language-independent conceptual layer on which retrieval operations are performed. We contrast this approach with the direct translation of German queries to English ones which, subsequently, are matched against English documents. We evaluate both approaches, interlingua-based and direct translation, on a large medical document collection, the OHSUMED corpus. A substantial benefit for interlingua-based document retrieval using German queries on English texts is found, which amounts to 93% of the (monolingual) English baseline. Most state-of-the-art cross-language information retrieval systems translate user queries to the language(s) of the target documents. In contra-distinction to this approach, translating both documents and user queries into a language-independent, concept-like representation format is more beneficial to enhance cross-language retrieval performance.
ERIC Educational Resources Information Center
Fuchs, Carolin
2016-01-01
This exploratory study contributes to the underexplored area of collaborative task formats in telecollaboration. The study investigates how English as a second language (ESL) student teachers in the US and English as a foreign language (EFL) student teachers in Turkey negotiated the design, implementation, and evaluation of technology-based…
Language Planning and Personal Naming in Lithuania
ERIC Educational Resources Information Center
Ramoniene, Meilute
2007-01-01
This paper deals with the issues of language planning and naming in Lithuania since the restoration of independence in 1990. The aim of the paper is to analyse the challenges of corpus planning with the focus on the use and standardisation of personal names. The paper first presents the historical context of the formation of names in Lithuania and…
Language Policies' Impact on Immigrant Students' Lived Experiences in New York City Public Schools
ERIC Educational Resources Information Center
Gica, Diosdado Galan, Jr.
2012-01-01
Language policies' impact is evident in how most immigrant children become English monolinguals by the third generation. Yet a large percentage continues to underperform in public schools. Formative and summative evaluations draw from a narrow methodology, thus this study strived to tell the stories of immigrant students' lived experiences in New…
What Are More Effective in English Classrooms: Textbooks or Podcasts?
ERIC Educational Resources Information Center
Selwood, Jaime; Lauer, Joe; Enokida, Kazumichi
2016-01-01
In the 21st century it has become clear that more and more language-learning pedagogical materials have begun to shift to a digital mobile-access format and away from being a textbook and classroom based one. High quality language-learning podcasts can provide a cheap, beneficial and portable technology that allows learners the freedom to access…
Reta'maxik Qatzij--Conociendo Nuestro Idioma (Knowing Our Language). [CD-ROM].
ERIC Educational Resources Information Center
Academy for Educational Development, Washington, DC.
This CD-ROM is part of an interactive and dynamic multimedia package of information and games for learning K'iche' and Ixil. This CD-ROM includes six books in electronic format with interactive exercises that support improved bilingual and intercultural education and teacher training, specifically in the languages of K'iche' and Ixil. Books…
The Story of "Proyecto Papan"--Folktales and Their Potential for Foreign Language Education.
ERIC Educational Resources Information Center
de Ramirez, Lori Langer
1999-01-01
Discusses the potential for stories in the foreign language curriculum, and presents results of a qualitative research study in which 22 stories from the oral traditions of Argentina, Colombia, and Mexico were presented to students in different formats: picture books, audiotapes, videotapes, a Web page, and a Hypermedia program. (Author/VWL)
NASA Astrophysics Data System (ADS)
Petrobelli, P.
2011-06-01
Claudio Monteverdi appears as the key musical personality of Galileo's time. His revolution in the format and function of musical language, from an essentially hedonistic creation of purely sonorous images to a musical language consciously "expressive" of the content of the words on which it is based, is similar in character to the influential innovations in scientific thinking introduced by Galileo.
Morphological Awareness and Learning to Read: A Cross-Language Perspective
ERIC Educational Resources Information Center
Kuo, Li-jen; Anderson, Richard C.
2006-01-01
In the past decade, there has been a surge of interest in morphological awareness, which refers to the ability to reflect on and manipulate morphemes and word formation rules in a language. This review provides a critical synthesis of empirical studies on this topic from a broad cross-linguistic perspective. Research with children speaking several…
ERIC Educational Resources Information Center
Ettinger, Blanche; Perfetto, Edda
Using a developmental, hands-on approach, this text/workbook helps students master the basic English skills that are essential to write effective business correspondence, to recognize language errors, and to develop decision-making and problem-solving skills. Its step-by-step focus and industry-specific format encourage students to review,…
Flip-J: Development of the System for Flipped Jigsaw Supported Language Learning
ERIC Educational Resources Information Center
Yamada, Masanori; Goda, Yoshiko; Hata, Kojiro; Matsukawa, Hideya; Yasunami, Seisuke
2016-01-01
This study aims to develop and evaluate a language learning system supported by the "flipped jigsaw" technique, called "Flip-J". This system mainly consists of three functions: (1) the creation of a learning material database, (2) allocation of learning materials, and (3) formation of an expert and jigsaw group. Flip-J was…
ERIC Educational Resources Information Center
Suarez, Stephanie Cox; Daniels, Karen J.
2009-01-01
This case study uses documentation as a tool for formative assessment to interpret the learning of twin boys with significantly delayed language skills. Reggio-inspired documentation (the act of collecting, interpreting, and reflecting on traces of learning from video, images, and observation notes) focused on the unfolding of the boys' nonverbal…
ERIC Educational Resources Information Center
Schuler, Kathryn D.; Reeder, Patricia A.; Newport, Elissa L.; Aslin, Richard N.
2017-01-01
Successful language acquisition hinges on organizing individual words into grammatical categories and learning the relationships between them, but the method by which children accomplish this task has been debated in the literature. One proposal is that learners use the shared distributional contexts in which words appear as a cue to their…
Oxford Guide to British and American Culture for Learners of English.
ERIC Educational Resources Information Center
Crowther, Jonathan, Ed.; Kavanagh, Kathryn, Ed.
The guide to American and British culture, for upper secondary- and university-level students, is intended for use by learners of English as a second language. It is designed to explain specific aspects of British and American life and traditions not generally included in English language dictionaries. The guide has a dictionary format, with terms…
Knowledge Query Language (KQL)
2016-02-01
Currently, queries for data retrieval from non-Structured Query Language (NoSQL) data stores are tightly coupled to the specific implementation of the data store, making… of the storage content and format for querying NoSQL or relational data stores. This approach uses address expressions (or A-Expressions) embedded in…
Image-Based Approach to Mapping, Charting, and Geodesy.
1982-02-01
Buckley, Julliette M; Coopey, Suzanne B; Sharko, John; Polubriaginof, Fernanda; Drohan, Brian; Belli, Ahmet K; Kim, Elizabeth M H; Garber, Judy E; Smith, Barbara L; Gadd, Michele A; Specht, Michelle C; Roche, Constance A; Gudewicz, Thomas M; Hughes, Kevin S
2012-01-01
The opportunity to integrate clinical decision support systems into clinical practice is limited due to the lack of structured, machine-readable data in the current format of the electronic health record. Natural language processing has been designed to convert free text into machine-readable data. The aim of the current study was to ascertain the feasibility of using natural language processing to extract clinical information from >76,000 breast pathology reports. APPROACH AND PROCEDURE: Breast pathology reports from three institutions were analyzed using natural language processing software (Clearforest, Waltham, MA) to extract information on a variety of pathologic diagnoses of interest. Data tables were created from the extracted information according to date of surgery, side of surgery, and medical record number. The variety of ways in which each diagnosis could be represented was recorded, as a means of demonstrating the complexity of machine interpretation of free text. There was widespread variation in how pathologists reported common pathologic diagnoses. We report, for example, 124 ways of saying invasive ductal carcinoma and 95 ways of saying invasive lobular carcinoma. There were >4000 ways of saying invasive ductal carcinoma was not present. Natural language processor sensitivity and specificity were 99.1% and 96.5% when compared to expert human coders. We have demonstrated how a large body of free-text medical information, such as that seen in breast pathology reports, can be converted to a machine-readable format using natural language processing, and described the inherent complexities of the task.
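The normalization problem the study describes (many surface forms for one diagnosis, plus negations) can be sketched in a few lines of Python. This toy example is illustrative only; the study itself used commercial NLP software, and the patterns below are invented for demonstration.

```python
# Toy sketch: map free-text diagnosis variants onto canonical labels and flag
# simple negations. Not the software used in the study.
import re

PATTERNS = {
    "invasive_ductal_carcinoma": re.compile(
        r"\b(invasive|infiltrating)\s+ductal\s+(carcinoma|ca)\b|\bIDC\b", re.I),
    "invasive_lobular_carcinoma": re.compile(
        r"\b(invasive|infiltrating)\s+lobular\s+(carcinoma|ca)\b|\bILC\b", re.I),
}
NEGATION = re.compile(r"\b(no|negative for|without)\b[^.]*$", re.I)

def extract(report_text):
    """Return {diagnosis: present?} based on sentence-level mentions."""
    findings = {}
    for sentence in report_text.split("."):
        for label, pattern in PATTERNS.items():
            m = pattern.search(sentence)
            if m:
                negated = bool(NEGATION.search(sentence[:m.start()]))
                findings[label] = findings.get(label, False) or not negated
    return findings

print(extract("Negative for invasive lobular carcinoma. Infiltrating ductal ca, grade 2."))
```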
TransNewGuinea.org: An Online Database of New Guinea Languages
Greenhill, Simon J.
2015-01-01
The island of New Guinea has the world's highest linguistic diversity, with more than 900 languages divided into at least 23 distinct language families. This diversity includes the world's third largest language family: Trans-New Guinea. However, the region is one of the world's least well studied, and primary data is scattered across a wide range of publications and more often than not hidden in unpublished "gray" literature. The lack of primary research data on the New Guinea languages has been a major impediment to our understanding of these languages, and the history of the peoples in New Guinea. TransNewGuinea.org aims to collect data about these languages and place them online in a consistent format. This database will enable future research into the New Guinea languages with both traditional comparative linguistic methods and novel cutting-edge computational techniques. The long-term aim is to shed light on the prehistory of the peoples of New Guinea, and to understand why there is such major diversity in their languages. PMID:26506615
ERIC Educational Resources Information Center
Kheimets, Nina G.; Epstein, Alek D.
2005-01-01
This paper presents sociological analysis of the linguistic and cultural identity of two of Israel's most influential and high-ranked universities during their formative years, that were also the "de facto" formative years of the Israeli state-in-the-making (1924-1948). We argue that the influence of external universal factors on a…
AphasiaBank: a resource for clinicians.
Forbes, Margaret M; Fromm, Davida; Macwhinney, Brian
2012-08-01
AphasiaBank is a shared, multimedia database containing videos and transcriptions of ~180 aphasic individuals and 140 nonaphasic controls performing a uniform set of discourse tasks. The language in the videos is transcribed in Codes for the Human Analysis of Transcripts (CHAT) format and coded for analysis with Computerized Language ANalysis (CLAN) programs, which can perform a wide variety of language analyses. The database and the CLAN programs are freely available to aphasia researchers and clinicians for educational, clinical, and scholarly uses. This article describes the database, suggests some ways in which clinicians and clinician researchers might find these materials useful, and introduces a new language analysis program, EVAL, designed to streamline the transcription and coding processes, while still producing an extensive and useful language profile.
Huh, Sun
2013-01-01
ScienceCentral, a free or open access, full-text archive of scientific journal literature at the Korean Federation of Science and Technology Societies, was under test in September 2013. Since it is a Journal Article Tag Suite-based full-text database, Extensible Markup Language (XML) files in all languages can be presented, according to Unicode Transformation Format 8-bit encoding. It is comparable to PubMed Central; however, there are two distinct differences. First, its scope comprises all science fields; second, it accepts journals in all languages. Launching ScienceCentral is the first step toward bringing free- or open-access academic scientific journals of all languages, including scientific journals from Croatia, to a worldwide audience. PMID:24266292
Speech evaluation in children with temporomandibular disorders
PIZOLATO, Raquel Aparecida; FERNANDES, Frederico Silva de Freitas; GAVIÃO, Maria Beatriz Duarte
2011-01-01
Objectives: The aims of this study were to evaluate the influence of temporomandibular disorders (TMD) on speech in children, and to verify the influence of occlusal characteristics. Material and methods: Speech and dental occlusal characteristics were assessed in 152 Brazilian children (78 boys and 74 girls), aged 8 to 12 (mean age 10.05 ± 1.39 years) with or without TMD signs and symptoms. The clinical signs were evaluated using the Research Diagnostic Criteria for TMD (RDC/TMD) (axis I) and the symptoms were evaluated using a questionnaire. The following groups were formed: Group TMD (n=40), TMD signs and symptoms (Group S and S, n=68), TMD signs or symptoms (Group S or S, n=33), and without signs and symptoms (Group N, n=11). Articulatory speech disorders were diagnosed during spontaneous speech and repetition of words using the "Phonological Assessment of Child Speech" for the Portuguese language. A list of 40 phonologically balanced words, read by the speech pathologist and repeated by the children, was also applied. Data were analyzed by descriptive statistics and Fisher's exact or Chi-square tests (α=0.05). Results: A slight prevalence of articulatory disturbances, such as substitutions, omissions and distortions of the sibilants /s/ and /z/, and no deviations in jaw lateral movements were observed. Reduction of vertical amplitude was found in 10 children, the prevalence being greater in children with TMD signs and symptoms than in the normal children. Tongue protrusion in the phonemes /t/, /d/, /n/ and /l/, and frontal lip positioning in the phonemes /s/ and /z/, were the most prevalent visible alterations. There was a high percentage of dental occlusal alterations. Conclusions: There was no association between TMD and speech disorders. Occlusal alterations may be influencing factors, allowing distortions and frontal lisp in the phonemes /s/ and /z/ and inadequate tongue position in the phonemes /t/, /d/, /n/ and /l/. PMID:21986655
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Norvig, Peter (Technical Monitor)
2000-01-01
Few human endeavors can be viewed as both extremely successful and unsuccessful at the same time. This is typically the case when goals have not been well defined or have been shifting over time. This has certainly been true of Artificial Intelligence (AI). The nature of intelligence has been the object of much thought and speculation throughout the history of philosophy. It is in the nature of philosophy that real headway is sometimes made only when appropriate tools become available. Similarly, the computer, coupled with the ability to program (at least in principle) any function, appeared to be the tool that could tackle the notion of intelligence. To suit the tool, the problem of the nature of intelligence was soon sidestepped in favor of this notion: If a probing conversation with a computer could not be distinguished from a conversation with a human, then AI had been achieved. This notion became known as the Turing test, after the mathematician Alan Turing who proposed it in 1950. Conceptually rich and interesting, these early efforts gave rise to a large portion of the field's framework. The key to AI, rather than the 'number crunching' typical of computers until then, was seen as the ability to manipulate symbols and make logical inferences. To facilitate these tasks, AI languages such as LISP and Prolog were invented and used widely in the field. One idea that emerged and enabled some success with real-world problems was the notion that 'most intelligence' really resided in knowledge. A phrase attributed to Feigenbaum, one of the pioneers, was 'knowledge is the power.' With this premise, the problem shifted from 'how do we solve problems' to 'how do we represent knowledge.' A good knowledge representation scheme could allow one to draw conclusions from given premises. Such schemes took forms such as rules, frames, and scripts. These schemes allowed the building of what became known as expert systems or knowledge-based systems (KBS).
Runtime Verification of C Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2008-01-01
We present in this paper a framework, RMOR, for monitoring the execution of C programs against state machines, expressed in a textual (nongraphical) format in files separate from the program. The state machine language has been inspired by a graphical state machine language RCAT recently developed at the Jet Propulsion Laboratory, as an alternative to using Linear Temporal Logic (LTL) for requirements capture. Transitions between states are labeled with abstract event names and Boolean expressions over such. The abstract events are connected to code fragments using an aspect-oriented pointcut language similar to ASPECTJ's or ASPECTC's pointcut language. The system is implemented in the C analysis and transformation package CIL, and is programmed in OCAML, the implementation language of CIL. The work is closely related to the notion of stateful aspects within aspect-oriented programming, where pointcut languages are extended with temporal assertions over the execution trace.
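The core idea summarized above, a requirement expressed as a state machine over abstract events and checked against an execution trace, can be sketched compactly in Python. This is only a conceptual illustration; the RMOR textual syntax and its aspect-oriented instrumentation of C code are not reproduced here.

```python
# Minimal runtime-monitoring sketch: a state machine over abstract events,
# checked against a trace. Not the RMOR input format.
class Monitor:
    def __init__(self, transitions, initial, error_states):
        self.transitions = transitions      # {(state, event): next_state}
        self.state = initial
        self.error_states = set(error_states)

    def emit(self, event):
        """Advance the monitor on one abstract event; report violations."""
        self.state = self.transitions.get((self.state, event), self.state)
        if self.state in self.error_states:
            raise AssertionError(f"requirement violated on event {event!r}")

# Requirement: a file must be opened before it is read.
monitor = Monitor(
    transitions={("closed", "open"): "opened",
                 ("opened", "close"): "closed",
                 ("closed", "read"): "error"},
    initial="closed",
    error_states=["error"],
)

try:
    for event in ["open", "read", "close", "read"]:   # the last event violates the property
        monitor.emit(event)
except AssertionError as err:
    print(err)
```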
Processing sequence annotation data using the Lua programming language.
Ueno, Yutaka; Arita, Masanori; Kumagai, Toshitaka; Asai, Kiyoshi
2003-01-01
The data processing language in a graphical software tool that manages sequence annotation data from genome databases should provide flexible functions for the tasks in molecular biology research. Among currently available languages we adopted the Lua programming language. It fulfills our requirements to perform computational tasks for sequence map layouts, i.e. the handling of data containers, symbolic reference to data, and a simple programming syntax. Upon importing a foreign file, the original data are first decomposed in the Lua language while maintaining the original data schema. The converted data are parsed by the Lua interpreter and the contents are stored in our data warehouse. Then, portions of annotations are selected and arranged into our catalog format to be depicted on the sequence map. Our sequence visualization program was successfully implemented, embedding the Lua language for processing of annotation data and layout script. The program is available at http://staff.aist.go.jp/yutaka.ueno/guppy/.
Writing Inservice Guide for English Language Arts and TAAS.
ERIC Educational Resources Information Center
Texas Education Agency, Austin.
This guide, made up of transparencies and text, offers a basis for a 2-day interactive inservice presentation on how to teach writing, to help a school district ensure that its English language arts program addresses the Texas Assessment of Academic Skills (TAAS) test. In addition to sections on the use of the guide and the format of the TAAS…
ERIC Educational Resources Information Center
Valdez, Carmen R.; Mills, Monique T.; Bohlig, Amanda J.; Kaplan, David
2013-01-01
This person-centered study examines the extent to which parents' language dominance influences the effects of an after school, multi-family group intervention, FAST, on low-income children's emotional and behavioral outcomes via parents' relations with other parents and with school staff. Social capital resides in relationships of trust and shared…
ERIC Educational Resources Information Center
Villarreal Ballesteros, Ana Cecilia
2010-01-01
Recent work has shown the importance of identity in language learning and how the desire to belong to an imagined community drives individuals to invest in their learning (Norton, 2000). This work has documented that a mismatch between students' imagined community and the community envisioned by the teacher can have negative outcomes on students'…
ERIC Educational Resources Information Center
Seager, Emily; Abbot-Smith, Kirsten
2017-01-01
Language comprehension delays in pre-schoolers are predictive of difficulties in a range of developmental domains. In England, early years practitioners are required to assess the language comprehension of 2-year-olds in their care. Many use a format based on the Early Years Foundation Stage Unique Child Communication Sheet (EYFS:UCCS) in which…
ERIC Educational Resources Information Center
Peterson, Gabriel M.; Su, Kuichun; Ries, James E.; Sievert, Mary Ellen C.
2002-01-01
Discussion of Internet use for information searches on health-related topics focuses on a study that examined the complexity and variability of natural language in search terms that express the concept of electronic health (e-health). Highlights include precision of retrieved information; shift in terminology; and queries using the PubMed…
Made in America: An Informal History of the English Language in the United States.
ERIC Educational Resources Information Center
Bryson, Bill
Claiming that understanding the social context in which words are formed is necessary to appreciate the richness and vitality of language, this book presents an informal, discursive examination of how and why American speech came to be the way it is, and in particular where the words came from. The book follows a roughly chronological format from…
ERIC Educational Resources Information Center
Maun, Ian
2006-01-01
This paper examines visual and affective factors involved in the reading of foreign language texts. It draws on the results of a pilot study among students of post-compulsory school stage studying French in England. Through a detailed analysis of students' reactions to texts, it demonstrates that the use of "authentic" documents under…
ERIC Educational Resources Information Center
Elboubekri, Abdellah
2013-01-01
Intercultural pedagogy theorists and cultural studies scholars agree that language is the appropriate realm for the formation, contestation and negotiation of identities. Indeed, language teaching and learning involve more than linguistic structures and lexical components. They are more engaged…
ERIC Educational Resources Information Center
Department of Education, Washington, DC.
This interactive teleconference (in VHS format, Spanish language version) presents renowned national experts, local educators, and community leaders who share ideas on how to improve schools and reach the National Educational Goals. The 60-minute Satellite Town Meeting focuses on laying the foundation for school success through readiness to read.…
ERIC Educational Resources Information Center
González-Carriedo, Ricardo
2014-01-01
The media in general and newspapers in particular have a powerful influence on the formation of public attitudes in society. This study aimed at identifying and analyzing the ideologies of two newspapers in Arizona in regard to English language learners. Using discourse analysis, 90 texts published by "The Arizona Republic" and the…
ERIC Educational Resources Information Center
Boerma, Tessel; Wijnen, Frank; Leseman, Paul; Blom, Elma
2017-01-01
Purpose: Grammatical morphology is often a locus of difficulty for both children with language impairment (LI) and bilingual children. In contrast to previous research that mainly focused on verbal tense and agreement markings, the present study investigated whether plural and past participle formation can disentangle the effects of LI and…
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2018-03-09
Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
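The declarative, exchangeable structure described above can be seen by reading an SBML document with nothing more than the Python standard library; the libSBML library provides the full validating API, but a plain XML parse is enough for illustration. The fragment below is not a complete, valid model (compartments and other required sections are omitted).

```python
# Sketch: list the species declared in an SBML Level 3 document using plain
# XML parsing (for real work, use libSBML's validating API instead).
import xml.etree.ElementTree as ET

SBML = """<sbml xmlns="http://www.sbml.org/sbml/level3/version2/core"
               level="3" version="2">
  <model id="example">
    <listOfSpecies>
      <species id="glucose" compartment="cell" hasOnlySubstanceUnits="false"
               boundaryCondition="false" constant="false"/>
      <species id="atp" compartment="cell" hasOnlySubstanceUnits="false"
               boundaryCondition="false" constant="false"/>
    </listOfSpecies>
  </model>
</sbml>"""

ns = {"sbml": "http://www.sbml.org/sbml/level3/version2/core"}
root = ET.fromstring(SBML)
model = root.find("sbml:model", ns)
for species in model.findall("sbml:listOfSpecies/sbml:species", ns):
    print(species.get("id"), "in", species.get("compartment"))
```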
Automated Feedback as a Convergence Tool
ERIC Educational Resources Information Center
Chenoweth, Tim; Corral, Karen; Scott, Kit
2016-01-01
This study evaluates two content delivery options for teaching a programming language to determine whether an asynchronous format can achieve the same learning efficacy as a traditional lecture (face-to-face) format. We use media synchronicity theory as a guide to choose media capabilities to incorporate into an asynchronous tutorial used…
ERIC Educational Resources Information Center
Defense Language Inst., Washington, DC.
This basic course in Brazilian Portuguese consists of 75 lessons in six volumes. Volume I is in two parts, with the dialogs, questions and exercises presented in Portuguese in the first part, and the intonation patterns and English translations presented in the second. The general format follows the Defense Language Institute format, employing…
Language Technologies to Support Formative Feedback
ERIC Educational Resources Information Center
Berlanga, Adriana J.; Kalz, Marco; Stoyanov, Slavi; van Rosmalen, Peter; Smithies, Alisdair; Braidman, Isobel
2011-01-01
Formative feedback enables comparison to be made between a learner's current understanding and a desired learning goal. Obtaining this information is a time consuming task that most tutors cannot afford. We therefore wished to develop a support software tool, which provides tutors and learners with information that identifies a learner's progress,…
The Effects of Write Score Formative Assessment on Student Achievement
ERIC Educational Resources Information Center
Fox, Janice M.
2013-01-01
In an "ex post facto" causal-comparative research design, this study investigated the effectiveness of a formative writing assessment program, Write Score, on increasing student writing achievement. Tennessee Comprehensive Assessment Program (TCAP) reading language arts and writing scores from 2012 were utilized for this study. The…
The functions of language: an experimental study.
Redhead, Gina; Dunbar, R I M
2013-08-14
We test between four separate hypotheses (social gossip, social contracts, mate advertising and factual information exchange) for the function(s) of language using a recall paradigm. Subjects recalled the social content of stories (irrespective of whether this concerned social behavior, defection or romantic events) significantly better than they did ecological information. Recall rates were no better on ecological stories if they involved flamboyant language, suggesting that, if true, Miller's "Scheherazade effect" may not be independent of content. One interpretation of these results might be that language evolved as an all-purpose social tool, and perhaps acquired specialist functions (sexual advertising, contract formation, information exchange) at a later date through conventional evolutionary windows of opportunity.
Sordo, Margarita; Boxwala, Aziz A; Ogunyemi, Omolola; Greenes, Robert A
2004-01-01
A major obstacle to sharing computable clinical knowledge is the lack of a common language for specifying expressions and criteria. Such a language could be used to specify decision criteria, formulae, and constraints on data and action. Although the Arden Syntax addresses this problem for clinical rules, its generalization to HL7's object-oriented data model is limited. The GELLO expression language is an object-oriented language used for expressing logical conditions and computations in the GLIF3 (GuideLine Interchange Format, v. 3) guideline modeling language. It has been further developed under the auspices of the HL7 Clinical Decision Support Technical Committee, as a proposed HL7 standard. GELLO is based on the Object Constraint Language (OCL), because it is vendor-independent, object-oriented, and side-effect-free. GELLO expects an object-oriented data model. Although the choice of model is arbitrary, standardization is facilitated by ensuring that the data model is compatible with the HL7 Reference Information Model (RIM).
How to use the WWW to distribute STI
NASA Technical Reports Server (NTRS)
Roper, Donna G.
1994-01-01
This presentation explains how to use the World Wide Web (WWW) to distribute scientific and technical information as hypermedia. WWW clients and servers use the HyperText Transfer Protocol (HTTP) to transfer documents containing links to other text, graphics, video, and sound. The standard language for these documents is the HyperText Markup Language (HTML). These are simply text files with formatting codes that contain layout information and hyperlinks. HTML documents can be created with any text editor or with one of the publicly available HTML editors or converters. HTML can also include links to available image formats. This presentation is available online at http://sti.larc.nasa.gov/demos/workshop/introtext.html.
Gene Fusion Markup Language: a prototype for exchanging gene fusion data.
Kalyana-Sundaram, Shanker; Shanmugam, Achiraman; Chinnaiyan, Arul M
2012-10-16
An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Here we propose a prototype, Gene Fusion Markup Language (GFML) as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an inter-operable and query-able fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses.
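To make the idea of a standardized, machine-readable fusion record concrete, the sketch below builds one as XML in Python. The element and attribute names here are invented for illustration and are not taken from the published GFML schema; the TMPRSS2-ERG partners are used only as a familiar example.

```python
# Hypothetical sketch of a gene-fusion exchange record (invented element names,
# not the actual GFML schema).
import xml.etree.ElementTree as ET

fusion = ET.Element("geneFusion", {"id": "fusion1"})
ET.SubElement(fusion, "partner", {"role": "5prime", "gene": "TMPRSS2", "exon": "1"})
ET.SubElement(fusion, "partner", {"role": "3prime", "gene": "ERG", "exon": "4"})
ET.SubElement(fusion, "evidence", {"platform": "RNA-seq", "supportingReads": "42"})

print(ET.tostring(fusion, encoding="unicode"))
```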
ERIC Educational Resources Information Center
Hayes, Ann Milligan
2015-01-01
Over the last two decades, there has been renewed interest in formative assessment, in large part due to the increasing pressures and prevalence of "high stakes" summative assessments. As states try to meet the requirements of the No Child Left Behind law, teachers and administrators are realizing that formative assessment offers an…
Interchanging lexical information for a multilingual dictionary.
Baud, R H; Nyström, M; Borin, L; Evans, R; Schulz, S; Zweigenbaum, P
2005-01-01
To facilitate the interchange of lexical information for multiple languages in the medical domain. To pave the way for the emergence of a generally available truly multilingual electronic dictionary in the medical domain. An interchange format has to be neutral relative to the target languages. It has to be consistent with current needs of lexicon authors, present and future. An active interaction between six potential authors aimed to determine a common denominator striking the right balance between richness of content and ease of use for lexicon providers. A simple list of relevant attributes has been established and published. The format has the potential for collecting relevant parts of a future multilingual dictionary. An XML version is available. This effort makes feasible the exchange of lexical information between research groups. Interchange files are made available in a public repository. This procedure opens the door to a true multilingual dictionary, in the awareness that the exchange of lexical information is (only) a necessary first step, before structuring the corresponding entries in different languages.
Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements
NASA Technical Reports Server (NTRS)
Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri
2006-01-01
NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
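The general pattern, keeping semi-structured documents in a relational store while supporting keyword search over their content, can be illustrated with SQLite's full-text index. This is an analogy only, not NETMARK's Oracle-based implementation, and it assumes the local SQLite build includes the FTS5 extension.

```python
# Illustrative analogy: semi-structured content in a relational store with
# keyword search (SQLite FTS5), not the NETMARK/Oracle implementation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
conn.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [("Flight readiness review", "<section><p>Thermal margins acceptable.</p></section>"),
     ("Anomaly report", "<section><p>Sensor dropout during ascent.</p></section>")],
)

# Keyword query across the stored semi-structured content.
for (title,) in conn.execute("SELECT title FROM docs WHERE docs MATCH ?", ("sensor",)):
    print(title)
```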
Genomic Sequence Variation Markup Language (GSVML).
Nakaya, Jun; Kimura, Michio; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Tanaka, Hiroshi
2010-02-01
With the aim of making good use of internationally accumulated genomic sequence variation data, which is increasing rapidly due to the explosive amount of genomic research at present, the development of an interoperable data exchange format and its international standardization are necessary. Genomic Sequence Variation Markup Language (GSVML) will focus on genomic sequence variation data and human health applications, such as gene based medicine or pharmacogenomics. We developed GSVML through eight steps, based on case analysis and domain investigations. By focusing on the design scope to human health applications and genomic sequence variation, we attempted to eliminate ambiguity and to ensure practicability. We intended to satisfy the requirements derived from the use case analysis of human-based clinical genomic applications. Based on database investigations, we attempted to minimize the redundancy of the data format, while maximizing the data covering range. We also attempted to ensure communication and interface ability with other Markup Languages, for exchange of omics data among various omics researchers or facilities. The interface ability with developing clinical standards, such as the Health Level Seven Genotype Information model, was analyzed. We developed the human health-oriented GSVML comprising variation data, direct annotation, and indirect annotation categories; the variation data category is required, while the direct and indirect annotation categories are optional. The annotation categories contain omics and clinical information, and have internal relationships. For designing, we examined 6 cases for three criteria as human health application and 15 data elements for three criteria as data formats for genomic sequence variation data exchange. The data format of five international SNP databases and six Markup Languages and the interface ability to the Health Level Seven Genotype Model in terms of 317 items were investigated. GSVML was developed as a potential data exchanging format for genomic sequence variation data exchange focusing on human health applications. The international standardization of GSVML is necessary, and is currently underway. GSVML can be applied to enhance the utilization of genomic sequence variation data worldwide by providing a communicable platform between clinical and research applications. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Masseroli, Marco; Kaitoua, Abdulrahman; Pinoli, Pietro; Ceri, Stefano
2016-12-01
While a huge amount of (epi)genomic data of multiple types is becoming available through Next Generation Sequencing (NGS) technologies, the most important emerging problem is the so-called tertiary analysis, concerned with sense making, e.g., discovering how different (epi)genomic regions and their products interact and cooperate with each other. We propose a paradigm shift in tertiary analysis, based on the use of the Genomic Data Model (GDM), a simple data model which links genomic feature data to their associated experimental, biological and clinical metadata. GDM encompasses all the data formats which have been produced for feature extraction from (epi)genomic datasets. We specifically describe the mapping to GDM of SAM (Sequence Alignment/Map), VCF (Variant Call Format), NARROWPEAK (for called peaks produced by NGS ChIP-seq or DNase-seq methods), and BED (Browser Extensible Data) formats, but GDM also supports all the formats describing experimental datasets (e.g., including copy number variations, DNA somatic mutations, or gene expressions) and annotations (e.g., regarding transcription start sites, genes, enhancers or CpG islands). We downloaded and integrated samples of all the above-mentioned data types and formats from multiple sources. The GDM is able to homogeneously describe semantically heterogeneous data and lays the groundwork for data interoperability, e.g., achieved through the GenoMetric Query Language (GMQL), a high-level, declarative query language for genomic big data. The combined use of the data model and the query language allows comprehensive processing of multiple heterogeneous data, and supports the development of domain-specific data-driven computations and bio-molecular knowledge discovery. Copyright © 2016 Elsevier Inc. All rights reserved.
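A much-simplified sketch of the data shape the model implies is shown below: each sample pairs a set of genomic regions with free-form metadata, so selection can act on either side. This is only the shape of the data, not the GMQL query engine, and the sample values are invented.

```python
# Simplified sketch of GDM-style samples: region tracks paired with metadata.
samples = [
    {"metadata": {"source": "ENCODE", "assay": "ChIP-seq", "cell": "K562"},
     "regions": [("chr1", 1000, 1500, {"peak_score": 87.0}),
                 ("chr2", 4000, 4600, {"peak_score": 55.5})]},
    {"metadata": {"source": "TCGA", "assay": "DNA-seq", "cell": "BRCA"},
     "regions": [("chr1", 1200, 1210, {"variant": "A>T"})]},
]

def select(samples, **meta):
    """Keep samples whose metadata contain all the given key/value pairs."""
    return [s for s in samples
            if all(s["metadata"].get(k) == v for k, v in meta.items())]

def regions_on(samples, chromosome):
    """Flatten the region tracks of the selected samples for one chromosome."""
    return [r for s in samples for r in s["regions"] if r[0] == chromosome]

chip = select(samples, assay="ChIP-seq")
print(regions_on(chip, "chr1"))
```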
ERIC Educational Resources Information Center
Lee, HyeSun; Winke, Paula
2013-01-01
We adapted three practice College Scholastic Ability Tests (CSAT) of English listening, each with five-option items, to create four- and three-option versions by asking 73 Korean speakers or learners of English to eliminate the least plausible options in two rounds. Two hundred and sixty-four Korean high school English-language learners formed…
ERIC Educational Resources Information Center
Bakulina, Galina A.; Vakhrusheva, Liudmila N.; Shelygina, Olga B.; Savinova, Svetlana V.
2016-01-01
The purpose of the article is to present of an innovative type of exercises in the Russian language, referred to as the complex intellectual-linguistic. The novelty of these exercises is: a) in an unusual arrangement of linguistic material which creates an educational research situation; b) in giving non-traditional tasks, aimed at simultaneously…
ERIC Educational Resources Information Center
Phillips, M. K.
1981-01-01
Discusses questions concerning training students in techniques of academic work and use of the English language in this context. These questions, currently researched at the English Language Center of King Abdulaziz University, focus on the notion of task as a minimal pedagogic unit, and on task-objective coordination criteria. Societe Nouvelle…
ERIC Educational Resources Information Center
Likitrattanaporn, Wannakarn
2017-01-01
The purposes of this investigation were (1) to examine the findings of effectiveness of the process of learning-by-doing; (2) to develop students' skill of designing English teaching materials and teaching English language; and (3) to determine an efficient format of learning-by-doing used for training student-teachers in the skill of teaching…
ERIC Educational Resources Information Center
Leontyeva, Tatyana V.; Shchetynina, Anna V.; Vorobyeva, Natalya A.; Blinova, Anastasiya N.
2016-01-01
The relevance of the investigated problem is stipulated by the necessity to solve a problem of multicultural humanitarian education and formation of tolerance of students to unfamiliar culture. The purpose of the article is to research educational potential of metaphorical vocabulary in different languages, such as zoomorphic naming units of a man…
ERIC Educational Resources Information Center
Calve, Pierre
1983-01-01
Teacher training for French instructors is criticized as incomplete, consisting only of some formulas for immediate classroom application. A plan for a more comprehensive curriculum consisting of theoretical and practical components in the areas of language, culture, learning, communication, and instruction is proposed. (MSE)
ERIC Educational Resources Information Center
Hasanen, Mohammed M.; Al-Kandari, Ali A.; Al-Sharoufi, Hussain
2014-01-01
This study examines the influence of English language usage and international media on the strength of either national or global identity. The regression analysis of 354 responses reveals that individuals who studied at universities that use English as a medium of instruction show significant differences in the extent to which they embrace a…
ERIC Educational Resources Information Center
Tait, Carolyn
2010-01-01
The recruitment of Asian students into western universities has highlighted the debate about commercialisation of education, academic standards and the role of culture and language in approaches to learning. This article investigates Chinese students' perceptions of how two typical examination formats (multiple choice and essay) affect their…
Multiple Theory Formation in High-Level Perception. Technical Report No. 38.
ERIC Educational Resources Information Center
Woods, William A.
This paper is concerned with the process of human reading as a high-level perceptual task. Drawing on insights from artificial-intelligence research--specifically, research in natural language processing and continuous speech understanding--the paper attempts to present a fairly concrete picture of the kinds of hypothesis formation and inference…
Different Approaches to Teaching the Mechanics of American Psychological Association Style
ERIC Educational Resources Information Center
Franz, Timothy M.; Spitzer, Tam M.
2006-01-01
Students have to learn two distinctly different tasks when writing research papers: a) creating and organizing prose, and b) formatting a manuscript according to the nuances and mechanics of a pre-determined format, such as Modern Language Association (MLA) or American Psychological Association (APA) guidelines. Two studies examined different…
ERIC Educational Resources Information Center
Campbell, Tasha M.
2017-01-01
This dissertation explores Spanish nominal plural formation from a morphophonological perspective. The primary objective is to better understand heritage bilinguals' (HBs') phonological categorization of the morphological element of number in their heritage language. This is done by way of picture-naming elicitation tasks of consonant-final nouns…
ERIC Educational Resources Information Center
Tetlan, W. Lou
2009-01-01
This study examined whether the design of textbook material affects comprehension and memory of textbook material under certain cognitive conditions for proficient and remedial readers. Using quantitative and qualitative research methods, format was found to significantly affect comprehension and memory. Proficient Male scored significantly…
36 CFR 1235.50 - What specifications and standards for transfer apply to electronic records?
Code of Federal Regulations, 2012 CFR
2012-07-01
... electronic records in a format that is independent of specific hardware or software. Except as specified in... a request from NARA to provide the software to decompress the records. (3) Agencies interested in... organization. Acceptable transfer formats include the Geography Markup Language (GML) as defined by the Open...
Kratylos: A Tool for Sharing Interlinearized and Lexical Data in Diverse Formats
ERIC Educational Resources Information Center
Kaufman, Daniel; Finkel, Raphael
2018-01-01
In this paper we present Kratylos, at www.kratylos.org/, a web application that creates searchable multimedia corpora from data collections in diverse formats, including collections of interlinearized glossed text (IGT) and dictionaries. There exists a crucial lacuna in the electronic ecology that supports language documentation and linguistic…
Descriptions of the American Deaf Community, 1830-2000: Epistemic Foundations
ERIC Educational Resources Information Center
Rosen, Russell S.
2008-01-01
Prior to the formation of schools for the deaf in America in the early 19th century, with rare exceptions, deaf people lived under largely solitary conditions. After the formation of such schools they became a community with their own language, organizations and cultural traditions. Several social theorists have proffered various descriptions of…
Simonaitis, Linas; Belsito, Anne; Warvel, Jeff; Hui, Siu; McDonald, Clement J
2006-01-01
Clinicians at Wishard Hospital in Indianapolis print and carry clinical reports called "Pocket Rounds". This paper describes a new process we developed to improve these clinical reports. The heart of our new process is a World Wide Web Consortium standard: Extensible Stylesheet Language Formatting Objects (XSL-FO). Using XSL-FO stylesheets we generated Portable Document Format (PDF) and PostScript reports with complex formatting: columns, tables, borders, shading, indents, dividing lines. We observed patterns of clinical report printing during an eight-month study period on three Medicine wards. Usage statistics indicated that clinicians accepted the new system enthusiastically: 78% of 26,418 reports were printed using the new system. We surveyed 67 clinical users. Respondents gave the new reports a rating of 4.2 (on a 5-point scale); they gave the old reports a rating of 3.4. The primary complaint was that it took longer to print the new reports. We believe that XSL-FO is a promising way to transform text data into functional and attractive clinical reports: relatively easy to implement and modify.
NASA Astrophysics Data System (ADS)
Linn, Marcia C.
1995-06-01
Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.
NASA Technical Reports Server (NTRS)
Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.
1985-01-01
The results of the first phase of research on an Expert System for Database Operation of Simulation/Emulation Math Models are described. Techniques from artificial intelligence (AI) were brought to bear on task domains of interest to NASA Marshall Space Flight Center. One such domain is simulation of spacecraft attitude control systems. Two related software systems were developed and delivered to NASA. One was a generic simulation model for spacecraft attitude control, written in FORTRAN. The second was an expert system which understands the usage of a class of spacecraft attitude control simulation software and can assist the user in running the software. This NASA Expert Simulation System (NESS), written in LISP, contains general knowledge about digital simulation, specific knowledge about the simulation software, and self-knowledge.
A voxel visualization and analysis system based on AutoCAD
NASA Astrophysics Data System (ADS)
Marschallinger, Robert
1996-05-01
A collection of AutoLISP programs is presented that enables the visualization and analysis of voxel models by AutoCAD rel. 12/rel. 13. The programs serve as an interactive, graphical front end for manipulating the results of three-dimensional modeling software producing block estimation data. ASCII data files describing geometry and attributes per estimation block are imported and stored as a voxel array. Each voxel may contain multiple attributes; therefore, different parameters may be incorporated in one voxel array. Voxel classification is implemented on a layer basis, providing flexible treatment of voxel classes such as recoloring, peeling, or volumetry. A versatile clipping tool enables slicing voxel arrays according to combinations of three perpendicular clipping planes. The programs feature an up-to-date, graphical user interface for user-friendly operation by non-AutoCAD specialists.
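The classification and clipping operations described above can be sketched conceptually with NumPy (rather than AutoLISP/AutoCAD); the attribute name, block size and thresholds below are invented for illustration.

```python
# Conceptual sketch of voxel classification and three-plane clipping with NumPy
# (not the AutoLISP implementation described in the abstract).
import numpy as np

rng = np.random.default_rng(0)
grade = rng.uniform(0.0, 5.0, size=(20, 20, 10))   # one attribute per voxel

ore = grade > 3.5                                   # classification (like a layer)

# Clip with three perpendicular planes: keep x >= 5, y < 15, z >= 2.
x, y, z = np.indices(grade.shape)
clip = (x >= 5) & (y < 15) & (z >= 2)

voxel_volume = 2.0 * 2.0 * 1.0                      # block size in metres (assumed)
print("ore volume in clipped region:", (ore & clip).sum() * voxel_volume, "m^3")
```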
GetData: A filesystem-based, column-oriented database format for time-ordered binary data
NASA Astrophysics Data System (ADS)
Wiebe, Donald V.; Netterfield, Calvin B.; Kisner, Theodore S.
2015-12-01
The GetData Project is the reference implementation of the Dirfile Standards, a filesystem-based, column-oriented database format for time-ordered binary data. Dirfiles provide a fast, simple format for storing and reading data, suitable for both quicklook and analysis pipelines. GetData provides a C API and bindings exist for various other languages. GetData is distributed under the terms of the GNU Lesser General Public License.
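Dirfiles store each field as its own flat binary file inside a directory, which is why they are fast for both quicklook and pipelines. The sketch below writes and reads one such column with NumPy purely to illustrate the layout; in practice the GetData library (or its language bindings) handles the format file, derived fields and framing.

```python
# Illustration of the dirfile layout (one flat binary file per field);
# real access should go through the GetData library.
import os
import numpy as np

dirfile = "example_dirfile"
os.makedirs(dirfile, exist_ok=True)

# One RAW field: 1000 float64 samples in a file named after the field.
detector = np.sin(np.linspace(0, 10, 1000))
detector.tofile(os.path.join(dirfile, "detector"))

# A quicklook reader only needs the sample type to stream the column back.
readback = np.fromfile(os.path.join(dirfile, "detector"), dtype=np.float64)
print(readback.shape, readback[:3])
```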
Application of XML to Journal Table Archiving
NASA Astrophysics Data System (ADS)
Shaya, E. J.; Blackwell, J. H.; Gass, J. E.; Kargatis, V. E.; Schneider, G. L.; Weiland, J. L.; Borne, K. D.; White, R. A.; Cheung, C. Y.
1998-12-01
The Astronomical Data Center (ADC) at the NASA Goddard Space Flight Center is a major archive for machine-readable astronomical data tables. Many ADC tables are derived from published journal articles. Article tables are reformatted to be machine-readable and documentation is crafted to facilitate proper reuse by researchers. The recent switch of journals to web-based electronic format has resulted in the generation of large amounts of tabular data that could be captured into machine-readable archive format at fairly low cost. The large data flow of the tables from all major North American astronomical journals (a factor of 100 greater than the present rate at the ADC) necessitates the development of rigorous standards for the exchange of data between researchers, publishers, and the archives. We have selected a suitable markup language that can fully describe the large variety of astronomical information contained in ADC tables. The eXtensible Markup Language (XML) is a powerful internet-ready documentation format for data. It provides a precise and clear data description language that is both machine- and human-readable. It is rapidly becoming the standard format for business and information transactions on the internet and it is an ideal common metadata exchange format. By labelling, or "marking up", all elements of the information content, documents are created that computers can easily parse. An XML archive can easily and automatically be maintained, ingested into standard databases or custom software, and even totally restructured whenever necessary. Structuring astronomical data into XML format will enable efficient and focused search capabilities via off-the-shelf software. The ADC is investigating XML's expanded hyperlinking power to enhance connectivity within the ADC data/metadata and developing XSL display scripts to enhance display of astronomical data. The ADC XML Document Type Definition can be viewed at http://messier.gsfc.nasa.gov/dtdhtml/DTD-TREE.html
MXA: a customizable HDF5-based data format for multi-dimensional data sets
NASA Astrophysics Data System (ADS)
Jackson, M.; Simmons, J. P.; De Graef, M.
2010-09-01
A new digital file format is proposed for the long-term archival storage of experimental data sets generated by serial sectioning instruments. The format is known as the multi-dimensional eXtensible Archive (MXA) format and is based on the public domain Hierarchical Data Format (HDF5). The MXA data model and its description by means of an eXtensible Markup Language (XML) file with an associated Document Type Definition (DTD) are described in detail. The public domain MXA package is available through a dedicated web site (mxa.web.cmu.edu), along with implementation details and example data files.
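The kind of hierarchical layout such an HDF5-based archive implies can be sketched with h5py; the group and attribute names below are illustrative and do not follow the actual MXA data model.

```python
# Illustrative HDF5 layout for serial-section data (not the real MXA model).
import h5py
import numpy as np

with h5py.File("serial_sections.example.h5", "w") as f:
    f.attrs["format"] = "illustrative-archive"
    data = f.create_group("data")
    for z in range(3):                             # one group per section
        section = data.create_group(f"section_{z:03d}")
        section.create_dataset("image", data=np.zeros((512, 512), dtype=np.uint16))
        section.attrs["z_position_um"] = 0.5 * z
    meta = f.create_group("metadata")
    meta.attrs["instrument"] = "serial sectioning microscope"
```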
Open Babel: An open chemical toolbox
2011-01-01
Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300
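Format interconversion of the kind described above can be driven from Open Babel's Python bindings; the snippet assumes the `openbabel` package (which provides the `pybel` module) is installed, and that the InChI plugin is present for the second conversion.

```python
# Format interconversion with Open Babel's pybel module.
from openbabel import pybel

# Read a SMILES string and write it back out in other formats.
mol = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")   # aspirin
mol.title = "aspirin"
print(mol.write("can").strip())      # canonical SMILES
print(mol.write("inchi").strip())    # InChI (if the plugin is available)

# Batch conversion, e.g. a SMILES library to SDF with generated 3D coordinates:
# out = pybel.Outputfile("sdf", "library.sdf")
# for m in pybel.readfile("smi", "library.smi"):
#     m.make3D()
#     out.write(m)
# out.close()
```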
Networking observers and observatories with remote telescope markup language
NASA Astrophysics Data System (ADS)
Hessman, Frederic V.; Tuparev, Georg; Allan, Alasdair
2006-06-01
Remote Telescope Markup Language (RTML) is an XML-based protocol for the transport of the high-level description of a set of observations to be carried out on a remote, robotic or service telescope. We describe how RTML is being used in a wide variety of contexts: the transport of service and robotic observing requests in the Hands-On Universe™, ACP, eSTAR, and MONET networks; how RTML is easily combined with other XML protocols for more localized control of telescopes; RTML as a secondary observation report format for the IVOA's VOEvent protocol; the input format for a general-purpose observation simulator; and the observatory-independent means for carrying out request transactions for the international Heterogeneous Telescope Network (HTN).
The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem.
Phadungsukanan, Weerapong; Kraft, Markus; Townsend, Joe A; Murray-Rust, Peter
2012-08-07
This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single-point energy calculations, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications.