Science.gov

Sample records for ada source code

  1. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs) as well as the automatic generation of Ada source code is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a) from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  2. STGT program: Ada coding and architecture lessons learned

    NASA Technical Reports Server (NTRS)

    Usavage, Paul; Nagurney, Don

    1992-01-01

    STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.

  3. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  4. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system, including equations for the limitations imposed on the system by the coded image requirements and by the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects, followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  5. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system, including equations for the limitations imposed on the system by the coded image requirements and by the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects, followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
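
The final step described above, obtaining the MTF from the simulated line spread function, can be sketched independently of McStas. A minimal illustration in Python, assuming a Gaussian LSF as a stand-in for the simulated edge data (not the paper's actual output):

```python
import math
import cmath

def mtf_from_lsf(lsf):
    """MTF as the magnitude of the discrete Fourier transform of the
    line spread function, normalized so the zero-frequency value is 1."""
    n = len(lsf)
    total = sum(lsf)
    norm = [v / total for v in lsf]        # area-normalize: MTF(0) = 1
    mtf = []
    for k in range(n // 2 + 1):            # non-negative frequencies only
        coeff = sum(norm[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                    for j in range(n))
        mtf.append(abs(coeff))
    return mtf

# Illustrative Gaussian LSF (sigma = 10, units of um, 1 um sample pitch).
# A narrower LSF (better resolution) keeps the MTF high out to larger
# spatial frequencies.
lsf = [math.exp(-((x - 100) ** 2) / (2 * 10.0 ** 2)) for x in range(201)]
mtf = mtf_from_lsf(lsf)
```

Reading resolution off the MTF curve in this way is what lets the simulated tilted-edge results be compared directly against the 10 μm and 100 μm hole diameters.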

  6. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.
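
The two steps described above, translating a rule into a code module and then Monte Carlo testing the result against a constraint-based performance description, can be illustrated in miniature. Python stands in for the generated Ada, and the rule, field names, and thresholds below are hypothetical:

```python
import random

# A hypothetical rule in "IF conditions THEN action" form, roughly as a
# translator might see it. (The real tool emitted Ada modules linked with
# an Activation Framework run-time; Python stands in purely to illustrate.)
rule = {"if": [("temperature", ">", 100.0), ("pressure", ">", 50.0)],
        "then": ("alarm", True)}

OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

def translate(rule):
    """Translation step: turn one rule into a callable 'code module'."""
    def module(facts):
        if all(OPS[op](facts[name], val) for name, op, val in rule["if"]):
            key, value = rule["then"]
            facts[key] = value
        return facts
    return module

def monte_carlo_test(module, trials=1000):
    """Validation step: over randomly drawn inputs, the alarm must fire
    exactly when both input constraints hold."""
    for _ in range(trials):
        facts = {"temperature": random.uniform(0.0, 200.0),
                 "pressure": random.uniform(0.0, 100.0),
                 "alarm": False}
        expected = facts["temperature"] > 100.0 and facts["pressure"] > 50.0
        if module(dict(facts))["alarm"] != expected:
            return False
    return True
```

The constraint description here is simply the re-stated rule condition; in the program described above it was an independent specification of required behavior, which is what makes the random testing meaningful.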

  7. ART-Ada design project, phase 2

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    Interest in deploying expert systems in Ada has increased. An Ada based expert system tool is described called ART-Ada, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  8. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
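
As a small, concrete illustration of the task (not one of the specific methods the dissertation reviews), many attribution approaches compare character n-gram frequency profiles of source files; the author corpora below are invented for the example:

```python
from collections import Counter
import math

def ngram_profile(source, n=3):
    """Character n-gram frequency profile of a source file's text."""
    return Counter(source[i:i + n] for i in range(len(source) - n + 1))

def cosine(p, q):
    """Cosine similarity between two frequency profiles."""
    dot = sum(p[g] * q[g] for g in p if g in q)
    return dot / (math.sqrt(sum(v * v for v in p.values())) *
                  math.sqrt(sum(v * v for v in q.values())))

def attribute(unknown, candidates, n=3):
    """Assign the unknown source to the candidate author whose known
    code has the most similar n-gram profile."""
    u = ngram_profile(unknown, n)
    return max(candidates,
               key=lambda a: cosine(u, ngram_profile(candidates[a], n)))

# Invented candidate corpora with distinct styles (loops vs. while/semicolons).
candidates = {
    "alice": "for i in range(10):\n    total += i\n",
    "bob":   "i=0\nwhile i<10: total=total+i; i=i+1\n",
}
unknown = "for j in range(5):\n    count += j\n"
```

Real methods reviewed in this literature use far richer features (layout, identifiers, syntax), but the profile-and-compare skeleton is the same.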

  9. Software issues involved in code translation of C to Ada programs

    NASA Technical Reports Server (NTRS)

    Hooi, Robert; Giarratano, Joseph

    1986-01-01

    It is often thought that translation of one programming language to another is a simple solution that can be used to extend the software life span or in rehosting software to another environment. The possible problems are examined as are the advantages and disadvantages of direct machine or human code translation versus that of redesign and rewrite of the software. The translation of the expert system language called C Language Integrated Production System (CLIPS) which is written in C, to Ada, will be used as a case study of the problems that are encountered.

  10. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components make up SMART-Ada (Systems Fault Management with ART-Ada): the rule-based expert system, a graphics user interface, and communications software. The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  11. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1992-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface, including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  12. The development of a program analysis environment for Ada: Reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1991-01-01

    The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) has successfully created and prototyped a new algorithm level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSD from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.

  13. Generic Ada code in the NASA space station command, control and communications environment

    NASA Technical Reports Server (NTRS)

    Mcdougall, D. P.; Vollman, T. E.

    1986-01-01

    The results of efforts to apply powerful Ada constructs to the formatted message handling process are described. The goal of these efforts was to extend the state of technology in message handling while at the same time producing production-quality, reusable code. The first effort was initiated in September 1984 and delivered in April 1985. That product, the Generic Message Handling Facility, met initial goals, was reused, and is available in the Ada Repository on ARPANET. However, it became apparent during its development that the initial approach to building a message handler template was not optimal. As a result of this initial effort, several alternate approaches were identified, and research is now ongoing to identify an improved product. The ultimate goal is to be able to instantly build a message handling system for any message format given a specification of that message format. The problem lies in how to specify the message format and, once that is done, how to use that information to build the message handler. Message handling systems and message types are described. The initial effort, its results, and its shortcomings are detailed. The approach now being taken to build a system which will be significantly easier to implement, and once implemented, easier to use, is described. Finally, conclusions are offered.
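
One minimal answer to the question of how to specify a message format is a flat field table from which pack/unpack routines are generated, loosely analogous to instantiating a generic Ada package per format. The field names and layout below are hypothetical, not taken from the Generic Message Handling Facility:

```python
import struct

# Hypothetical format specification: ordered (field name, struct code) pairs.
# "H" = unsigned 16-bit, "I" = unsigned 32-bit; ">" = network byte order.
SPEC = [("msg_id", "H"), ("length", "H"), ("payload_crc", "I")]

def make_handler(spec):
    """Build a pack/unpack pair from the specification alone, so that a
    new fixed-layout message handler is 'instantiated' per format."""
    fmt = ">" + "".join(code for _, code in spec)
    names = [name for name, _ in spec]
    def pack(fields):
        return struct.pack(fmt, *(fields[n] for n in names))
    def unpack(data):
        return dict(zip(names, struct.unpack(fmt, data)))
    return pack, unpack

pack, unpack = make_handler(SPEC)
wire = pack({"msg_id": 7, "length": 12, "payload_crc": 0xDEADBEEF})
```

Variable-length and nested formats are exactly where such a flat table breaks down, which is one reason the initial template approach described above proved not to be optimal.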

  14. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  15. Classic-Ada(TM)

    NASA Technical Reports Server (NTRS)

    Valley, Lois

    1989-01-01

    The SPS product, Classic-Ada, is a software tool that supports object-oriented Ada programming with powerful inheritance and dynamic binding. Object Oriented Design (OOD) is an easy, natural development paradigm, but it is not supported by Ada. Following the DOD Ada mandate, SPS developed Classic-Ada to provide a tool which supports OOD and implements code in Ada. It consists of a design language, a code generator, and a toolset. As a design language, Classic-Ada supports the object-oriented principles of information hiding, data abstraction, dynamic binding, and inheritance. It also supports natural reuse and incremental development through inheritance and code factoring, and allows Ada and Classic-Ada, dynamic binding and static binding, to be mixed in the same program. Only nine new constructs were added to Ada to provide object-oriented design capabilities. The Classic-Ada code generator translates user application code into fully compliant, ready-to-run, standard Ada. The Classic-Ada toolset is fully supported by SPS and consists of an object generator, a builder, a dictionary manager, and a reporter. Demonstrations of Classic-Ada and the Classic-Ada Browser were given at the workshop.

  16. C Language Integrated Production System, Ada Version

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.; Melebeck, Clovis J.; White, Wesley A.; Mcgregor, Terry L.; Ferguson, Melisa; Razavipour, Reza

    1992-01-01

    CLIPS/Ada provides capabilities of CLIPS v4.3 but uses Ada as source language for CLIPS executable code. Implements forward-chaining rule-based language. Program contains inference engine and language syntax providing framework for construction of expert-system program. Also includes features for debugging application program. Based on Rete algorithm which provides efficient method for performing repeated matching of patterns. Written in Ada.
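
The forward-chaining behavior of such a rule language can be sketched with a naive match-fire loop. Note that this deliberately omits what the Rete algorithm contributes: caching of partial matches so that rules are not re-matched against all of working memory on every cycle. The facts and rules are invented for the example:

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly fire any rule whose conditions
    are all in working memory, until no new fact can be derived.
    (Rete avoids this full re-match by maintaining a network of
    partial-match caches; only the inference behavior is shown here.)"""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if set(conditions) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Each rule is (conditions, conclusion).
rules = [(("duck",), "quacks"),
         (("quacks", "feathers"), "bird")]
derived = forward_chain({"duck", "feathers"}, rules)
```

Chaining matters here: the second rule can only fire after the first has added "quacks" to working memory.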

  17. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1993-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty printed Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Window System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update '92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update '93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  18. GRASP/Ada 95: Reverse Engineering Tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1996-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped an algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD), and a new visualization for a fine-grained complexity metric called the Complexity Profile Graph (CPG). By synchronizing the CSD and the CPG, the CSD view of control structure, nesting, and source code is directly linked to the corresponding visualization of statement level complexity in the CPG. GRASP has been integrated with GNAT, the GNU Ada 95 Translator, to provide a comprehensive graphical user interface and development environment for Ada 95. The user may view, edit, print, and compile source code as a CSD with no discernible addition to storage or computational overhead. The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada 95 source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. The current update has focused on the design and implementation of a new Motif compliant user interface, and a new CSD generator consisting of a tagger and renderer. The Complexity Profile Graph (CPG) is based on a set of functions that describes the context, content, and the scaling for complexity on a statement by statement basis. When combined graphically, the result is a composite profile of complexity for the program unit. Ongoing research includes the development and refinement of the associated functions, and the development of the CPG generator prototype. The current Version 5.0 prototype provides the capability for the user to generate CSDs and CPGs from Ada 95 source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  19. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome source coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome source coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome source coding is formulated which provides robustly effective, distortionless coding of source ensembles.
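
A minimal worked instance of the idea, using the (7,4) Hamming code: a 7-bit source block is treated as an error pattern, its 3-bit syndrome is the compressed data, and the decoder returns the minimum-weight pattern (the coset leader) with that syndrome. Recovery is exact whenever the block is sparse enough to be its own coset leader, which is the low-entropy case the abstract describes; the exhaustive search below is purely illustrative, not the paper's construction:

```python
from itertools import product

# Parity-check matrix of the (7,4) Hamming code, rows over GF(2).
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(block):
    """Compress: the 3-bit syndrome of a 7-bit source block."""
    return tuple(sum(h * b for h, b in zip(row, block)) % 2 for row in H)

# Coset leaders: for each syndrome, the minimum-weight 7-bit pattern
# producing it (patterns enumerated in order of increasing weight, so
# setdefault keeps the lightest one).
leaders = {}
for pattern in sorted(product([0, 1], repeat=7), key=sum):
    leaders.setdefault(syndrome(pattern), list(pattern))

def decompress(s):
    """Decompress: return the coset leader for syndrome s."""
    return leaders[tuple(s)]

# A sparse block (weight 1) compresses from 7 bits to 3 and recovers exactly.
block = [0, 0, 0, 0, 1, 0, 0]
assert decompress(syndrome(block)) == block
```

Because the Hamming code corrects any single error, every block of weight 0 or 1 is its own coset leader; denser blocks would incur distortion, which is why the achievable rate tracks the source entropy.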

  20. GENERAL PURPOSE ADA PACKAGES

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    Ten families of subprograms are bundled together for the General-Purpose Ada Packages. The families bring to Ada many features from HAL/S, PL/I, FORTRAN, and other languages. These families are: string subprograms (INDEX, TRIM, LOAD, etc.); scalar subprograms (MAX, MIN, REM, etc.); array subprograms (MAX, MIN, PROD, SUM, GET, and PUT); numerical subprograms (EXP, CUBIC, etc.); service subprograms (DATE_TIME function, etc.); Linear Algebra II; Runge-Kutta integrators; and three text I/O families of packages. In two cases, a family consists of a single non-generic package. In all other cases, a family comprises a generic package and its instances for a selected group of scalar types. All generic packages are designed to be easily instantiated for the types declared in the user facility. The linear algebra package is LINRAG2. This package includes subprograms supplementing those in NPO-17985, An Ada Linear Algebra Package Modeled After HAL/S (LINRAG). Please note that LINRAG2 cannot be compiled without LINRAG. Most packages have widespread applicability, although some are oriented for avionics applications. All are designed to facilitate writing new software in Ada. Several of the packages use conventions introduced by other programming languages. A package of string subprograms is based on HAL/S (a language designed for the avionics software in the Space Shuttle) and PL/I. Packages of scalar and array subprograms are taken from HAL/S or generalized current Ada subprograms. A package of Runge-Kutta integrators is patterned after a built-in MAC (MIT Algebraic Compiler) integrator. Those packages modeled after HAL/S make it easy to translate existing HAL/S software to Ada. The General-Purpose Ada Packages program source code is available on two 360K 5.25" MS-DOS format diskettes. The software was developed using VAX Ada v1.5 under DEC VMS v4.5. It should be portable to any validated Ada compiler and it should execute either interactively or in batch. 
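
As an illustration of what one member of the Runge-Kutta family provides, here is the textbook classical fourth-order step (this shows the standard method only; it is not the package's actual Ada interface):

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate y' = y from y(0) = 1 to t = 1 in ten steps; the result
# approximates e with fourth-order accuracy.
t, y, h = 0.0, 1.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, h)
    t += h
```

A generic Ada version would be instantiated for the scalar or vector state type in use, in the same style as the package's other generics.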

  1. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  2. The development of a program analysis environment for Ada: Reverse engineering tools for Ada. Final Report, 1 Jun. 1990 - 30 Sep. 1991

    SciTech Connect

    Cross, J.H. II.

    1991-09-01

    The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) has successfully created and prototyped a new algorithm level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSD from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.

  3. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now contains over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) added an average of 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL, examines its current state and benefits and the means of and requirements for including codes, and outlines its future plans.

  4. Ada/POSIX binding: A focused Ada investigation

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1988-01-01

    NASA is seeking an operating system interface definition (OSID) for the Space Station Program (SSP) in order to take advantage of the commercial off-the-shelf (COTS) products available today and the many that are expected in the future. NASA would also like to avoid reliance on any one source for operating systems, information systems, communication systems, or instruction set architectures. The use of the Portable Operating System Interface for Computer Environments (POSIX) is examined as a possible solution to this problem. Since Ada is already the language of choice for SSP, the question of an Ada/POSIX binding is addressed. The intent of the binding is to provide access to the POSIX standard operating system (OS) interface and environment, by which portability of Ada applications will be supported at the source code level. A guiding principle of Ada/POSIX binding development is clear conformance of the Ada interface with the functional definition of POSIX. The interface is intended to be used by both application developers and system implementors. The objective is to provide a standard that allows a strictly conforming application source program to be compiled and executed on any conforming implementation. Special emphasis is placed on first providing those functions and facilities that are needed in a wide variety of commercial applications.

  5. An Ada programming support environment

    NASA Technical Reports Server (NTRS)

    Tyrrill, AL; Chan, A. David

    1986-01-01

    The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example a pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit-level testing. A Code Auditor/Static Analyzer permits Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.

  6. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
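    As a rough illustration of the DPCM family the abstract builds on (a sketch, not the authors' edge-preserving algorithm), a single quantizer step size lets one encoder operate in both lossless and lossy modes:

```python
def dpcm_encode(samples, step=1):
    """DPCM: transmit quantized residuals from a previous-sample predictor.
    step=1 is lossless for integer samples; larger steps trade error for rate."""
    pred, residuals = 0, []
    for x in samples:
        q = round((x - pred) / step)   # quantized prediction residual
        residuals.append(q)
        pred += q * step               # track the decoder's reconstruction
    return residuals

def dpcm_decode(residuals, step=1):
    pred, out = 0, []
    for q in residuals:
        pred += q * step
        out.append(pred)
    return out

pixels = [10, 12, 13, 40, 41, 39]
assert dpcm_decode(dpcm_encode(pixels)) == pixels            # lossless mode
lossy = dpcm_decode(dpcm_encode(pixels, step=4), step=4)
assert all(abs(a - b) <= 2 for a, b in zip(lossy, pixels))   # error <= step/2
```

    Because the encoder predicts from the decoder's reconstruction rather than from the original samples, quantization error does not accumulate along the scan.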

  7. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  8. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  9. Astrophysics Source Code Library: Incite to Cite!

    NASA Astrophysics Data System (ADS)

    DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallin, J. F.

    2014-05-01

    The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.

  10. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach, however, requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
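    The energy-compaction principle on which any transform code, including SP-DTC, relies can be seen in a tiny single-terminal sketch (illustrative only; not the authors' two-terminal construction): an orthogonal transform applied to a strongly correlated pair concentrates nearly all of the variance in one coefficient, so the other can be quantized very coarsely.

```python
import math
import random

def klt2(x, y):
    """Orthogonal 45-degree rotation: the KLT of a symmetric correlated pair."""
    s = math.sqrt(2.0)
    return (x + y) / s, (x - y) / s

random.seed(0)
pairs = []
for _ in range(5000):
    x = random.gauss(0, 1)
    pairs.append((x, x + random.gauss(0, 0.1)))   # strongly correlated pair

coeffs = [klt2(x, y) for x, y in pairs]
var0 = sum(c[0] ** 2 for c in coeffs) / len(coeffs)
var1 = sum(c[1] ** 2 for c in coeffs) / len(coeffs)
assert var0 > 50 * var1   # nearly all energy lands in the first coefficient
```

    The bit-allocation problem the abstract mentions is exactly the question of how to split a rate budget across such unequal-variance coefficients.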

  11. Managing Ada development

    NASA Technical Reports Server (NTRS)

    Green, James R.

    1986-01-01

    The Ada programming language was developed under the sponsorship of the Department of Defense to address the soaring costs associated with software development and maintenance. Ada is powerful, yet to take full advantage of its power it is sufficiently complex and different from current programming approaches that there is considerable risk in committing a program to be done in Ada. There are also few programs of any substantial size that have been implemented in Ada and that may be studied to determine the management methods that result in a successful Ada project. The items presented are the author's opinions, formed as a result of firsthand software development experience. The difficulties faced, risks assumed, management methods applied, lessons learned, and, most importantly, the techniques that were successful are all valuable sources of management information for those managers ready to assume major Ada development projects.

  12. Astrophysics Source Code Library -- Now even better!

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2015-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!

  13. Using the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.

    2013-01-01

    The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.

  14. GRASP/Ada: Graphical Representations of Algorithms, Structures, and Processes for Ada. The development of a program analysis environment for Ada: Reverse engineering tools for Ada, task 2, phase 3

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1991-01-01

    The main objective is the investigation, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada). The presented task, in which various graphical representations that can be extracted or generated from source code are described and categorized, is focused on reverse engineering. The following subject areas are covered: the system model; control structure diagram generator; object oriented design diagram generator; user interface; and the GRASP library.

  15. Iterative Reconstruction of Coded Source Neutron Radiographs

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2012-01-01

    Use of a coded source facilitates high-resolution neutron imaging but requires that the radiographic data be deconvolved. In this paper, we compare direct deconvolution with two different iterative algorithms, namely, one based on direct deconvolution embedded in an MLE-like framework and one based on a geometric model of the neutron beam and a least squares formulation of the inverse imaging problem.

  16. Documentation generator application for VHDL source codes

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    The UML, which is a complex system modeling and description technology, has recently been expanding its uses in the field of formalization and algorithmic approaches to systems such as multiprocessor photonic, optoelectronic, and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes the realization of an application for documenting VHDL source codes. A novel solution is presented, based on the Doxygen program, which is available under a free license with accessible source code. Bison and Flex were used as supporting tools for parser building. Practical results of the documentation generator are presented; the program was applied to exemplary VHDL codes. The documentation generator application is used in the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts, which describe the following components of the documentation generator for photonic and electronic systems: concept, MatLab application, and VHDL application. This is part three, which describes the VHDL application. VHDL is used for behavioral description of the optoelectronic system.

  17. Iterative Reconstruction of Coded Source Neutron Radiographs

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2013-01-01

    Use of a coded source facilitates high-resolution neutron imaging through magnifications but requires that the radiographic data be deconvolved. A comparison of direct deconvolution with two different iterative algorithms has been performed. One iterative algorithm is based on a maximum likelihood estimation (MLE)-like framework and the second is based on a geometric model of the neutron beam within a least squares formulation of the inverse imaging problem. Simulated data for both uniform and Gaussian shaped source distributions was used for testing to understand the impact of non-uniformities present in neutron beam distributions on the reconstructed images. Results indicate that the model based reconstruction method will match resolution and improve on contrast over convolution methods in the presence of non-uniform sources. Additionally, the model based iterative algorithm provides direct calculation of quantitative transmission values while the convolution based methods must be normalized based on known values.
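    As a toy illustration of the least-squares iterative approach (a 1-D sketch under an assumed mask and image, not the authors' beam model), a Landweber-style iteration deconvolves data formed by a cyclic coded-source mask:

```python
def convolve(mask, img):
    """Cyclic convolution: toy 1-D forward model of a coded-source radiograph."""
    n = len(img)
    return [sum(mask[k] * img[(i - k) % n] for k in range(n)) for i in range(n)]

def correlate(mask, vec):
    """Adjoint of the convolution operator (cross-correlation with the mask)."""
    n = len(vec)
    return [sum(mask[k] * vec[(i + k) % n] for k in range(n)) for i in range(n)]

def landweber(mask, data, iters=2000, lam=0.05):
    """Least-squares iteration x <- x + lam * A^T (y - A x), A = convolution."""
    x = [0.0] * len(data)
    for _ in range(iters):
        resid = [y - z for y, z in zip(data, convolve(mask, x))]
        x = [xi + lam * g for xi, g in zip(x, correlate(mask, resid))]
    return x

mask = [1, 1, 0, 1, 0, 0, 0]          # hypothetical open/closed mask pattern
truth = [0.0, 0.0, 3.0, 0.0, 0.0, 1.0, 0.0]
recon = landweber(mask, convolve(mask, truth))
assert all(abs(r - t) < 1e-3 for r, t in zip(recon, truth))
```

    Replacing the idealized cyclic operator with a measured or modeled beam kernel is what distinguishes the model-based method described above from plain deconvolution.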

  18. GRASP/Ada (Graphical Representations of Algorithms, Structures, and Processes for Ada): The development of a program analysis environment for Ada. Reverse engineering tools for Ada, task 1, phase 2

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1990-01-01

    The study, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada) are discussed in this second phase report of a three phase effort. Various graphical representations that can be extracted or generated from source code are described and categorized with focus on reverse engineering. The overall goal is to provide the foundation for a CASE (computer-aided software engineering) environment in which reverse engineering and forward engineering (development) are tightly coupled. Emphasis is on a subset of architectural diagrams that can be generated automatically from source code, with the control structure diagram (CSD) included for completeness.

  19. A proposed classification scheme for Ada-based software products

    NASA Technical Reports Server (NTRS)

    Cernosek, Gary J.

    1986-01-01

    As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality to result in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose this potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. Exact criteria for each class are not presented, nor are detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced and a set of requirements from which to base further research and development is suggested.

  20. TRW’s Ada Process Model for Incremental Development of Large Software Systems

    DTIC Science & Technology

    1990-01-01

    TRW’s Ada Process Model has proven to be key to the Command Center Processing and Display System-Replacement (CCPDS-R) project’s success to date in...developing over 300,000 lines of Ada source code executing in a distributed VAX VMS environment. The Ada Process Model is, in simplest terms, a...software progress metrics. This paper provides an overview of the techniques and benefits of the Ada Process Model and describes some of the experience and

  1. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
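    A minimal sketch of the idea (using a Hamming (7,4) code; illustrative, not from the paper): the 7-bit source block is treated as an error pattern, its 3-bit syndrome is the compressed data, and the decompressor returns the coset leader, which is exact whenever the block holds at most one 1, as is likely for a low-entropy binary source.

```python
# Parity-check matrix of the Hamming (7,4) code: column j is binary j+1
H = [[((j + 1) >> i) & 1 for j in range(7)] for i in range(3)]

def compress(block):
    """Syndrome s = H e^T (mod 2): 7 source bits -> 3 compressed bits."""
    return [sum(h * b for h, b in zip(row, block)) % 2 for row in H]

def decompress(syndrome):
    """Coset leader: the minimum-weight pattern with the given syndrome."""
    pos = sum(bit << i for i, bit in enumerate(syndrome))  # syndrome as integer
    block = [0] * 7
    if pos:
        block[pos - 1] = 1
    return block

for p in range(7):                     # every weight-1 pattern survives
    block = [0] * 7
    block[p] = 1
    assert decompress(compress(block)) == block
assert decompress(compress([0] * 7)) == [0] * 7
```

    Denser blocks are distorted, which is why the paper's rate/distortion trade-off is stated in terms of the source entropy.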

  2. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  3. CLIPS/Ada: An Ada-based tool for building expert systems

    NASA Technical Reports Server (NTRS)

    White, W. A.

    1990-01-01

    CLIPS/Ada is a production system language and a development environment. It is functionally equivalent to the CLIPS tool. CLIPS/Ada was developed in order to provide a means of incorporating expert system technology into projects where the use of the Ada language had been mandated. A secondary purpose was to glean information about the Ada language and its compilers; specifically, whether or not the language and compilers were mature enough to support AI applications. The CLIPS/Ada tool is coded entirely in Ada and is designed to be used by Ada systems that require expert reasoning.

  4. Ada software productivity prototypes: A case study

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan

    1988-01-01

    A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study was collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivities between subtasks. Factors considered are percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use, Ada is just a large, complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.

  5. Ada style guide (version 1.1)

    NASA Technical Reports Server (NTRS)

    Seidewitz, Edwin V.; Agresti, William; Ferry, Daniel; Lavallee, David; Maresca, Paul; Nelson, Robert; Quimby, Kelvin; Rosenberg, Jacob; Roy, Daniel; Shell, Allyn

    1987-01-01

    Ada is a programming language of considerable expressive power. The Ada Language Reference Manual provides a thorough definition of the language. However, it does not offer sufficient guidance on the appropriate use of Ada's powerful features. For this reason, the Goddard Space Flight Center Ada User's Group has produced this style guide, which addresses such program style issues. The guide covers three areas of Ada program style: the structural decomposition of a program; the coding and the use of specific Ada features; and the textual formatting of a program.

  6. Ada COCOMO and the Ada Process Model

    DTIC Science & Technology

    1989-01-01

    language, the use of incremental development, and the use of the Ada process model capitalizing on the strengths of Ada to improve the efficiency of software...development. This paper presents the portions of the revised Ada COCOMO dealing with the effects of Ada and the Ada process model . The remainder of...this section of the paper discusses the objectives of Ada COCOMO. Section 2 describes the Ada Process Model and its overall effects on software

  7. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  8. Distributed source coding using chaos-based cryptosystem

    NASA Astrophysics Data System (ADS)

    Zhou, Junwei; Wong, Kwok-Wo; Chen, Jianyong

    2012-12-01

    A distributed source coding scheme is proposed by incorporating a chaos-based cryptosystem in the Slepian-Wolf coding. The punctured codeword generated by the chaos-based cryptosystem results in ambiguity at the decoder side. This ambiguity can be removed by the maximum a posteriori decoding with the help of side information. In this way, encryption and source coding are performed simultaneously. This leads to a simple encoder structure with low implementation complexity. Simulation results show that the encoder complexity is lower than that of existing distributed source coding schemes. Moreover, at small block size, the proposed scheme has a performance comparable to existing distributed source coding schemes.

  9. A Construction of Lossy Source Code Using LDPC Matrices

    NASA Astrophysics Data System (ADS)

    Miyake, Shigeki; Muramatsu, Jun

    Research into applying LDPC code theory, which is used for channel coding, to source coding has received a lot of attention in several research fields such as distributed source coding. In this paper, a source coding problem with a fidelity criterion is considered. Matsunaga et al. and Martinian et al. constructed a lossy code under the conditions of a binary alphabet, a uniform distribution, and a Hamming measure of fidelity criterion. We extend their results and construct a lossy code under the extended conditions of a binary alphabet, a distribution that is not necessarily uniform, and a fidelity measure that is bounded and additive, and show that the code can achieve the optimal rate, i.e., the rate-distortion function. By applying a formula for the random walk on a lattice to the analysis of LDPC matrices on Zq, where q is a prime number, we show that results similar to those for the binary alphabet condition hold for Zq, the multiple alphabet condition.
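    For reference, the optimum the construction is shown to achieve is the standard rate-distortion function; for a binary source with P(1) = p under the Hamming fidelity measure it takes the closed form

```latex
R(D) =
\begin{cases}
  h(p) - h(D), & 0 \le D \le \min(p,\,1-p),\\
  0,           & D > \min(p,\,1-p),
\end{cases}
\qquad
h(x) = -x \log_2 x - (1-x)\log_2(1-x).
```

    The non-uniform case (p not equal to 1/2) is exactly the extension beyond the uniform-distribution results of Matsunaga et al. and Martinian et al.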

  10. Initial Ada components evaluation

    NASA Technical Reports Server (NTRS)

    Moebes, Travis

    1989-01-01

    The SAIC has the responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and the total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or difficulty is greater than 190. The McCabe CCM indicated a high quality of software products.
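    The metrics named above are easy to state concretely. A sketch (with hypothetical token streams; formulas per the standard Halstead and McCabe definitions summarized in the abstract):

```python
import math

def halstead(operators, operands):
    """Halstead software-science metrics from the operator/operand token
    streams of one logical unit of code."""
    n1, n2 = len(set(operators)), len(set(operands))   # distinct counts
    N1, N2 = len(operators), len(operands)             # total occurrences
    length = N1 + N2
    vocabulary = n1 + n2
    volume = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"length": length, "volume": volume, "difficulty": difficulty}

def cyclomatic(decisions):
    """McCabe CCM for a single-entry, single-exit flow graph:
    linearly independent paths = decision points + 1 (equivalently E - N + 2)."""
    return decisions + 1

# tokens of a tiny unit:  X := A + B;  Y := A * B;
m = halstead(operators=[":=", "+", ";", ":=", "*", ";"],
             operands=["X", "A", "B", "Y", "A", "B"])
assert m["length"] == 12 and m["volume"] == 36.0 and m["difficulty"] == 3.0
assert cyclomatic(2) == 3   # e.g., a unit with two if-statements
```

    Against the thresholds quoted above, a unit would be flagged when its length metric exceeds 260 or its difficulty exceeds 190.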

  11. QUEST/Ada (Query Utility Environment for Software Testing of Ada): The development of a program analysis environment for Ada, task 1, phase 2

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1990-01-01

    The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.

  12. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  13. Software unit testing in Ada environment

    NASA Technical Reports Server (NTRS)

    Warnock, Glenn

    1986-01-01

    A validation procedure for the Ada binding of the Graphical Kernel System (GKS) is being developed. PRIOR Data Sciences is also producing a version of the GKS written in Ada. These major software engineering projects will provide an opportunity to demonstrate a sound approach for software testing in an Ada environment. The GKS/Ada validation capability will be a collection of test programs and data, and test management guidelines. These products will be used to assess the correctness, completeness, and efficiency of any GKS/Ada implementation. The GKS/Ada developers will be able to obtain the validation software for their own use. It is anticipated that this validation software will eventually be taken over by an independent standards body to provide objective assessments of GKS/Ada implementations, using an approach similar to the validation testing currently applied to Ada compilers. In the meantime, if requested, this validation software will be used to assess GKS/Ada products. The second project, implementation of GKS using the Ada language, is a conventional software engineering task. It represents a large body of Ada code and has some interesting testing problems associated with automatic testing of graphics routines. Here the normal test practices, which include automated regression testing, independent quality assurance, test configuration management, and the application of software quality metrics, will be employed. The software testing methods emphasize quality enhancement and automated procedures. Ada makes some aspects of testing easier and introduces some concerns. These issues are addressed.

  14. Ada Structure Design Language (ASDL)

    NASA Technical Reports Server (NTRS)

    Chedrawi, Lutfi

    1986-01-01

    An artist acquires all the necessary tools before painting a scene. By the same analogy, a software engineer needs the necessary tools to give a design the proper means for implementation. Ada provides these tools. Yet, just as an artist's painting needs an accompanying brochure to further explain the scene, an Ada design also needs a document along with it to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code in length. But in a large environment, ranging from thousands of lines upward, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called the Ada Structure Design Language (ASDL). This language sets rules to help derive a well formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer, and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.

  15. Ada issues in implementing ART-Ada

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1990-01-01

    Due to the Ada mandate of a number of government agencies, interest in deploying expert systems in Ada has increased. Recently, several Ada-based expert system tools have been developed. According to a recent benchmark report, these tools do not perform as well as similar tools written in C. While poorly implemented Ada compilers contribute to the poor benchmark results, some fundamental problems of the Ada language itself have been uncovered. Here, the authors describe Ada language issues encountered during the deployment of ART-Ada, an expert system tool for Ada deployment. ART-Ada is being used to implement several prototype expert systems for the Space Station Freedom and the U.S. Air Force.

  16. System Data Model (SDM) Source Code

    DTIC Science & Technology

    2012-08-23

    [The record's abstract is an excerpt of C source from a regular-expression matcher: the match() routine and its arguments (an eptr subject pointer, an ecode pointer to the current position in the compiled code, and an mstart pointer to the current match start position), plus the frame fields Xeptr, Xecode, Xmstart, and Xoffset_top saved on a repeated call or at the recursion limit.]

  17. Distributed joint source-channel coding in wireless sensor networks.

    PubMed

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Considering that sensors are energy-limited, and considering the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with a high compression ratio and noise resistance. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels, and broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency.

  18. MATLAB tensor classes for fast algorithm prototyping : source code.

    SciTech Connect

    Bader, Brett William; Kolda, Tamara Gibson

    2004-10-01

    We present the source code for three MATLAB classes for manipulating tensors in order to allow fast algorithm prototyping. A tensor is a multidimensional or N-way array. This is a supplementary report; details on using this code are provided separately in SAND-XXXX.

  19. ASPT software source code: ASPT signal excision software package

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh

    1992-08-01

    The source code for the ASPT Signal Excision Software Package which is part of the Adaptive Signal Processing Testbed (ASPT) is presented. The source code covers the programs 'excision', 'ab.out', 'd0.out', 'bd1.out', 'develop', 'despread', 'sorting', and 'convert'. These programs are concerned with collecting data, filtering out interference from a spread spectrum signal, analyzing the results, and developing and testing new filtering algorithms.

  20. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. SAP scans FORTRAN source code and produces reports that present statistics and measures of the statements and structures that make up a module. The processing performed by SAP and the routines, COMMON blocks, and files used by SAP are described. The system generation procedure for SAP is also presented.
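
    The kind of per-module statistics gathering SAP performs can be illustrated with a toy sketch. This is a hypothetical, heavily simplified stand-in: the keyword tuple below plays the role of SAP's external keyword file, and real FORTRAN classification is far more involved.

```python
# Toy SAP-style statement classifier for fixed-form FORTRAN source.
# NON_EXECUTABLE is an illustrative stand-in for SAP's keyword file,
# which marks statements as executable or non-executable.
NON_EXECUTABLE = ("PROGRAM", "SUBROUTINE", "FUNCTION", "INTEGER", "REAL",
                  "DIMENSION", "COMMON", "END")

def statement_stats(source):
    """Count comment, executable, and non-executable lines of one module."""
    stats = {"executable": 0, "non_executable": 0, "comment": 0}
    for line in source.splitlines():
        if not line.strip():
            continue                              # skip blank lines
        if line[:1].upper() in ("C", "*"):        # fixed-form comment marker
            stats["comment"] += 1
        elif line.strip().upper().startswith(NON_EXECUTABLE):
            stats["non_executable"] += 1
        else:
            stats["executable"] += 1
    return stats
```

    A figure of complexity in SAP's spirit could then be formed as a weighted sum over these counts, with the weights read from a user-supplied file.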

  1. Mining Program Source Code for Improving Software Quality

    DTIC Science & Technology

    2013-01-01

    Mining Program Source Code for Improving Software Quality. [The record's abstract survives only as fragments of a standard report form:] While the last decade has witnessed great... businesses, governments, and societies, improving software productivity and quality is an important goal of software engineering. Mining software...

  2. Ada task scheduling: A focused Ada investigation

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1988-01-01

    The types of control that are important for real-time task scheduling are discussed. Some closely related real-time issues are mentioned, and major committee and research activities in this area are delineated. Although there are some problems with Ada and its real-time task scheduling, Ada presents fewer problems than any known alternative. Ada was designed for the domain of real-time embedded systems, but Ada compilers may not contain a level of task-scheduling support that is adequate for all real-time applications. The question addressed is which implementations of Ada's task scheduling are adequate for effective real-time systems for NASA applications.

  3. Transforming AdaPT to Ada

    NASA Technical Reports Server (NTRS)

    Goldsack, Stephen J.; Holzbach-Valero, A. A.; Waldrop, Raymond S.; Volz, Richard A.

    1991-01-01

    This paper describes how the main features of the proposed Ada language extensions intended to support distribution, offered as possible solutions for Ada 9X, can be implemented by transformation into standard Ada 83. We start by summarizing the features proposed in the paper (Gargaro et al., 1990) that constitutes the definition of the extensions. For convenience we have called the language in its modified form AdaPT, which might be interpreted as Ada with partitions. These features were carefully chosen to provide support for the construction of executable modules for execution in nodes of a network of loosely coupled computers, flexibly configurable for different network architectures, for recovery following failure, and for adapting to mode changes. The intention in their design was to provide extensions which would not adversely impact the normal use of Ada and would fit well in style and feel with the existing standard. We begin by summarizing the features introduced in AdaPT.

  4. Toward the Automated Generation of Components from Existing Source Code

    SciTech Connect

    Quinlan, D; Yi, Q; Kumfert, G; Epperly, T; Dahlgren, T; Schordan, M; White, B

    2004-12-02

    A major challenge to achieving widespread use of software component technology in scientific computing is an effective migration strategy for existing, or legacy, source code. This paper describes initial work and challenges in automating the identification and generation of components using the ROSE compiler infrastructure and the Babel language interoperability tool. Babel enables calling interfaces expressed in the Scientific Interface Definition Language (SIDL) to be implemented in, and called from, an arbitrary combination of supported languages. ROSE is used to build specialized source-to-source translators that (1) extract a SIDL interface specification from information implicit in existing C++ source code and (2) transform Babel's output to include dispatches to the legacy code.

  5. Availability of Ada and C++ Compilers, Tools, Education and Training

    DTIC Science & Technology

    1991-07-01

    assembler languages of various sorts, C, and Fortran languages. Several provide an interface to Pascal and one to Cobol. The ability to import and export... program that runs on a different target platform. As an example, a Fortran or Pascal compiler running on a DEC VAX computer may produce output which... Ada, Pascal, Fortran, C++, PL/I, and Jovial. The entire source code is not necessarily generated and some tools provide user-customizable templates that

  6. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
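
    The syndrome-coding branch mentioned in this abstract can be illustrated with a toy Slepian-Wolf-style sketch built on the (7,4) Hamming code. This is a minimal stand-in, not the paper's protocol: the encoder sends only the 3-bit syndrome of a 7-bit source block, and the decoder recovers the block using correlated side information (here, the reference) that differs in at most one bit.

```python
# Parity-check matrix of the (7,4) Hamming code: column j is j in binary.
H = [[0, 0, 0, 1, 1, 1, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [1, 0, 1, 0, 1, 0, 1]]

def syndrome(v):
    return [sum(h * b for h, b in zip(row, v)) % 2 for row in H]

def compress(x):
    """Encoder: transmit only the 3-bit syndrome of the 7-bit block."""
    return syndrome(x)

def decompress(s, y):
    """Decoder: recover x from its syndrome s and side information y,
    assuming x and y differ in at most one bit position."""
    d = [a ^ b for a, b in zip(s, syndrome(y))]   # syndrome of e = x XOR y
    x_hat = list(y)
    if any(d):
        pos = int("".join(map(str, d)), 2) - 1    # flipped bit = column index of d
        x_hat[pos] ^= 1
    return x_hat
```

    The compression ratio of the sketch is 3/7 per block; a real protocol would choose code lengths adaptively to match the observed source-reference variation.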

  7. Encoding of multi-alphabet sources by binary arithmetic coding

    NASA Astrophysics Data System (ADS)

    Guo, Muling; Oka, Takahumi; Kato, Shigeo; Kajiwara, Hiroshi; Kawamura, Naoto

    1998-12-01

    When encoding a multi-alphabet source, the symbol sequence can be encoded directly by a multi-alphabet arithmetic encoder, or it can first be converted into several binary sequences, each of which is then encoded by a binary arithmetic encoder such as the L-R arithmetic coder. Arithmetic coding, however, requires arithmetic operations for each symbol and is computationally heavy. In this paper, a binary representation method using a Huffman tree is introduced to reduce the number of arithmetic operations, and a new probability approximation for L-R arithmetic coding is further proposed to improve coding efficiency when the probability of the LPS (Least Probable Symbol) is near 0.5. Simulation results show that the proposed scheme has high coding efficiency and can reduce the number of coding symbols.
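
    The binarization step, representing each multi-alphabet symbol by its Huffman code word so that a binary coder can process the resulting bits, can be sketched as follows. This is a minimal illustration only: the paper's L-R coder and probability approximation are not shown, and in the full scheme each output bit would be fed to a binary arithmetic coder rather than transmitted directly.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix code (symbol -> bit string) from symbol frequencies."""
    # Each heap entry: (frequency, unique tiebreaker, partial code table).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    n = len(heap)
    while len(heap) > 1:
        f0, _, c0 = heapq.heappop(heap)
        f1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c0.items()}   # left subtree gets 0
        merged.update({s: "1" + b for s, b in c1.items()})
        heapq.heappush(heap, (f0 + f1, n, merged)); n += 1
    return heap[0][2]

def binarize(seq, code):
    """Map the multi-alphabet symbol sequence to one binary sequence."""
    return "".join(code[s] for s in seq)

def debinarize(bits, code):
    """Invert binarize() by walking the prefix-free code table."""
    rev, out, cur = {b: s for s, b in code.items()}, [], ""
    for bit in bits:
        cur += bit
        if cur in rev:
            out.append(rev[cur]); cur = ""
    return out
```

    Because frequent symbols get short code words, the expected number of bits per symbol (and hence the number of binary coding operations) stays close to the source entropy.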

  8. Distributed Joint Source-Channel Coding in Wireless Sensor Networks

    PubMed Central

    Zhu, Xuqi; Liu, Yu; Zhang, Lin

    2009-01-01

    Considering that sensors are energy-limited, and considering the wireless channel conditions in wireless sensor networks, there is an urgent need for a low-complexity coding method with a high compression ratio and noise resistance. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing deployments, from theory to practice, of distributed joint source-channel coding over independent channels, multiple access channels, and broadcast channels are introduced, respectively. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560

  9. Top ten reasons to register your code with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Berriman, G. Bruce; Mink, Jessica D.; Nemiroff, Robert J.; Robitaille, Thomas; Schmidt, Judy; Shamir, Lior; Shortridge, Keith; Teuben, Peter J.; Wallin, John F.; Warmels, Rein

    2017-01-01

    With 1,400 codes, the Astrophysics Source Code Library (ASCL, ascl.net) is the largest indexed resource in existence for codes used in astronomy research. This free online registry was established in 1999, is indexed by Web of Science and ADS, and is citable, with citations to its entries tracked by ADS. Registering your code with the ASCL is easy with our online submission system. Making your software available for examination shows confidence in your research and makes your research more transparent, reproducible, and falsifiable. ASCL registration allows your software to be cited on its own merits and provides a citation that is trackable and accepted by all astronomy journals, as well as journals such as Science and Nature. Registration also allows others to find your code more easily. This presentation covers the benefits of registering astronomy research software with the ASCL.

  10. GSFC Ada programming guidelines

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.; Nelson, Robert W.

    1986-01-01

    A significant Ada effort has been under way at Goddard for the last two years. To ease the center's transition toward Ada (notably for future space station projects), a cooperative effort of half a dozen companies and NASA personnel was started in 1985 to produce programming standards and guidelines for the Ada language. The great richness of the Ada language and programmers' need for good style examples make Ada programming guidelines an important tool for smoothing the Ada transition. Because of the natural divergence of technical opinions, the great diversity of our government and private organizations, and the novelty of the Ada technology, the creation of an Ada programming guidelines document is a difficult and time-consuming task. It is also a vital one. Steps must now be taken to ensure that the guide is refined in an organized but timely manner to reflect the growing level of expertise of the Ada community.

  11. Using cryptology models for protecting PHP source code

    NASA Astrophysics Data System (ADS)

    Jevremović, Aleksandar; Ristić, Nenad; Veinović, Mladen

    2013-10-01

    Protecting PHP scripts from unwanted use, copying, and modification is a big issue today. Existing solutions at the source code level mostly work as obfuscators; they are free, but they do not provide any serious protection. Solutions that encode opcode are more secure, but they are commercial and require a closed-source proprietary extension to the PHP interpreter. Additionally, encoded opcode is not compatible with future versions of interpreters, which implies re-buying encoders from the authors. Finally, if the extension's source code is compromised, all scripts encoded with that solution are compromised too. In this paper, we present a new model for a free and open-source PHP script protection solution. The protection level provided by the proposed solution is equal to that of commercial solutions. The model is based on conclusions from the use of standard cryptology models to analyze the strengths and weaknesses of the existing solutions, where script protection is viewed as a secure communication channel in cryptology.

  12. Source mask optimization using real-coded genetic algorithms

    NASA Astrophysics Data System (ADS)

    Yang, Chaoxing; Wang, Xiangzhao; Li, Sikun; Erdmann, Andreas

    2013-04-01

    Source mask optimization (SMO) is considered to be one of the technologies to push conventional 193 nm lithography to its ultimate limits. In comparison with other SMO methods that use an inverse problem formulation, SMO based on a genetic algorithm (GA) requires very little knowledge of the process and has the advantage of flexible problem formulation. Recent publications on SMO using a GA employ a binary-coded GA. In general, the performance of a GA depends not only on the merit or fitness function, but also on the parameters, operators, and their algorithmic implementation. In this paper, we propose an SMO method using a real-coded GA, where the source and mask solutions are represented by floating-point strings instead of bit strings, and the selection, crossover, and mutation operators are replaced by corresponding floating-point versions. Both binary-coded and real-coded genetic algorithms were implemented in two versions of SMO and compared in numerical experiments, where the target patterns are staggered contact holes and a logic pattern with critical dimensions of 100 nm, respectively. The results demonstrate the performance improvement of the real-coded GA in comparison to the binary-coded version, which can be seen in particular in better convergence behavior. For example, the numerical experiments for the logic pattern showed that the average number of generations to converge to a proper fitness of 6.0 using the real-coded method is 61.8% (100 generations) less than that using the binary-coded method.
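
    The real-coded operators this abstract describes (floating-point chromosomes with floating-point selection, crossover, and mutation) can be sketched as follows. This is a minimal illustration under stated assumptions: the objective is a toy sphere function standing in for a lithography merit function, and all parameter choices are illustrative.

```python
import random

def real_coded_ga(fitness, bounds, pop_size=40, generations=200,
                  mut_sigma=0.1, seed=1):
    """Minimal real-coded GA: chromosomes are float vectors, with
    tournament selection, arithmetic (blend) crossover, and Gaussian
    mutation clamped to the search bounds."""
    rng = random.Random(seed)
    lo, hi = zip(*bounds)
    pop = [[rng.uniform(l, h) for l, h in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness)
        nxt = scored[:2]                        # elitism: keep the two best
        while len(nxt) < pop_size:
            # Tournament selection of two parents.
            p1, p2 = (min(rng.sample(scored, 3), key=fitness) for _ in range(2))
            a = rng.random()                    # arithmetic crossover weight
            child = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
            # Gaussian mutation, clamped to bounds.
            child = [min(h, max(l, g + rng.gauss(0, mut_sigma)))
                     for g, l, h in zip(child, lo, hi)]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

# Toy objective (stand-in for a lithography merit function): minimum at origin.
best = real_coded_ga(lambda v: sum(x * x for x in v), [(-5, 5)] * 3)
```

    Because genes stay floating-point throughout, no encoding/decoding to bit strings is needed, which is the design choice the paper credits for the improved convergence.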

  13. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is getting more necessary in the course of program design in college education. However, the trick of plagiarizing plus a little modification exists among some students' home works. It's not easy for teachers to judge if there's plagiarizing in source code or not. Traditional detection algorithms cannot fit this…

  14. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  15. A Comparison of Source Code Plagiarism Detection Engines

    ERIC Educational Resources Information Center

    Lancaster, Thomas; Culwin, Fintan

    2004-01-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and…
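
    One of the simplest techniques such detection engines build on, token n-gram fingerprint comparison, can be sketched as follows. This is an illustrative toy, not any particular engine's metric: real engines normalize identifiers, strip comments, and use more robust fingerprint selection.

```python
def ngram_fingerprints(source, n=4):
    """Set of token n-grams for a source file; naive whitespace
    tokenization keeps the sketch simple."""
    toks = source.split()
    return {tuple(toks[i:i + n]) for i in range(len(toks) - n + 1)}

def similarity(a, b, n=4):
    """Jaccard overlap of token n-grams: 0.0 (disjoint) to 1.0 (identical)."""
    fa, fb = ngram_fingerprints(a, n), ngram_fingerprints(b, n)
    return len(fa & fb) / len(fa | fb) if fa | fb else 1.0
```

    A high similarity score then flags a pair of submissions for human review; the papers compared in this survey differ mainly in how the fingerprints are chosen and weighted.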

  16. Secondary neutron source modelling using MCNPX and ALEPH codes

    NASA Astrophysics Data System (ADS)

    Trakas, Christos; Kerkar, Nordine

    2014-06-01

    Monitoring the subcritical state and divergence of a reactor requires the presence of neutron sources, and it is mainly secondary neutrons from these sources that feed the ex-core detectors (SRD, Source Range Detector), whose counting rate is correlated with the level of subcriticality of the reactor. In cycle 1, primary neutrons are provided by sources activated outside of the reactor (e.g. Cf252); part of this source can be used for the divergence of cycle 2 (not systematically). A second family of neutron sources is used for the second cycle: the spontaneous neutrons of actinides produced after irradiation of fuel in the first cycle. In most reactors, both families of sources are not sufficient to efficiently monitor the divergence of the second and subsequent cycles. Secondary source clusters (SSC) fulfil this role. In the present case, the SSC [Sb, Be], after activation in the first cycle (production of Sb124, unstable), produces in subsequent cycles a photo-neutron source by a gamma (from Sb124)-neutron (on Be9) reaction. This paper presents the model of the process between irradiation in cycle 1 and the cycle 2 results for the SRD counting rate at the beginning of cycle 2, using the MCNPX code and the depletion chain ALEPH-V1 (a coupling of the MCNPX and ORIGEN codes). The results of this simulation are compared with two experimental results from the PWR 1450 MWe-N4 reactors, and a good agreement is observed. The subcriticality of the reactors is about -15,000 pcm. Discrepancies in the SRD counting rate between calculations and measurements are on the order of 10%, lower than the combined uncertainty of the measurements and the code simulation. This comparison validates the AREVA methodology, which provides a best-estimate SRD counting rate for cycle 2 and subsequent ones and allows optimizing the position of the SSC depending on the geographic location of the sources, the main parameter for optimal monitoring of subcritical states.

  17. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is the making of quantified programmatic assertions over programs that are not otherwise annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.
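
    The idea of quantifying over dynamic call events can be sketched with a toy advice installer in Python. This is a hypothetical illustration of the concept only: the authors' system transforms source code, whereas this sketch wraps functions at run time; all names here are invented.

```python
import functools

def advise_calls(namespace, predicate, before):
    """Toy point-cut: for every function in `namespace` whose name satisfies
    `predicate` (the quantifier), install a wrapper that fires the `before`
    advice on each call event, then proceeds to the base code."""
    for name, obj in list(namespace.items()):
        if callable(obj) and predicate(name):
            def make(fn, fname):
                @functools.wraps(fn)
                def wrapper(*args, **kwargs):
                    before(fname, args)           # the advice action
                    return fn(*args, **kwargs)    # proceed to base code
                return wrapper
            namespace[name] = make(obj, name)

# Base code, unannotated: the functions know nothing about the advice.
events = []
def deposit(amount): return amount
def withdraw(amount): return -amount

ns = {"deposit": deposit, "withdraw": withdraw}
advise_calls(ns, lambda n: n in ("deposit", "withdraw"),
             lambda name, args: events.append((name, args)))
ns["withdraw"](7)
```

    In a concurrent-systems debugging setting, the recorded event stream is what one would inspect for ordering anomalies.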

  18. A small evaluation suite for Ada compilers

    NASA Technical Reports Server (NTRS)

    Wilke, Randy; Roy, Daniel M.

    1986-01-01

    After completing a small Ada pilot project (OCC simulator) for the Multi Satellite Operations Control Center (MSOCC) at Goddard last year, the use of Ada to develop OCCs was recommended. To help MSOCC transition toward Ada, a suite of about 100 evaluation programs was developed which can be used to assess Ada compilers. These programs compare the overall quality of the compilation system, compare the relative efficiencies of the compilers and the environments in which they work, and compare the size and execution speed of generated machine code. Another goal of the benchmark software was to provide MSOCC system developers with rough timing estimates for the purpose of predicting performance of future systems written in Ada.

  19. ART/Ada and CLIPS/Ada

    NASA Technical Reports Server (NTRS)

    Culbert, Chris

    1990-01-01

    Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert system building tools (ESBTs). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality were completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. Since production-quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA.

  20. ADAS Update and Maintainability

    NASA Technical Reports Server (NTRS)

    Watson, Leela R.

    2010-01-01

    Since 2000, both the National Weather Service Melbourne (NWS MLB) and the Spaceflight Meteorology Group (SMG) have used a local data integration system (LDIS) as part of their forecast and warning operations. The original LDIS was developed by the Applied Meteorology Unit (AMU) in 1998 (Manobianco and Case 1998) and has undergone subsequent improvements. Each group has benefited from three-dimensional (3-D) analyses that are delivered to forecasters every 15 minutes across the peninsula of Florida. The intent is to generate products that enhance short-range weather forecasts issued in support of NWS MLB and SMG operational requirements within East Central Florida. The current LDIS uses the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) package as its core, which integrates a wide variety of national, regional, and local observational data sets. It assimilates all available real-time data within its domain and is run at a finer spatial and temporal resolution than current national or regional-scale analysis packages. As such, it provides local forecasters with a more comprehensive understanding of evolving fine-scale weather features. Over the years, the LDIS has become problematic to maintain since it depends on AMU-developed shell scripts that were written for an earlier version of the ADAS software. The goals of this task were to update the NWS MLB/SMG LDIS with the latest version of ADAS, incorporate new sources of observational data, and upgrade and modify the AMU-developed shell scripts written to govern the system. In addition, the previously developed ADAS graphical user interface (GUI) was updated. Operationally, these upgrades will result in more accurate depictions of the current local environment to help with short-range weather forecasting applications, while also offering an improved initialization for local versions of the Weather Research and Forecasting (WRF) model used by both groups.

  1. A MCTF video coding scheme based on distributed source coding principles

    NASA Astrophysics Data System (ADS)

    Tagliasacchi, Marco; Tubaro, Stefano

    2005-07-01

    Motion Compensated Temporal Filtering (MCTF) has proved to be an efficient coding tool in the design of open-loop scalable video codecs. In this paper we propose a lifting-based MCTF video coding scheme in which the prediction step is implemented using PRISM (Power efficient, Robust, hIgh compression Syndrome-based Multimedia coding), a video coding framework built on distributed source coding principles. We study the effect of integrating the update step at the encoder or at the decoder side, and show that the latter approach improves the quality of the side information exploited during decoding. We present analytical results obtained by modeling the video signal along the motion trajectories as a first-order auto-regressive process, and show that performing the update step at the decoder halves the contribution of the quantization noise. We also include experimental results with real video data that demonstrate the potential of this approach when video sequences are coded at low bitrates.
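
    The lifting structure the scheme builds on, a predict step followed by an update step, can be sketched on a 1-D signal standing in for pixel values along a motion trajectory. This is a minimal Haar-lifting illustration only: it ignores motion compensation, quantization, and the PRISM syndrome coder, and is not the authors' codec.

```python
def haar_lift(x):
    """One level of Haar lifting on an even-length 1-D signal:
    the predict step produces high-pass residuals, the update step
    produces low-pass (running average) coefficients."""
    even, odd = x[0::2], x[1::2]
    high = [o - e for o, e in zip(odd, even)]       # predict: residual
    low = [e + h / 2 for e, h in zip(even, high)]   # update: pairwise average
    return low, high

def haar_unlift(low, high):
    """Invert the lifting steps in reverse order."""
    even = [l - h / 2 for l, h in zip(low, high)]
    odd = [e + h for e, h in zip(even, high)]
    return [v for pair in zip(even, odd) for v in pair]
```

    Because each lifting step is trivially invertible, the update step can be moved from the encoder to the decoder without breaking reconstruction, which is the design choice the paper analyzes.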

  2. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan is described for the integrated testing and benchmarking of the Phase 1 Ada-based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. In this phase, its performance and size as well as its functionality are verified. The target platform is a DEC Ada compiler on VAX mini-computers and VAX stations running the VMS operating system.

  3. Ada concurrent programming

    SciTech Connect

    Gehani, N.

    1984-01-01

    In this book, Narain Gehani explains the concurrent programming facilities in Ada and shows how to use them effectively in writing concurrent programs. He also surveys concurrent programming facilities in other languages, discusses issues specific to concurrent programming, and examines the limitations of the concurrent programming facilities in Ada. Topics considered include an introduction to concurrent programming, the concurrent programming model in Ada, and a survey of other concurrent programming models; tasking, i.e., concurrent programming facilities in Ada; task types; exceptions and tasking; device drivers; real-time programming; topics related to concurrent programming; more examples of concurrent programming; and synopsis of sequential programming in Ada.

  4. Ada 9X overview

    NASA Technical Reports Server (NTRS)

    Weller, David G.

    1992-01-01

    The current version of Ada has been an ANSI standard since 1983. In 1988, the Ada Joint Program Office was tasked with reevaluating the language and proposing changes to the standard. Since that time, the world has seen a tremendous explosion in object-oriented languages, as well as other growing fields such as distributed computing and support for very large software systems. The speaker will discuss new features being added to the next version of Ada, currently called Ada 9X, and what transition issues must be considered for current Ada projects.

  5. Verification test calculations for the Source Term Code Package

    SciTech Connect

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled, and hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example, that it does not satisfy basic conservation laws, rather than in showing that the analysis accurately represents reality. Hand calculations are an important element of verification, but they do not satisfy the need for code validation; the code validation program for the STCP is a separate effort. In general, the hand calculation results show that the models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and the significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.

  6. The Need for Vendor Source Code at NAS. Revised

    NASA Technical Reports Server (NTRS)

    Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The Numerical Aerodynamic Simulation (NAS) Facility has a long-standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS needs to maintain support capabilities when vendors are not able to; to diagnose and remedy hardware or software problems where applicable; and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS, and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.

  7. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
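The statistic-and-weight scheme described above can be illustrated with a short sketch (Python is used here for brevity; SAP itself is written in FORTRAN IV, and the keyword and weight tables below are hypothetical stand-ins for SAP's external keyword and statistical weight files):

```python
import re

# Hypothetical keyword table: statement keyword -> executable statement?
KEYWORDS = {"IF": True, "DO": True, "GOTO": True,
            "DIMENSION": False, "INTEGER": False, "END": False}

# Hypothetical weight table: keyword -> contribution to the figure of complexity
WEIGHTS = {"IF": 2.0, "DO": 1.5, "GOTO": 3.0,
           "DIMENSION": 0.0, "INTEGER": 0.0, "END": 0.0}

def analyze(source):
    """Count statement keywords and compute a weighted figure of complexity."""
    counts = {}
    for line in source.splitlines():
        stripped = line.strip().upper()
        for kw in KEYWORDS:
            if re.match(rf"{kw}\b", stripped):  # classify by leading keyword
                counts[kw] = counts.get(kw, 0) + 1
                break
    complexity = sum(WEIGHTS[kw] * n for kw, n in counts.items())
    return counts, complexity
```

In SAP the same bookkeeping is performed module by module, with summed statistics accumulated for the complete input source file.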

  8. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  9. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope is described which is an Ada-verification environment. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  10. Development of an Ada programming support environment database SEAD (Software Engineering and Ada Database) administration manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.

  11. Documentation generator application for MatLab source codes

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

The UML, a complex system modeling and description technology, has recently been expanding its uses in the formalization and algorithmic description of systems such as multiprocessor photonic, optoelectronic and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes the realization of an application for documenting MatLab source codes. The authors present their own solution, based on the Doxygen program, which is available under a free license with accessible source code. The supporting tools used for parser building were Bison and Flex. Practical results of the documentation generator are presented; the program was applied to exemplary MatLab codes. The documentation generator application is used in the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts, which describe the following components of the documentation generator for photonic and electronic systems: the concept, the MatLab application, and the VHDL application. This is part two, which describes the MatLab application. MatLab is used for description of the measured phenomena.

  12. Development of parallel DEM for the open source code MFIX

    SciTech Connect

    Gopalakrishnan, Pradeep; Tafti, Danesh

    2013-02-01

    The paper presents the development of a parallel Discrete Element Method (DEM) solver for the open source code, Multiphase Flow with Interphase eXchange (MFIX) based on the domain decomposition method. The performance of the code was evaluated by simulating a bubbling fluidized bed with 2.5 million particles. The DEM solver shows strong scalability up to 256 processors with an efficiency of 81%. Further, to analyze weak scaling, the static height of the fluidized bed was increased to hold 5 and 10 million particles. The results show that global communication cost increases with problem size while the computational cost remains constant. Further, the effects of static bed height on the bubble hydrodynamics and mixing characteristics are analyzed.
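The domain decomposition behind the parallel DEM solver can be sketched as follows (an illustrative Python sketch of the general technique, not MFIX code; the 1-D slab decomposition and function name are assumptions for illustration):

```python
def decompose(particles, x_max, nranks):
    """Assign each particle (here just an x-coordinate) to the rank owning its slab.

    A 1-D domain decomposition sketch: the domain [0, x_max) is split into
    nranks equal slabs; each rank computes contact forces only for its own
    particles and would exchange boundary ("ghost") particles with neighbours.
    """
    width = x_max / nranks
    owned = [[] for _ in range(nranks)]
    for p in particles:
        rank = min(int(p / width), nranks - 1)  # clamp particles at the edge
        owned[rank].append(p)
    return owned
```

With this layout, computational cost per rank stays roughly constant as the bed grows with the rank count, while the boundary exchanges contribute the global communication cost the abstract describes.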

  13. Users manual for doctext: Producing documentation from C source code

    SciTech Connect

    Gropp, W.

    1995-03-01

    One of the major problems that software library writers face, particularly in a research environment, is the generation of documentation. Producing good, professional-quality documentation is tedious and time consuming. Often, no documentation is produced. For many users, however, much of the need for documentation may be satisfied by a brief description of the purpose and use of the routines and their arguments. Even for more complete, hand-generated documentation, this information provides a convenient starting point. We describe here a tool that may be used to generate documentation about programs written in the C language. It uses a structured comment convention that preserves the original C source code and does not require any additional files. The markup language is designed to be an almost invisible structured comment in the C source code, retaining readability in the original source. Documentation in a form suitable for the Unix man program (nroff), LaTeX, and the World Wide Web can be produced.
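The idea of extracting documentation from structured comments, while leaving the C source compilable and readable, can be sketched as follows (the `/*@ ... @*/` markers here are hypothetical, chosen for illustration; doctext's actual markup is not described in this abstract):

```python
import re

def extract_doc_comments(c_source):
    """Pull documentation out of structured comments in C source.

    Because the markers live inside ordinary C comments, the file still
    compiles unchanged and no additional files are required.
    """
    return [m.strip() for m in re.findall(r"/\*@(.*?)@\*/", c_source, re.DOTALL)]
```

The extracted text could then be rendered to nroff man pages, LaTeX, or HTML by a separate back end, as the abstract describes for doctext.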

  14. A robust CELP coder with source-dependent channel coding

    NASA Technical Reports Server (NTRS)

    Sukkar, Rafid A.; Kleijn, W. Bastiaan

    1990-01-01

    A CELP coder using Source Dependent Channel Encoding (SDCE) for optimal channel error protection is introduced. With SDCE, each of the CELP parameters are encoded by minimizing a perceptually meaningful error criterion under prevalent channel conditions. Unlike conventional channel coding schemes, SDCE allows for optimal balance between error detection and correction. The experimental results show that the CELP system is robust under various channel bit error rates and displays a graceful degradation in SSNR as the channel error rate increases. This is a desirable property to have in a coder since the exact channel conditions cannot usually be specified a priori.

  15. How I treat ADA deficiency.

    PubMed

    Gaspar, H Bobby; Aiuti, Alessandro; Porta, Fulvio; Candotti, Fabio; Hershfield, Michael S; Notarangelo, Luigi D

    2009-10-22

Adenosine deaminase deficiency is a disorder of purine metabolism leading to severe combined immunodeficiency (ADA-SCID). Without treatment the condition is fatal, and it requires early intervention. Haematopoietic stem cell transplantation is the major treatment for ADA-SCID, although survival following different donor sources varies considerably. Unlike other SCID forms, 2 other options are available for ADA-SCID: enzyme replacement therapy (ERT) with pegylated bovine ADA, and autologous haematopoietic stem cell gene therapy (GT). Due to the rarity of the condition, the lack of large-scale outcome studies, and the availability of different treatments, guidance on treatment strategies is limited. We have reviewed the currently available evidence and, together with our experience of managing this condition, propose a consensus management strategy. Matched sibling donor transplants represent a successful treatment option with high survival rates and excellent immune recovery. Mismatched parental donor transplants have a poor survival outcome and should be avoided unless other treatments are unavailable. ERT and GT both show excellent survival, and therefore the choice between ERT, matched unrelated donor (MUD) transplant, or GT is difficult and dependent on several factors, including accessibility to the different modalities, the response of patients to long-term ERT, and the attitudes of physicians and parents to the short- and potential long-term risks associated with different treatments.

  16. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program development system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (for example, VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed on several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.
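The master-source idea can be sketched with a toy preprocessor (Python for brevity; the `*IF`/`*ELSE`/`*END` directive syntax below is hypothetical, not MAX's actual directives):

```python
def instantiate(master_lines, flags):
    """Produce one compilable instance from a master source.

    Lines guarded by an inactive flag are dropped, so a single master copy
    yields, e.g., a VMS instance and a UNIX instance without version lag.
    """
    out, keep_stack = [], [True]
    for line in master_lines:
        stripped = line.strip()
        if stripped.startswith("*IF "):
            keep_stack.append(keep_stack[-1] and stripped[4:] in flags)
        elif stripped == "*ELSE":
            keep_stack[-1] = not keep_stack[-1] and keep_stack[-2]
        elif stripped == "*END":
            keep_stack.pop()
        elif keep_stack[-1]:
            out.append(line)
    return out
```

Calling `instantiate(master, {"VMS"})` and `instantiate(master, {"UNIX"})` on the same master then produces the two operating-system-specific instances for compilation.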

  17. LLNL state-of-the-art codes for source calculations

    SciTech Connect

    Glenn, L.A.

    1995-02-01

The explosion-source region is defined as the region surrounding an underground explosion that cannot be described by elastic or anelastic theory. This region extends typically to ranges on the order of 1 km/kt. For the simulation or analysis of seismic signals, what is required is the time-resolved motion and stress state at the inelastic boundary. Various analytic approximations have been made for these boundary conditions, but since they rely on near-field empirical data they cannot be expected to extrapolate reliably to different explosion sites. More important, without some knowledge of the initial energy density and the characteristics of the medium immediately surrounding the explosion, these simplified models are unable to distinguish chemical from nuclear explosions, identify cavity decoupling, or account for such phenomena as anomalous dissipation via pore collapse. The purpose here is to document the state-of-the-art codes at LLNL involved in simulating underground (chemical and nuclear) explosions and, in so doing, present an overview of the physics. In what follows, the authors first describe the fundamental equations involved and discuss solution methods, coordinate frames, and dimensionality. Then they identify the codes used at LLNL and their limitations. A companion report will describe the factors that most influence the seismic response, i.e., the source properties important for discrimination. That report will emphasize the coupling between the rock properties and the characteristics of the explosion cavity.

  18. Righting the ADA

    ERIC Educational Resources Information Center

    National Council on Disability, 2004

    2004-01-01

    Many Americans with disabilities feel that a series of negative court decisions is reducing their status to that of "second-class citizens," a status that the Americans with Disabilities Act (ADA) was supposed to remedy forever. In this report, the National Council on Disability (NCD), which first proposed the enactment of an ADA and…

  19. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  20. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    DTIC Science & Technology

    2010-12-01

applications for conformance to one of the CERT® secure coding standards. CERT secure coding standards provide a detailed enumeration of coding errors...automated analysis tools to help them code securely. Secure coding standards provide a detailed enumeration of coding errors that have caused...including possible additional job aids. SCALe analysts will also be interviewed for context information surrounding incorrect judgments as part of

  1. Annotated Ada 95 Reference Manual.

    DTIC Science & Technology

    1994-12-21

A. Strohmeier (Swiss Fed Inst of Technology: Switzerland); W. Taylor (consultant: UK); J. Tokar (Tartan); E. Vasilescu (Grumman); J. Vladik...Language Precision Team (Odyssey Research Associates), the Ada 9X User/Implementer Teams (AETECH, Tartan, TeleSoft), the Ada 9X Implementation...the RM83. [Ada Issue (AI)] An Ada Issue (AI) is a numbered ruling from the ARG. [Ada Commentary Integration Document (ACID)] The Ada

  2. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

Research into language concepts for the Mission Control Center is presented, along with the source code developed. The file contains the routines which allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and places as much code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  3. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in an increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is explained. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom program and the U.S. Air Force.

  4. Using ADA Tasks to Simulate Operating Equipment

    NASA Technical Reports Server (NTRS)

    DeAcetis, Louis A.; Schmidt, Oron; Krishen, Kumar

    1990-01-01

    A method of simulating equipment using ADA tasks is discussed. Individual units of equipment are coded as concurrently running tasks that monitor and respond to input signals. This technique has been used in a simulation of the space-to-ground Communications and Tracking subsystem of Space Station Freedom.

  5. Using Ada tasks to simulate operating equipment

    NASA Technical Reports Server (NTRS)

    Deacetis, Louis A.; Schmidt, Oron; Krishen, Kumar

    1990-01-01

    A method of simulating equipment using Ada tasks is discussed. Individual units of equipment are coded as concurrently running tasks that monitor and respond to input signals. This technique has been used in a simulation of the space-to-ground Communications and Tracking subsystem of Space Station Freedom.
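The technique described in this record can be sketched with Python threads (Ada tasking itself is not shown; the class and signal names are illustrative): each unit of equipment runs concurrently, blocking on its input queue and responding to each signal as it arrives.

```python
import queue
import threading

class EquipmentUnit(threading.Thread):
    """One simulated unit of equipment: a concurrent task that monitors an
    input signal queue and records a response to each signal, a sketch of
    the Ada-task technique in Python."""

    def __init__(self, name):
        super().__init__()
        self.name = name
        self.inputs = queue.Queue()
        self.log = []

    def run(self):
        while True:
            signal = self.inputs.get()   # block until a signal arrives
            if signal is None:           # shutdown sentinel
                break
            self.log.append(f"{self.name} handled {signal}")

transmitter = EquipmentUnit("transmitter")
transmitter.start()
transmitter.inputs.put("carrier-on")
transmitter.inputs.put(None)             # tell the task to finish
transmitter.join()
```

In the Ada version, the queue-and-loop body corresponds to a task with accept statements for its input signals; many such units run concurrently to model a whole subsystem.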

  6. FLOWTRAN-TF v1. 2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III.

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  7. FLOWTRAN-TF v1.2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  8. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & CoRoT-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806. [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944. [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4. [4] exoclime.net

  9. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1991-01-01

The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large-scale application that can benefit from Ada-based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is described. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  10. Methodology Study for Real-Time Ada Problems

    DTIC Science & Technology

    1989-03-24

    very highly, even though this was seemingly not that important a feature theoretically (see figure 2). The most popular tools were those that automated ...of the availability and maturity of automated software tools which implement an Ada-oriented software method. The issue of availability involves the...oriented: The PAMELA methodology automates the process of moving from symbology to Ada coding. This stabilizes the coding step of the development

  11. Procedures and tools for building large Ada systems

    NASA Technical Reports Server (NTRS)

    Hyde, Ben

    1986-01-01

Some of the problems unique to building a very large Ada system are addressed through examples from experience. In the winter of 1985 and 1986, Intermetrics bootstrapped the Ada compiler it had been building over the preceding few years. This system consists of about one million lines of full Ada. Over the last few years a number of procedures and tools were adopted for managing the life cycle of each of the many parts of an Ada system. Many of these procedures are well known to most system builders: release management, quality assurance testing, and source file revision control. Others are unique to working in an Ada language environment: recompilation management, Ada program library management, and managing multiple implementations. First, a look is taken at how a large Ada system is broken down into pieces. The Ada definition leaves unspecified a number of issues that the system builder must address: versions, subsystems, multiple implementations, and synchronization of branched development paths. Having introduced how Ada systems are decomposed, a look is taken, via a series of examples, at how the life cycles of those parts are managed, and the procedures and tools used to manage the evolution of the system are examined. It is hoped that other Ada system builders can build upon the experience of the last few years.

  12. Ada Namelist Package

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. Features are: handling of any combination of types defined by the user; ability to read vectors, matrices, and slices of vectors and matrices; handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and ability to avoid searching the entire input file for each variable. Principal benefits derived by the user: ability to read and write namelist-readable files, ability to detect most file errors in the initialization phase, and organization keeping the number of instantiated units to a few packages rather than many subprograms.
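A minimal reader for the FORTRAN-style namelist format (`&NAME var = value, ... /`) can be sketched as follows (Python, scalars and flat numeric vectors only; the Ada package above additionally handles matrices, slices, and mismatch detection):

```python
import re

def read_namelist(text):
    """Parse a minimal FORTRAN-style namelist: '&NAME  var = value, ... /'.

    A sketch of the file format only: the group name follows '&', the body
    runs to the terminating '/', and each 'name =' introduces one variable,
    whose value is a scalar or a comma/space-separated vector of numbers.
    """
    m = re.search(r"&(\w+)(.*?)/", text, re.DOTALL)
    group, body = m.group(1), m.group(2)
    # Split on each 'name =' so each variable's raw value text follows its name.
    parts = re.split(r"(\w+)\s*=", body)
    values = {}
    for name, raw in zip(parts[1::2], parts[2::2]):
        items = [float(v) for v in raw.replace(",", " ").split()]
        values[name] = items[0] if len(items) == 1 else items
    return group, values
```

A file error such as a variable missing from the programmed list would surface here as an unexpected key, which is the kind of mismatch the Ada package detects during its initialization phase.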

  13. ADA Test and Evaluation

    DTIC Science & Technology

    1981-02-06

language. LIR. 290; 13.9.1 There should be a standard (unsafe) way of building an Ada array from a block of storage passed into an Ada routine.

  14. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  15. Presenting an Alternative Source Code Plagiarism Detection Framework for Improving the Teaching and Learning of Programming

    ERIC Educational Resources Information Center

    Hattingh, Frederik; Buitendag, Albertus A. K.; van der Walt, Jacobus S.

    2013-01-01

    The transfer and teaching of programming and programming related skills has become, increasingly difficult on an undergraduate level over the past years. This is partially due to the number of programming languages available as well as access to readily available source code over the Web. Source code plagiarism is common practice amongst many…

  16. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off the shelf system that analyzes programs written in C, C++ or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study is focused on large scale software developed using formal processes. The systems studied are mission critical in nature but some use commodity computer systems.

  17. A LISP-Ada connection

    NASA Technical Reports Server (NTRS)

    Jaworski, Allan; Lavallee, David; Zoch, David

    1987-01-01

    The prototype demonstrates the feasibility of using Ada for expert systems and the implementation of an expert-friendly interface which supports knowledge entry. In the Ford LISP-Ada Connection (FLAC) system LISP and Ada are used in ways which complement their respective capabilities. Future investigation will concentrate on the enhancement of the expert knowledge entry/debugging interface and on the issues associated with multitasking and real-time expert systems implementation in Ada.

  18. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The fact that the Ada language is used for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features which enable these advances are discussed.

  19. Benchmark Lisp And Ada Programs

    NASA Technical Reports Server (NTRS)

    Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.

    1992-01-01

Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing the efficiency of computer processing via Lisp vs. Ada; comparing the efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests the efficiency with which a computer executes routines in each language. Available for any computer equipped with a validated Ada compiler and/or a Common Lisp system.

  20. Alma Flor Ada: Writer, Translator, Storyteller.

    ERIC Educational Resources Information Center

    Brodie, Carolyn S.

    2003-01-01

    Discusses the work of children's author Alma Flor Ada, a Cuban native who has won awards honoring Latino writers and illustrators. Includes part of an interview that explores her background, describes activity ideas, and presents a bibliography of works written by her (several title published in both English and Spanish) as well as sources of…

  1. Multicode comparison of selected source-term computer codes

    SciTech Connect

    Hermann, O.W.; Parks, C.V.; Renier, J.P.; Roddy, J.W.; Ashline, R.C.; Wilson, W.B.; LaBauve, R.J.

    1989-04-01

    This report summarizes the results of a study to assess the predictive capabilities of three radionuclide inventory/depletion computer codes, ORIGEN2, ORIGEN-S, and CINDER-2. The task was accomplished through a series of comparisons of their output for several light-water reactor (LWR) models (i.e., verification). Of the five cases chosen, two modeled typical boiling-water reactors (BWR) at burnups of 27.5 and 40 GWd/MTU and two represented typical pressurized-water reactors (PWR) at burnups of 33 and 50 GWd/MTU. In the fifth case, identical input data were used for each of the codes to examine the results of decay only and to show differences in nuclear decay constants and decay heat rates. Comparisons were made for several different characteristics (mass, radioactivity, and decay heat rate) for 52 radionuclides and for nine decay periods ranging from 30 d to 10,000 years. Only fission products and actinides were considered. The results are presented in comparative-ratio tables for each of the characteristics, decay periods, and cases. A brief summary description of each of the codes has been included. Of the more than 21,000 individual comparisons made for the three codes (taken two at a time), nearly half (45%) agreed to within 1%, and an additional 17% fell within the range of 1 to 5%. Approximately 8% of the comparison results disagreed by more than 30%. However, relatively good agreement was obtained for most of the radionuclides that are expected to contribute the greatest impact to waste disposal. Even though some defects have been noted, each of the codes in the comparison appears to produce respectable results. 12 figs., 12 tabs.
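The comparative-ratio bookkeeping described above can be sketched as follows (an illustrative Python sketch; the function name, bin edges, and sample data are assumptions, not values from the report):

```python
def agreement_bins(code_a, code_b, edges=(0.01, 0.05, 0.30)):
    """Bin pairwise comparisons of two codes' predictions by relative deviation.

    code_a and code_b map nuclide -> predicted value (mass, radioactivity,
    or decay heat rate); the result counts comparisons agreeing to within
    1%, within 1-5%, within 5-30%, and disagreeing by more than 30%.
    """
    counts = [0] * (len(edges) + 1)
    for nuclide, a in code_a.items():
        deviation = abs(a / code_b[nuclide] - 1.0)  # comparative ratio vs. 1
        for i, edge in enumerate(edges):
            if deviation <= edge:
                counts[i] += 1
                break
        else:
            counts[-1] += 1  # beyond the largest edge
    return counts
```

Repeating this over every characteristic, decay period, and case, with the codes taken two at a time, yields the more than 21,000 comparisons summarized in the report's tables.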

  2. Some Techniques in Universal Source Coding and Coding for Composite Sources.

    DTIC Science & Technology

    1981-12-01

    The scanned abstract is garbled beyond recovery; the legible fragments are bibliographic citations, including Yu. M. Starkov, "The coding of finite messages on output of a source with unknown statistics," pp. 289-295, 1976, and J. G. Dunham, "The principle of conservation of entropy," Proceedings of the Allerton Conference on Communications, Control, and Computing, pp. 440-445.

  3. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and a brief overview of their topics are provided. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  4. Techniques and implementation of the embedded rule-based expert system using Ada

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Jones, Robert E.

    1991-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.
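    As a rough illustration of the rule-based approach the abstract contrasts with straight procedural code, the sketch below is a minimal forward-chaining rule engine. It is not ART-Ada, and the fact and rule names are invented; it only shows the pattern of firing rules until no new conclusions appear.

```python
# Minimal forward-chaining rule engine (illustrative; not ART-Ada).
# Each rule is (set_of_condition_facts, conclusion_fact).
def run_rules(facts, rules):
    """Fire any rule whose conditions all hold; repeat until quiescence."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical power-system rules in the spirit of a fault-management expert system.
RULES = [
    ({"bus_voltage_low", "load_high"}, "shed_load"),
    ({"shed_load"}, "notify_operator"),
]
```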

  5. AN ADA NAMELIST PACKAGE

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the non-generic opening portion. The opening portion declares a variety of user-accessible constants, variables, and subprograms. The subprograms include procedures for initializing namelists for reading and for reading and writing strings, as well as functions for analyzing the content of the current dataset and diagnosing errors. Two nested…
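    The behavior described above, reading assignment statements in arbitrary order and tolerating mismatches, can be sketched in a few lines. This is a Python illustration of the namelist idea, not the Ada package itself; the function name and the tiny grammar (one name = value per line) are assumptions.

```python
import re

# Illustrative namelist-style reader: accepts "name = value" lines in any
# order, converts numeric values, and silently skips lines it cannot parse
# (a crude stand-in for the package's tolerance of mismatches).
def read_namelist(text):
    values = {}
    for line in text.splitlines():
        m = re.match(r"\s*(\w+)\s*=\s*(.+?)\s*$", line)
        if m:
            name, raw = m.group(1), m.group(2)
            try:
                values[name] = float(raw)
            except ValueError:
                values[name] = raw
    return values
```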

  6. Software Engineering in Ada

    DTIC Science & Technology

    1988-03-22

    Access Value - An access value provides the location of, or "points to", an object which has been created by the evaluation of an allocator. … and training needs of the DOD community, including methodologies and materials to fill those needs. Ada Validation Organization (AVO) - The component… type are the tests for equality and inequality and the assignment operation, unless the type is limited, in which case no operations are implicitly declared.

  7. Joint source-channel coding for a quantum multiple access channel

    NASA Astrophysics Data System (ADS)

    Wilde, Mark M.; Savov, Ivan

    2012-11-01

    Suppose that two senders each obtain one share of the output of a classical, bivariate, correlated information source. They would like to transmit the correlated source to a receiver using a quantum multiple access channel. In prior work, Cover, El Gamal and Salehi provided a combined source-channel coding strategy for a classical multiple access channel which outperforms the simpler ‘separation’ strategy where separate codebooks are used for the source coding and the channel coding tasks. In this paper, we prove that a coding strategy similar to the Cover-El Gamal-Salehi strategy and a corresponding quantum simultaneous decoder allow for the reliable transmission of a source over a quantum multiple access channel, as long as a set of information inequalities involving the Holevo quantity hold.

  8. Integrity and security in an Ada runtime environment

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security. The input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of safe subsets include the Army Secure Operating System and the Penelope Ada verification tool. A conservative attitude is recommended when writing Ada code for life- and property-critical systems.

  9. Open Genetic Code: on open source in the life sciences.

    PubMed

    Deibel, Eric

    2014-01-01

    The introduction of open source in the life sciences is increasingly being suggested as an alternative to patenting. This is an alternative, however, that takes its shape at the intersection of the life sciences and informatics. Numerous examples can be identified wherein open source in the life sciences refers to access, sharing and collaboration as informatic practices. This includes open source as an experimental model and as a more sophisticated approach of genetic engineering. The first section discusses the greater flexibly in regard of patenting and the relationship to the introduction of open source in the life sciences. The main argument is that the ownership of knowledge in the life sciences should be reconsidered in the context of the centrality of DNA in informatic formats. This is illustrated by discussing a range of examples of open source models. The second part focuses on open source in synthetic biology as exemplary for the re-materialization of information into food, energy, medicine and so forth. The paper ends by raising the question whether another kind of alternative might be possible: one that looks at open source as a model for an alternative to the commodification of life that is understood as an attempt to comprehensively remove the restrictions from the usage of DNA in any of its formats.

  10. The random energy model in a magnetic field and joint source channel coding

    NASA Astrophysics Data System (ADS)

    Merhav, Neri

    2008-09-01

    We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in Information Theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the distribution of the source in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.

  11. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks †

    PubMed Central

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Due to the fact that WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent works on dynamic packet size control in WSNs allow enhancing the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop where the packet size can vary according to the link quality of the next hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication under the environment, where the packet size changes in each hop, with smaller energy consumption. PMID:27409616
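    The paper's three schemes are not reproduced here, but the general notion of a per-packet source-authentication token can be sketched with a keyed MAC. This is an assumption-laden illustration (μTESLA additionally relies on delayed key disclosure); it also makes the problem visible: a token computed over one packetization stops verifying once the packet is re-sized at the next hop.

```python
import hmac, hashlib

# Illustrative per-packet authentication token: a truncated keyed MAC
# computed over the packet bytes as transmitted.
def make_token(key, packet):
    return hmac.new(key, packet, hashlib.sha256).digest()[:8]

def verify(key, packet, token):
    return hmac.compare_digest(make_token(key, packet), token)
```

If a hop re-packetizes the payload, the bytes covered by the MAC change and verification fails, which is exactly the situation the proposed schemes are designed to handle.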

  12. Experiences with Ada (Trademark) Code Generation

    DTIC Science & Technology

    1984-12-05

    and dynamic objects. In this case, the integer length and the descriptors for the variables dyn_arr1 and dyn_arr2 are static objects. Descriptors are… [a stack-layout fragment showing the virtual origins and offsets of dyn_arr1 and dyn_arr2 is illegible in the scan] … VAX-11/780 Architecture Handbook, Digital Equipment Corporation, 1981. [Cor81] J. R. Cortopassi, "RX, A RIGEL Interpreter", Master's Thesis, Computer…

  13. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    SciTech Connect

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
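    The magnification argument in the abstract is simple projective geometry, and a one-line sketch (function name invented) makes it concrete: treating the coded mask as the effective source plane, an object a distance d1 from the mask casts a shadow on a detector a further d2 away with magnification (d1 + d2) / d1, so shrinking d1 relative to d2 magnifies the image.

```python
# Illustrative coded-source geometry: magnification of the shadow of an
# object at distance d1 from the mask plane, detector a further d2 away.
def system_magnification(d_mask_to_object, d_object_to_detector):
    d1, d2 = d_mask_to_object, d_object_to_detector
    return (d1 + d2) / d1
```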

  14. Building guide : how to build Xyce from source code.

    SciTech Connect

    Keiter, Eric Richard; Russo, Thomas V.; Schiek, Richard Louis; Sholander, Peter E.; Thornquist, Heidi K.; Mei, Ting; Verley, Jason C.

    2013-08-01

    While Xyce uses the Autoconf and Automake system to configure builds, it is often necessary to perform more than the customary "./configure" builds many open source users have come to expect. This document describes the steps needed to get Xyce built on a number of common platforms.

  15. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  16. AdaNET executive summary

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The goal of AdaNET is to transfer existing and emerging software engineering technology from the Federal government to the private sector. The views and perspectives of the current project participants on long and short term goals for AdaNET; organizational structure; resources and returns; summary of identified AdaNET services; and the summary of the organizational model currently under discussion are presented.

  17. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code, the evaluation starts by extracting the words that make up its text and continues by building full-text search queries from combinations of these words. The queries are then run against all the ICD-10 codes until the code in question is returned as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems.
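    The search procedure described, growing word combinations until the target code uniquely tops the ranking, can be sketched with a toy scorer. Everything here is an assumption for illustration; a real engine would use TF-IDF or BM25 rather than raw word counts.

```python
from itertools import combinations

# Toy relevance score: how many times each query word occurs in the text.
def score(query_words, text):
    words = text.lower().split()
    return sum(words.count(w) for w in query_words)

# Smallest combination of the target's own words that ranks the target
# strictly highest among all code descriptions.
def minimal_query(target, codes):
    target_words = target.lower().split()
    for k in range(1, len(target_words) + 1):
        for combo in combinations(target_words, k):
            scores = {c: score(combo, c) for c in codes}
            if all(scores[c] < scores[target] for c in codes if c != target):
                return combo
    return tuple(target_words)
```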

  18. AdaNET research project

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The components necessary for the success of the commercialization of an Ada Technology Transition Network are reported in detail. The organizational plan presents the planned structure for services development and technical transition of AdaNET services to potential user communities. The Business Plan is the operational plan for the AdaNET service as a commercial venture. The Technical Plan is the plan from which the AdaNET can be designed including detailed requirements analysis. Also contained is an analysis of user fees and charges, and a proposed user fee schedule.

  19. Introduction to Image Algebra Ada

    NASA Astrophysics Data System (ADS)

    Wilson, Joseph N.

    1991-07-01

    Image Algebra Ada (IAA) is a superset of the Ada programming language designed to support use of the Air Force Armament Laboratory's image algebra in the development of computer vision application programs. The IAA language differs from other computer vision languages in several respects. It is machine independent, and an IAA translator has been implemented in the military standard Ada language. Its image operands and operations can be used to program a range of both low- and high-level vision algorithms. This paper provides an overview of the image algebra constructs supported in IAA and describes the embodiment of these constructs in the IAA extension of Ada. Examples showing the use of IAA for a range of computer vision tasks are given. The design of IAA as a superset of Ada and the implementation of the initial translator in Ada represent critical choices. The authors discuss the reasoning behind these choices as well as the benefits and drawbacks associated with them. Implementation strategies associated with the use of Ada as an implementation language for IAA are also discussed. While one can look on IAA as a program design language (PDL) for specifying Ada programs, it is useful to consider IAA as a separate language superset of Ada. This admits the possibility of directly translating IAA for implementation on special purpose architectures. This paper explores strategies for porting IAA to various architectures and notes the critical language and implementation features for porting to different architectures.

  20. Transforming AdaPT to Ada9x

    NASA Technical Reports Server (NTRS)

    Goldsack, Stephen J.; Holzbach-Valero, A. A.; Volz, Richard A.; Waldrop, Raymond S.

    1993-01-01

    How the concepts of AdaPT can be transformed into programs using the object-oriented features proposed in the preliminary mapping for Ada9x is described. Emphasizing, as they do, the importance of data types as units of program, these features match well with the development of partitions as translations into Abstract Data Types, which was exploited in the Ada83 translation covered in report R3. By providing a form of polymorphic type, the Ada83 version also gives support for the conformant partition idea, which could be achieved in Ada83 only by using UNCHECKED CONVERSIONS. It is assumed that the reader understands AdaPT itself, but the translation into Ada83 is briefly reviewed by applying it to a small example. This is then used to show how the same translation would be achieved in the 9x version. It is important to appreciate that the distribution features which are proposed in the current mapping are not used or discussed in any detail, as they are not well matched to the AdaPT approach. Critical evaluation and comparison of these approaches is given in a separate report.

  1. Distributed and parallel Ada and the Ada 9X recommendations

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Goldsack, Stephen J.; Theriault, R.; Waldrop, Raymond S.; Holzbacher-Valero, A. A.

    1992-01-01

    Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada 9X, will become the new standard sometime in the 1990s. It is intended that Ada 9X should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are comprised of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.

  2. Distributed and parallel Ada and the Ada 9X recommendations

    SciTech Connect

    Volz, R.A.; Goldsack, S.J.; Theriault, R.; Waldrop, R.S.; Holzbacher-Valero, A.A.

    1992-04-01

    Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada9x, will become the new standard sometime in the 1990s. It is intended that Ada9x should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are comprised of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.

  3. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Etienne, Zachariah B.; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L.

    2015-09-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores.

  4. ADA Guide for Small Businesses.

    ERIC Educational Resources Information Center

    Department of Justice, Washington, DC. Civil Rights Div.

    This guide presents an informal overview of some basic Americans with Disabilities Act (ADA) requirements for small businesses that provide goods or services to the public. References to key sections of the regulations or other information are included. The first section describes the ADA briefly. Section two lists the 12 categories of public…

  5. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  6. Its Ada: An Intelligent Tutoring System for the ADA Programming Language

    DTIC Science & Technology

    1991-12-01

    …interchange programs and programmers and virtually impossible for effective software maintenance (Sammet, 1986, p. 722). In 1975, at the request of DoD… for a Xerox machine exclusively. PROUST receives a complete program produced by the student as input for diagnosis and prints out a comprehensive bug… printed on the screen, the student will be placed directly into the editor to write Ada code in response to the topic problem just presented. The editing…

  7. Joint source coding, transport processing, and error concealment for H.323-based packet video

    NASA Astrophysics Data System (ADS)

    Zhu, Qin-Fan; Kerofsky, Louis

    1998-12-01

    In this paper, we investigate how to adapt different parameters in H.263 source coding, transport processing, and error concealment to optimize end-to-end video quality at different bitrates and packet loss rates for H.323-based packet video. First, different intra coding patterns are compared, and we show that the contiguous rectangle or square block pattern offers the best performance in terms of video quality in the presence of packet loss. Second, the optimal intra coding frequency is found for different bitrates and packet loss rates. The optimal number of GOB headers to be inserted in the source coding is then determined. The effect of transport processing strategies such as packetization and retransmission is also examined. For packetization, the impact of packet size and the effect of macroblock segmentation on picture quality are investigated. Finally, we show that the dejitter buffering delay can be used to advantage for packet loss recovery with video retransmission without incurring any extra delay.
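    The trade-off behind the optimal intra-coding frequency can be seen with a deliberately crude back-of-the-envelope model (entirely an assumption, not the paper's optimization): more frequent intra refresh costs bits but shortens how long a packet loss propagates.

```python
# Toy error-propagation model: a loss corrupts a region until the next
# intra refresh, i.e. on average (intra_period + 1) / 2 frames, so the
# expected number of corrupted frames per transmitted frame is:
def expected_corrupted_frames(loss_rate, intra_period):
    return loss_rate * (intra_period + 1) / 2
```

Raising the intra period lowers the bitrate overhead but increases this figure, which is why the optimum shifts with the packet loss rate.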

  8. Novel joint source-channel coding for wireless transmission of radiography images.

    PubMed

    Watanabe, Katsuhiro; Takizawa, Kenichi; Ikegami, Tetsushi

    2010-01-01

    A wireless technology is required to realize robust transmission of medical images like a radiography image over a noisy environment. The use of an error correction technique is essential for realizing such reliable communication, in which a suitable channel coding is introduced to correct erroneous bits caused by passing through a noisy channel. However, the use of a channel code decreases its efficiency because redundancy bits are also transmitted with information bits. This paper presents a joint source-channel coding which maintains the channel efficiency during transmission of medical images like a radiography image. As test images, we use typical radiography images in this paper. The joint coding technique exploits correlations between pixels of the radiography image. The results show that the proposed joint coding provides the capability to correct erroneous bits without increasing the redundancy of the codeword.

  9. SDI satellite autonomy using AI and Ada

    NASA Technical Reports Server (NTRS)

    Fiala, Harvey E.

    1990-01-01

    The use of Artificial Intelligence (AI) and the programming language Ada to help a satellite recover from selected failures that could lead to mission failure is described. An unmanned satellite will have a separate AI subsystem running in parallel with the normal satellite subsystems. A satellite monitoring subsystem (SMS), under the control of a blackboard system, will continuously monitor selected satellite subsystems to become alert to any actual or potential problems. In the case of loss of communications with the earth or the home base, the satellite will go into a survival mode to reestablish communications with the earth. The use of an AI subsystem in this manner would have avoided the tragic loss of the two recent Soviet probes that were sent to investigate the planet Mars and its moons. The blackboard system works in conjunction with an SMS and a reconfiguration control subsystem (RCS). It can be shown to be an effective way for one central control subsystem to monitor and coordinate the activities and loads of many interacting subsystems that may or may not contain redundant and/or fault-tolerant elements. The blackboard system will be coded in Ada using tools such as the ABLE development system and the Ada Production system.
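    The blackboard pattern itself is easy to sketch. The fragment below (all fact and knowledge-source names invented) shows the control structure the abstract describes: independent knowledge sources read shared state, post conclusions, and a loop runs them to quiescence.

```python
# Minimal blackboard: knowledge sources are functions from the current
# fact set to new facts; the control loop fires them until nothing changes.
class Blackboard:
    def __init__(self, facts=()):
        self.facts = set(facts)

    def run(self, knowledge_sources):
        changed = True
        while changed:
            changed = False
            for ks in knowledge_sources:
                new = ks(self.facts) - self.facts
                if new:
                    self.facts |= new
                    changed = True
        return self.facts

# Hypothetical monitoring source: losing ground contact triggers survival mode.
def comm_monitor(facts):
    return {"enter_survival_mode"} if "comms_lost" in facts else set()
```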

  10. SENR, A Super-Efficient Code for Gravitational Wave Source Modeling: Latest Results

    NASA Astrophysics Data System (ADS)

    Ruchlin, Ian; Etienne, Zachariah; Baumgarte, Thomas

    2017-01-01

    The science we extract from gravitational wave observations will be limited by our theoretical understanding, so with the recent breakthroughs by LIGO, reliable gravitational wave source modeling has never been more critical. Due to efficiency considerations, current numerical relativity codes are very limited in their applicability to direct LIGO source modeling, so it is important to develop new strategies for making our codes more efficient. We introduce SENR, a Super-Efficient, open-development numerical relativity (NR) code aimed at improving the efficiency of moving-puncture-based LIGO gravitational wave source modeling by 100x. SENR builds upon recent work, in which the BSSN equations are evolved in static spherical coordinates, to allow dynamical coordinates with arbitrary spatial distributions. The physical domain is mapped to a uniform-resolution grid on which derivative operations are approximated using standard central finite difference stencils. The source code is designed to be human-readable, efficient, parallelized, and readily extensible. We present the latest results from the SENR code.

  11. Joint source-channel coding for wireless object-based video communications utilizing data hiding.

    PubMed

    Wang, Haohong; Tsaftaris, Sotirios A; Katsaggelos, Aggelos K

    2006-08-01

    In recent years, joint source-channel coding for multimedia communications has gained increased popularity. However, very limited work has been conducted to address the problem of joint source-channel coding for object-based video. In this paper, we propose a data hiding scheme that improves the error resilience of object-based video by adaptively embedding the shape and motion information into the texture data. Within a rate-distortion theoretical framework, the source coding, channel coding, data embedding, and decoder error concealment are jointly optimized based on knowledge of the transmission channel conditions. Our goal is to achieve the best video quality as expressed by the minimum total expected distortion. The optimization problem is solved using Lagrangian relaxation and dynamic programming. The performance of the proposed scheme is tested using simulations of a Rayleigh-fading wireless channel, and the algorithm is implemented based on the MPEG-4 verification model. Experimental results indicate that the proposed hybrid source-channel coding scheme significantly outperforms methods without data hiding or unequal error protection.
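    The Lagrangian relaxation the authors apply turns the constrained rate-distortion problem into independent per-block decisions: each block picks the mode minimizing J = D + λR. A minimal sketch of that core idea follows; the mode tables and λ values are invented for illustration, not taken from the paper.

```python
# Lagrangian rate-distortion optimization: for each block, choose the
# coding mode with minimal Lagrangian cost J = D + lambda * R.

def allocate(blocks, lam):
    """blocks: list of per-block mode tables [(rate_bits, distortion), ...].
    Returns the index of the chosen mode for each block."""
    choices = []
    for modes in blocks:
        best = min(range(len(modes)),
                   key=lambda m: modes[m][1] + lam * modes[m][0])
        choices.append(best)
    return choices

# Two blocks, three modes each: (bits, squared-error distortion).
blocks = [
    [(32, 100.0), (16, 220.0), (8, 500.0)],
    [(32, 40.0), (16, 90.0), (8, 200.0)],
]

print(allocate(blocks, lam=0.1))   # cheap bits: low-distortion modes win
print(allocate(blocks, lam=50.0))  # expensive bits: low-rate modes win
```

In the full scheme, sweeping λ (and adding dynamic programming for decisions that interact, such as embedding and concealment) traces out the operational rate-distortion curve under the channel model.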

  12. Ada Compiler Validation Summary Report: NYU Ada/ED, Version 19.7 V-001.

    DTIC Science & Technology

    1983-04-11

    Validation Summary Report for NYU Ada/ED, April 11, 1983.

  13. Plug-in to Eclipse environment for VHDL source code editor with advanced formatting of text

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-10-01

    The paper describes the idea and realization of a smart plug-in to the Eclipse software environment. The plug-in is intended for editing VHDL source code. It considerably extends the capabilities of the VEditor program, which is distributed under an open license. Results of the formatting procedures performed on chosen examples of VHDL source code are presented. The work is part of a bigger project of building a smart programming environment for the design of advanced photonic and electronic systems. Examples of such systems are quoted in the references.

  14. SOURCES 4C : a code for calculating ([alpha],n), spontaneous fission, and delayed neutron sources and spectra.

    SciTech Connect

    Wilson, W. B.; Perry, R. T.; Shores, E. F.; Charlton, W. S.; Parish, Theodore A.; Estes, G. P.; Brown, T. H.; Arthur, Edward D. ,; Bozoian, Michael; England, T. R.; Madland, D. G.; Stewart, J. E.

    2002-01-01

    SOURCES 4C is a computer code that determines neutron production rates and spectra from ({alpha},n) reactions, spontaneous fission, and delayed neutron emission due to radionuclide decay. The code is capable of calculating ({alpha},n) source rates and spectra in four types of problems: homogeneous media (i.e., an intimate mixture of {alpha}-emitting source material and low-Z target material), two-region interface problems (i.e., a slab of {alpha}-emitting source material in contact with a slab of low-Z target material), three-region interface problems (i.e., a thin slab of low-Z target material sandwiched between {alpha}-emitting source material and low-Z target material), and ({alpha},n) reactions induced by a monoenergetic beam of {alpha}-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 44 actinides. The ({alpha},n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 107 nuclide decay {alpha}-particle spectra, 24 sets of measured and/or evaluated ({alpha},n) cross sections and product nuclide level branching fractions, and functional {alpha}-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code provides the magnitude and spectra, if desired, of the resultant neutron source, in addition to an analysis of the contributions by each nuclide in the problem. LASTCALL, a graphical user interface, is included in the code package.
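    The Watt spectrum mentioned above has the form N(E) ∝ exp(-E/a) sinh(sqrt(bE)). A small sketch of evaluating and normalizing it numerically follows; the a, b values are commonly quoted for 252Cf spontaneous fission and are illustrative only, since SOURCES carries its own evaluated parameters per actinide.

```python
import math

# Watt fission spectrum N(E) ~ exp(-E/a) * sinh(sqrt(b*E)), E in MeV.
# a, b below: commonly quoted 252Cf spontaneous-fission values (illustrative).

def watt(E, a=1.025, b=2.926):
    return math.exp(-E / a) * math.sinh(math.sqrt(b * E))

# Normalize on a fine grid and compute the mean emitted neutron energy.
dE = 0.001
grid = [i * dE for i in range(1, 20000)]       # 0.001 .. 20 MeV
norm = sum(watt(E) for E in grid) * dE
mean = sum(E * watt(E) for E in grid) * dE / norm
print(round(mean, 2))  # close to the analytic mean 3a/2 + a^2*b/4
```

The analytic mean 3a/2 + a²b/4 gives about 2.3 MeV for these parameters, a useful sanity check on the numerical integration.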

  15. Non-Uniform Contrast and Noise Correction for Coded Source Neutron Imaging

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R

    2012-01-01

    Since the first application of neutron radiography in the 1930s, the field has matured enough to support several applications. However, advances in the technology are far from concluded. In general, the resolution of scintillator-based detection systems is limited to the 10 μm range, and the relatively low count rate of neutron sources compared to other illumination sources restricts time-resolved measurement. One path toward improved resolution is the use of magnification; however, to date neutron optics are inefficient, expensive, and difficult to develop. There is a clear demand for cost-effective scintillator-based neutron imaging systems that achieve resolutions of 1 μm or less; such an imaging system would dramatically extend the applications of neutron imaging. For this purpose a coded source imaging system is under development. The current challenge is to reduce artifacts in the reconstructed coded source images. Artifacts are generated by non-uniform illumination of the source, gamma rays, dark current at the imaging sensor, and system noise from the reconstruction kernel. In this paper, we describe how to pre-process the coded signal to reduce noise and non-uniform illumination, and how to reconstruct the coded signal with three reconstruction methods: correlation, maximum likelihood estimation, and the algebraic reconstruction technique. We illustrate our results with experimental examples.

  16. Neutron imaging with coded sources: new challenges and the implementation of a simultaneous iterative reconstruction technique

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2013-01-01

    The limitations in neutron flux and resolution (L/D) of current neutron imaging systems can be addressed with a Coded Source Imaging system with magnification (xCSI). More precisely, the multiple sources in an xCSI system can exceed the flux of a single pinhole system by several orders of magnitude, while maintaining a higher L/D with the small sources. Moreover, designing for an xCSI system reduces noise from neutron scattering, because the object is placed away from the detector to achieve magnification. However, xCSI systems are adversely affected by correlated noise such as non-uniform illumination of the neutron source, incorrect sampling of the coded radiograph, misalignment of the coded masks, mask transparency, and imperfection of the system Point Spread Function (PSF). We argue that a model-based reconstruction algorithm can overcome these problems and describe the implementation of a Simultaneous Iterative Reconstruction Technique algorithm for coded sources. Design pitfalls that preclude a satisfactory reconstruction are documented.
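    The Simultaneous Iterative Reconstruction Technique named above can be shown on a toy linear system A x = b. The matrix, data, and iteration count below are invented for illustration; in coded-source imaging, A would be the model of the coded mask and system PSF.

```python
# SIRT: repeatedly back-project the row-normalized residual,
# column-normalized, onto the current estimate.

A = [[1.0, 1.0], [1.0, 2.0], [2.0, 1.0]]
x_true = [1.0, 2.0]
b = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(3)]

row_inv = [1.0 / sum(abs(v) for v in row) for row in A]                 # R
col_inv = [1.0 / sum(abs(A[i][j]) for i in range(3)) for j in range(2)] # C

x = [0.0, 0.0]
for _ in range(500):
    # weighted residual r = R (b - A x)
    r = [row_inv[i] * (b[i] - sum(A[i][j] * x[j] for j in range(2)))
         for i in range(3)]
    # SIRT update: x += C A^T r
    for j in range(2):
        x[j] += col_inv[j] * sum(A[i][j] * r[i] for i in range(3))

print([round(v, 3) for v in x])  # converges toward [1.0, 2.0]
```

The row/column normalizations are what distinguish SIRT from plain Landweber iteration and give it its robustness to inconsistent, noisy data.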

  17. Compiling knowledge-based systems from KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications; most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBSs developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has yielded various insights on Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in early work, and inspiration for future system development.

  18. 3-D localization of gamma ray sources with coded apertures for medical applications

    NASA Astrophysics Data System (ADS)

    Kaissas, I.; Papadimitropoulos, C.; Karafasoulis, K.; Potiriadis, C.; Lambropoulos, C. P.

    2015-09-01

    Several small gamma cameras for radioguided surgery using CdTe or CdZnTe have parallel-hole or pinhole collimators. Coded aperture imaging is a well-known method for gamma ray source directional identification, applied mainly in astrophysics. The increase in efficiency due to the substitution of the collimators by coded masks renders the method attractive for gamma probes used in radioguided surgery. We have constructed and operationally verified a setup consisting of two CdTe gamma cameras with Modified Uniformly Redundant Array (MURA) coded aperture masks of rank 7 and 19, and a video camera. The 3-D position of point-like radioactive sources is estimated via triangulation using decoded images acquired by the gamma cameras. We have also developed code for both fast and detailed simulations, and we have verified the agreement between experimental results and simulations. In this paper we present a simulation study for the spatial localization of two point sources using coded aperture masks of rank 7 and 19.
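    A MURA mask of prime rank, such as the rank-7 mask above, follows a standard quadratic-residue construction. The sketch below is the textbook definition, not the paper's hardware layout.

```python
# Modified Uniformly Redundant Array (MURA) of prime rank p:
# row 0 closed; column 0 open (except the corner); interior cell (i, j)
# open iff i and j are both quadratic residues mod p or both non-residues.

def mura(p):
    residues = {(k * k) % p for k in range(1, p)}
    c = lambda i: 1 if i in residues else -1
    mask = [[0] * p for _ in range(p)]
    for i in range(p):
        for j in range(p):
            if i == 0:
                mask[i][j] = 0
            elif j == 0:
                mask[i][j] = 1
            else:
                mask[i][j] = 1 if c(i) * c(j) == 1 else 0
    return mask

m = mura(7)
print(sum(map(sum, m)))  # (p*p - 1) / 2 = 24 open elements of 49
```

The near-50% open fraction is what gives coded apertures their efficiency advantage over a single pinhole while keeping a sharp decoding correlation.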

  19. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large-scale applications can be substantially automated, but the size and complexity can exceed the time and memory available on conventional desktop computers; most commercial tools are understandably focused on such conventional desktop resources. This paper presents research on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations, and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the same infrastructure but has not yet been developed into an equivalent final tool.

  20. SOURCES 4A: A Code for Calculating (alpha,n), Spontaneous Fission, and Delayed Neutron Sources and Spectra

    SciTech Connect

    Madland, D.G.; Arthur, E.D.; Estes, G.P.; Stewart, J.E.; Bozoian, M.; Perry, R.T.; Parish, T.A.; Brown, T.H.; England, T.R.; Wilson, W.B.; Charlton, W.S.

    1999-09-01

    SOURCES 4A is a computer code that determines neutron production rates and spectra from ({alpha},n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides. The code is capable of calculating ({alpha},n) source rates and spectra in four types of problems: homogeneous media (i.e., a mixture of {alpha}-emitting source material and low-Z target material), two-region interface problems (i.e., a slab of {alpha}-emitting source material in contact with a slab of low-Z target material), three-region interface problems (i.e., a thin slab of low-Z target material sandwiched between {alpha}-emitting source material and low-Z target material), and ({alpha},n) reactions induced by a monoenergetic beam of {alpha}-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The ({alpha},n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay {alpha}-particle spectra, 24 sets of measured and/or evaluated ({alpha},n) cross sections and product nuclide level branching fractions, and functional {alpha}-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron source. It also provides an analysis of the contributions to that source by each nuclide in the problem.

  1. Documentation for grants equal to tax model: Volume 3, Source code

    SciTech Connect

    Boryczka, M.K.

    1986-01-01

    The GETT model is capable of forecasting the amount of tax liability associated with all property owned and all activities undertaken by the US Department of Energy (DOE) in site characterization and repository development. The GETT program is a user-friendly, menu-driven model developed using dBASE III/trademark/, a relational data base management system. The data base for GETT consists primarily of eight separate dBASE III/trademark/ files corresponding to each of the eight taxes (real property, personal property, corporate income, franchise, sales, use, severance, and excise) levied by State and local jurisdictions on business property and activity. Additional smaller files help to control model inputs and reporting options. Volume 3 of the GETT model documentation is the source code. The code is arranged primarily by the eight tax types. Other code files include those for JURISDICTION, SIMULATION, VALIDATION, TAXES, CHANGES, REPORTS, GILOT, and GETT. The code has been verified through hand calculations.

  2. Detection and Location of Gamma-Ray Sources with a Modulating Coded Mask

    SciTech Connect

    Anderson, Dale N.; Stromswold, David C.; Wunschel, Sharon C.; Peurrung, Anthony J.; Hansen, Randy R.

    2006-01-31

    This paper presents methods of detecting and locating a concealed nuclear gamma-ray source with a coded aperture mask. Energetic gamma rays readily penetrate moderate amounts of shielding material and can be detected at distances of many meters. The detection of high energy gamma-ray sources is vitally important to national security for several reasons, including nuclear materials smuggling interdiction, monitoring weapon components under treaties, and locating nuclear weapons and materials in the possession of terrorist organizations.

  3. Ada Linear-Algebra Program

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.; Lawson, C. L.

    1988-01-01

    Routines provided for common scalar, vector, matrix, and quaternion operations. Computer program extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as software for Space Station.
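    Quaternion rotation of a 3-vector is the kind of primitive such an avionics linear-algebra library provides. The sketch below uses the Hamilton (w, x, y, z) convention and is illustrative only, not the NASA package's API.

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * (0, v) * conj(q)."""
    w, x, y, z = qmul(qmul(q, (0.0, *v)), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

# A 90-degree rotation about the z-axis maps (1, 0, 0) to (0, 1, 0).
h = math.sqrt(0.5)
print(tuple(round(c, 6) for c in rotate((h, 0.0, 0.0, h), (1.0, 0.0, 0.0))))
```

Quaternions avoid the gimbal-lock singularities of Euler angles, which is why attitude software in languages like HAL/S and Ada has long favored them.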

  4. VizieR Online Data Catalog: Transiting planets search Matlab/Octave source code (Ofir+, 2014)

    NASA Astrophysics Data System (ADS)

    Ofir, A.

    2014-01-01

    The Matlab/Octave source code for Optimal BLS is made available here. Detailed descriptions of all inputs and outputs are given by comment lines in the file. Note: Octave does not currently support parallel for loops ("parfor"). Octave users therefore need to change the "parfor" command (line 217 of OptimalBLS.m) to "for". (7 data files).

  5. Ada (R) assessment: An important issue within European Columbus Support Technology Programme

    NASA Technical Reports Server (NTRS)

    Vielcanet, P.

    1986-01-01

    Software will be more important and more critical for Columbus than for any previous ESA project. As a simple comparison, overall software size was in the range of 100 K source statements for EXOSAT and 500 K for Spacelab, and will probably reach several million lines of code for Columbus (all elements together). Based on past experience, software can account for about 10 to 15 percent of total space project development cost. Ada technology may support the strong software engineering principles needed for Columbus, provided the technology is sufficiently mature and industry plans meet the Columbus project schedule. Over the past 3 years, Informatique Internationale has conducted a coherent program of Ada technology assessment studies and experiments for ESA and CNES. This research and development program benefits from 15 years' experience in space software development and is supported by the company's overall software engineering expertise. The assessment and experiments of Ada software engineering by Informatique Internationale are detailed.

  6. Ada To X-Window Bindings

    NASA Technical Reports Server (NTRS)

    Souleles, Dean

    1993-01-01

    Ada to X-Window Bindings computer program developed to provide Ada programmers with complete interfaces to Xt Intrinsics and OSF Motif toolkits. Provides "Ada view" of some mostly C-language programming libraries. Package of software written in Ada and C languages.

  7. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter (WEC) performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation and, as a result, are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable power take-off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be presented.
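    The Cummins impulse-response formulation named above can be written, for a single degree of freedom and with generic symbols (not WEC-Sim's internal notation), as:

```latex
(m + A_\infty)\,\ddot{x}(t) = F_{\mathrm{exc}}(t)
  - \int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau
  - C_{\mathrm{hs}}\,x(t) + F_{\mathrm{PTO}}(t)
```

where $A_\infty$ is the added mass at infinite frequency, $K$ the radiation impulse-response kernel, $C_{\mathrm{hs}}$ the hydrostatic restoring coefficient, $F_{\mathrm{exc}}$ the wave excitation force, and $F_{\mathrm{PTO}}$ the power take-off force; the code solves the 6-degree-of-freedom matrix form of this equation.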

  8. Paranoia.Ada: Sample output reports

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Paranoia.Ada is a program to diagnose floating point arithmetic in the context of the Ada programming language. The program evaluates the quality of a floating point arithmetic implementation with respect to the proposed IEEE Standards P754 and P854. Paranoia.Ada is derived from the original BASIC programming language version of Paranoia. The Paranoia.Ada replicates in Ada the test algorithms originally implemented in BASIC and adheres to the evaluation criteria established by W. M. Kahan. Paranoia.Ada incorporates a major structural redesign and employs applicable Ada architectural and stylistic features.

  9. SOURCES-3A: A code for calculating ({alpha}, n), spontaneous fission, and delayed neutron sources and spectra

    SciTech Connect

    Perry, R.T.; Wilson, W.B.; Charlton, W.S.

    1998-04-01

    In many systems, it is imperative to have accurate knowledge of all significant sources of neutrons due to the decay of radionuclides. These sources can include neutrons resulting from the spontaneous fission of actinides, the interaction of actinide decay {alpha}-particles in ({alpha},n) reactions with low- or medium-Z nuclides, and/or delayed neutrons from the fission products of actinides. Numerous systems exist in which these neutron sources could be important. These include, but are not limited to, clean and spent nuclear fuel (UO{sub 2}, ThO{sub 2}, MOX, etc.), enrichment plant operations (UF{sub 6}, PuF{sub 4}, etc.), waste tank studies, waste products in borosilicate glass or glass-ceramic mixtures, and weapons-grade plutonium in storage containers. SOURCES-3A is a computer code that determines neutron production rates and spectra from ({alpha},n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides in homogeneous media (i.e., a mixture of {alpha}-emitting source material and low-Z target material) and in interface problems (i.e., a slab of {alpha}-emitting source material in contact with a slab of low-Z target material). The code is also capable of calculating the neutron production rates due to ({alpha},n) reactions induced by a monoenergetic beam of {alpha}-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The ({alpha},n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay {alpha}-particle spectra, 24 sets of measured and/or evaluated ({alpha},n) cross sections and product nuclide level branching fractions, and functional {alpha}-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. 
The code outputs the magnitude and spectra of the resultant neutron source.

  10. Storage management in Ada. Three reports. Volume 1: Storage management in Ada as a risk to the development of reliable software. Volume 2: Relevant aspects of language. Volume 3: Requirements of the language versus manifestations of current implementations

    NASA Technical Reports Server (NTRS)

    Auty, David

    1988-01-01

    The risk to the development of program reliability derives from the use of a new language and from the potential use of new storage management techniques. With Ada and associated support software, there is a lack of established guidelines and procedures, drawn from experience and common usage, which assure reliable behavior. The risk is identified and clarified. In order to provide a framework for future consideration of dynamic storage management in Ada, a description of the relevant aspects of the language is presented in two sections: program data sources, and declaration and allocation in Ada. Storage-management characteristics of the Ada language and storage-management characteristics of Ada implementations are differentiated, and the terms used are defined in a narrow and precise sense. The storage-management implications of the Ada language are described, as are the storage-management options available to the Ada implementor and the implications of the implementor's choice for the Ada programmer.

  11. A plug-in to Eclipse for VHDL source codes: functionalities

    NASA Astrophysics Data System (ADS)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. An implementation is described, based on VEditor, a free-license program; the work presented in this paper thus supplements and extends that free tool. The introduction briefly characterizes tools available on the market which aid the design of electronic systems in VHDL, with particular attention paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the written plug-in are presented, such as the programming extension concept and the results of the formatter, re-factorizer, code hider, and other new additions to the VEditor program.

  12. Benchmarking Defmod, an open source FEM code for modeling episodic fault rupture

    NASA Astrophysics Data System (ADS)

    Meng, Chunfang

    2017-03-01

    We present Defmod, an open source (linear) finite element code that enables us to efficiently model crustal deformation due to (quasi-)static and dynamic loadings, poroelastic flow, viscoelastic flow, and frictional fault slip. Ali (2015) provides the original code, introducing an implicit solver for the (quasi-)static problem and an explicit solver for the dynamic problem, with the fault constraint implemented via a Lagrange multiplier. Meng (2015) combines these two solvers into a hybrid solver that uses failure criteria and friction laws to adaptively switch between the (quasi-)static and dynamic states. The code is capable of modeling episodic fault rupture driven by quasi-static loadings, e.g. due to reservoir fluid withdrawal or injection. Here, we focus on benchmarking the Defmod results against established results.

  13. Ada Implementation Guide. Software Engineering With Ada. Volume 1

    DTIC Science & Technology

    1994-04-01

    ...teaching, the student is less likely to readily adopt new, more powerful ways of accomplishing old tasks... Capability Maturity Model (CMU/SEI-92-TR-25). Pittsburgh, PA: Carnegie Mellon University, 1992... Boehm, B.W. Software Engineering Economics... Contrast: Ada 9X and C++, Schonberg, E. New York University, 1992 (Distributed by Ada IC)...

  14. HELIOS-R: An Ultrafast, Open-Source Retrieval Code For Exoplanetary Atmosphere Characterization

    NASA Astrophysics Data System (ADS)

    LAVIE, Baptiste

    2015-12-01

    Atmospheric retrieval is a growing, new approach in the theory of exoplanet atmosphere characterization. Unlike self-consistent modeling, it allows us to fully explore the parameter space, as well as the degeneracies between the parameters, using a Bayesian framework. We present HELIOS-R, a very fast retrieval code written in Python and optimized for GPU computation. Once it is ready, HELIOS-R will be the first open-source atmospheric retrieval code accessible to the exoplanet community. As the new generation of direct imaging instruments (SPHERE, GPI) has started to gather data, the first version of HELIOS-R focuses on emission spectra. We use a 1D two-stream forward model for computing fluxes and couple it to an analytical temperature-pressure profile that is constructed to be in radiative equilibrium. We use our ultra-fast opacity calculator HELIOS-K (also open-source) to compute the opacities of CO2, H2O, CO and CH4 from the HITEMP database. We test both opacity sampling (which is typically used by other workers) and the method of k-distributions. Using this setup, we compute a grid of synthetic spectra and temperature-pressure profiles, which is then explored using a nested sampling algorithm. By focusing on model selection (Occam's razor) through the explicit computation of the Bayesian evidence, nested sampling allows us to deal with current sparse data as well as upcoming high-resolution observations. Once the best model is selected, HELIOS-R provides posterior distributions of the parameters. As a test of our code, we studied the HR 8799 system and compared our results with the previous analysis of Lee, Heng & Irwin (2013), which used the proprietary NEMESIS retrieval code. HELIOS-R and HELIOS-K are part of the set of open-source community codes we named the Exoclimes Simulation Platform (www.exoclime.org).
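    The model selection step described above rests on the Bayesian evidence that nested sampling computes explicitly; schematically:

```latex
\mathcal{Z} = \int \mathcal{L}(\theta)\,\pi(\theta)\,\mathrm{d}\theta,
\qquad
B_{12} = \frac{\mathcal{Z}_1}{\mathcal{Z}_2}
```

where $\mathcal{L}$ is the likelihood, $\pi$ the prior, $\mathcal{Z}$ the evidence, and $B_{12}$ the Bayes factor comparing models 1 and 2; the model with the larger evidence is preferred, which automatically penalizes superfluous parameters (Occam's razor).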

  15. Can space station software be specified through Ada?

    NASA Technical Reports Server (NTRS)

    Knoebel, Arthur

    1987-01-01

    Programming of the space station is to be done in the Ada programming language. A breadboard of selected parts of the work package for Marshall Space Flight Center is to be built, and programming this small part will be a good testing ground for Ada. One coding of the upper levels of the design brings out several problems with top-down design when it is to be carried out strictly within the language. Ada is evaluated on the basis of this experience, and the points raised are compared with other experience as related in the literature. Rapid prototyping is another approach to the initial programming; several different types of prototypes are discussed, and compared with the art of specification. Some solutions are proposed and a number of recommendations presented.

  16. Beyond the Business Model: Incentives for Organizations to Publish Software Source Code

    NASA Astrophysics Data System (ADS)

    Lindman, Juho; Juutilainen, Juha-Pekka; Rossi, Matti

    The software stack opened under Open Source Software (OSS) licenses is growing rapidly, and commercial actors have released considerable amounts of previously proprietary source code. These actions raise the question: why do companies choose a strategy based on giving away software assets? Research on the outbound OSS approach has tried to answer this question with the concept of the “OSS business model”. When studying the reasons for code release, we have observed that the business model concept is too generic to capture the many incentives organizations have. In this paper we therefore investigate empirically what the companies' incentives are, by means of an exploratory case study of three organizations in different stages of their code release. Our results indicate that the companies aim to promote standardization, obtain development resources, gain cost savings, improve the quality of software, increase the trustworthiness of software, or steer OSS communities. We conclude that future research on outbound OSS could benefit from focusing on the heterogeneous incentives for code release rather than on revenue models.

  17. An Adaptive Source-Channel Coding with Feedback for Progressive Transmission of Medical Images

    PubMed Central

    Lo, Jen-Lung; Sanei, Saeid; Nazarpour, Kianoush

    2009-01-01

    A novel adaptive source-channel coding scheme with feedback for progressive transmission of medical images is proposed here. In the source coding part, transmission starts from the region of interest (RoI). The parity length of the channel code varies with both the proximity of the image sub-block to the RoI and the channel noise, which is iteratively estimated at the receiver. The overall amount of transmitted data can be controlled by the user (clinician). In medical data transmission it is vital to keep the distortion level under control, as in most cases certain clinically important regions have to be transmitted without any visible error. The proposed system significantly reduces transmission time and error. Moreover, the system is very user friendly, since the selection of the RoI, its size, the overall code rate, and a number of test features such as the noise level can be set by the users at both ends. A MATLAB-based TCP/IP connection has been established to demonstrate the proposed interactive and adaptive progressive transmission system. The proposed system is simulated for both the binary symmetric channel (BSC) and the Rayleigh channel. The experimental results verify the effectiveness of the design. PMID:19190770
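    The abstract does not give the parity-allocation rule in closed form; the sketch below is a hypothetical illustration of the idea it describes, with made-up names and weighting: parity grows as a sub-block gets closer to the RoI and as the estimated channel noise rises.

    ```python
    def parity_length(dist_to_roi, noise_est, base=8, max_extra=24):
        """Parity bits for one image sub-block (all names and weights hypothetical).

        Protection strengthens as the block nears the region of interest
        (dist_to_roi -> 0) and as the estimated channel noise rises.
        """
        if not 0.0 <= noise_est <= 1.0:
            raise ValueError("noise_est must lie in [0, 1]")
        proximity = 1.0 / (1.0 + dist_to_roi)  # 1 at the RoI, decays outward
        return base + round(max_extra * proximity * noise_est)
    ```

    With this shape, a block at the RoI over a very noisy channel gets the maximum parity budget, while distant blocks over a clean channel fall back to the base rate.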

  18. CACTI: Free, Open-Source Software for the Sequential Coding of Behavioral Interactions

    PubMed Central

    Glynn, Lisa H.; Hallgren, Kevin A.; Houck, Jon M.; Moyers, Theresa B.

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713

  19. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  20. Preliminary study of nuclear fuel element testing based on coded source neutron imaging

    SciTech Connect

    Sheng Wang; Hang Li; Chao Cao; Yang Wu; Heyong Huo; Bin Tang

    2015-07-01

    Neutron radiography (NR) is one of the most important nondestructive testing methods and is sensitive to low-density materials. In particular, neutron transfer imaging can be used to test radioactive materials while avoiding γ-ray interference, but it is difficult to extend to tomography. Coded source neutron imaging (CSNI) is a new NR method that has developed rapidly in the last several years. The object-to-detector distance is much longer than in traditional NR, which makes the technique suitable for radioactive materials. With a pre-reconstruction process applied to the fold-cover projections, CSNI can readily realize tomography. This work carries out a preliminary study of nuclear fuel element testing by coded source neutron imaging. We use Monte Carlo simulation to model CSNI tests of nuclear fuel elements with different enrichments, flaws, and activities. The results show that CSNI could be a useful method for nuclear fuel element testing. (authors)

  1. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.

  2. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    SciTech Connect

    Carbajo, J.J.

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  3. The NASA Langley Research Center 0.3-meter transonic cryogenic tunnel microcomputer controller source code

    NASA Technical Reports Server (NTRS)

    Kilgore, W. Allen; Balakrishna, S.

    1991-01-01

    The 0.3 m Transonic Cryogenic Tunnel (TCT) microcomputer-based controller has been operating for several thousand hours in a safe and efficient manner. A complete listing is provided of the source codes for the tunnel controller and the tunnel simulator, together with a listing of all the variables used in these programs. Several changes made to the controller are described; these improve the controller's ease of use and safety.

  4. AN OPEN-SOURCE NEUTRINO RADIATION HYDRODYNAMICS CODE FOR CORE-COLLAPSE SUPERNOVAE

    SciTech Connect

    O’Connor, Evan

    2015-08-15

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino–matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  5. An Open-source Neutrino Radiation Hydrodynamics Code for Core-collapse Supernovae

    NASA Astrophysics Data System (ADS)

    O'Connor, Evan

    2015-08-01

    We present an open-source update to the spherically symmetric, general-relativistic hydrodynamics, core-collapse supernova (CCSN) code GR1D. The source code is available at http://www.GR1Dcode.org. We extend its capabilities to include a general-relativistic treatment of neutrino transport based on the moment formalisms of Shibata et al. and Cardall et al. We pay special attention to implementing and testing numerical methods and approximations that lessen the computational demand of the transport scheme by removing the need to invert large matrices. This is especially important for the implementation and development of moment-like transport methods in two and three dimensions. A critical component of neutrino transport calculations is the neutrino-matter interaction coefficients that describe the production, absorption, scattering, and annihilation of neutrinos. In this article we also describe our open-source neutrino interaction library NuLib (available at http://www.nulib.org). We believe that an open-source approach to describing these interactions is one of the major steps needed to progress toward robust models of CCSNe and robust predictions of the neutrino signal. We show, via comparisons to full Boltzmann neutrino-transport simulations of CCSNe, that our neutrino transport code performs remarkably well. Furthermore, we show that the methods and approximations we employ to increase efficiency do not decrease the fidelity of our results. We also test the ability of our general-relativistic transport code to model failed CCSNe by evolving a 40-solar-mass progenitor to the onset of collapse to a black hole.

  6. AdaNET research plan

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1990-01-01

    The mission of the AdaNET research effort is to determine how to increase the availability of reusable Ada components and associated software engineering technology to both private and Federal sectors. The effort is structured to define the requirements for transfer of Federally developed software technology, study feasible approaches to meeting the requirements, and to gain experience in applying various technologies and practices. The overall approach to the development of the AdaNET System Specification is presented. A work breakdown structure is presented with each research activity described in detail. The deliverables for each work area are summarized. The overall organization and responsibilities for each research area are described. The schedule and necessary resources are presented for each research activity. The estimated cost is summarized for each activity. The project plan is fully described in the Super Project Expert data file contained on the floppy disk attached to the back cover of this plan.

  7. Multiprocessor performance modeling with ADAS

    NASA Technical Reports Server (NTRS)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  8. Modification of source contribution in PALS by simulation using Geant4 code

    NASA Astrophysics Data System (ADS)

    Ning, Xia; Cao, Xingzhong; Li, Chong; Li, Demin; Zhang, Peng; Gong, Yihao; Xia, Rui; Wang, Baoyi; Wei, Long

    2017-04-01

    The contribution of the positron source to the results of a positron annihilation lifetime spectrum (PALS) is simulated using the Geant4 code. The PALS measurement system has a sandwich geometry: the 22Na radiation source is encapsulated by Kapton films, and the specimens are attached to the outside of the films. The probabilities of a positron annihilating in the films or in the targets, and the effect of positrons reflected back from the specimen surface, are simulated. The probability of a positron annihilating in the film depends on the target species and the source film thickness. The simulation results are in reasonable agreement with the available experimental data. Thus, correcting the source contribution with Geant4 calculations is viable, and it is beneficial for the analysis of PALS results.

  9. Ada Implementation Guide. Software Engineering With Ada. Volume 2

    DTIC Science & Technology

    1994-04-01

    Listed contents include: Standards and Technology; DON Software Executive Official; DON Ada Representative; Cost Analysis; Software Technology Support Center; Software Engineering Institute; Software Technology for Adaptable, Reliable Systems (STARS); and Training.

  10. Software reuse issues affecting AdaNET

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1989-01-01

    The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to develop AdaNet. A three-phased strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, and administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based; this would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.

  11. REBOUND: an open-source multi-purpose N-body code for collisional dynamics

    NASA Astrophysics Data System (ADS)

    Rein, H.; Liu, S.-F.

    2012-01-01

    REBOUND is a new multi-purpose N-body code which is freely available under an open-source license. It was designed for collisional dynamics such as planetary rings but can also solve the classical N-body problem. It is highly modular and can be customized easily to work on a wide variety of different problems in astrophysics and beyond. REBOUND comes with three symplectic integrators: leap-frog, the symplectic epicycle integrator (SEI) and a Wisdom-Holman mapping (WH). It supports open, periodic and shearing-sheet boundary conditions. REBOUND can use a Barnes-Hut tree to calculate both self-gravity and collisions. These modules are fully parallelized with MPI as well as OpenMP. The former makes use of a static domain decomposition and a distributed essential tree. Two new collision detection modules based on a plane-sweep algorithm are also implemented. The performance of the plane-sweep algorithm is superior to that of a tree code for simulations in which one dimension is much longer than the other two and for quasi-two-dimensional simulations with fewer than one million particles. In this work, we discuss the different algorithms implemented in REBOUND, the philosophy behind the code's structure, and implementation-specific details of the different modules. We present results of accuracy and scaling tests which show that the code can run efficiently on both desktop machines and large computing clusters.
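    Of the three symplectic integrators named in the abstract, leap-frog is simple enough to sketch from its standard kick-drift-kick form (this is the textbook scheme, not REBOUND's actual C implementation):

    ```python
    def leapfrog(pos, vel, accel, dt, n_steps):
        """Kick-drift-kick leap-frog integration of dx/dt = v, dv/dt = accel(x).

        The textbook second-order symplectic scheme, shown for a single scalar
        degree of freedom for clarity.
        """
        for _ in range(n_steps):
            vel += 0.5 * dt * accel(pos)  # half kick
            pos += dt * vel               # full drift
            vel += 0.5 * dt * accel(pos)  # half kick
        return pos, vel
    ```

    For a harmonic oscillator (`accel = lambda x: -x`) the scheme's energy error stays bounded rather than drifting, which is why symplectic integrators suit long planetary-ring and N-body runs.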

  12. Interpreting observations of molecular outflow sources: the MHD shock code mhd_vode

    NASA Astrophysics Data System (ADS)

    Flower, D. R.; Pineau des Forêts, G.

    2015-06-01

    The planar MHD shock code mhd_vode has been developed in order to simulate both continuous (C) type shock waves and jump (J) type shock waves in the interstellar medium. The physical and chemical state of the gas in steady state may also be computed and used as input to a shock wave model. The code is written principally in FORTRAN 90, although some routines remain in FORTRAN 77. The documented program and its input data are described and provided as supplementary material, and the results of exemplary test runs are presented. Our intention is to enable the interested user to run the code for any sensible parameter set and to comprehend the results. With applications to molecular outflow sources in mind, we have computed, and are making available as supplementary material, integrated atomic and molecular line intensities for grids of C- and J-type models; these computations are summarized in the Appendices. Appendix tables, a copy of the current version of the code, and the two model grids are available only at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/578/A63

  13. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    SciTech Connect

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have also been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are in general agreement with the experimental data, within acceptable limits of uncertainty.

  14. Renovating To Meet ADA Standards.

    ERIC Educational Resources Information Center

    Huber, Judy; Jones, Garry

    2003-01-01

    Using the examples of Owen D. Young School in Van Hornesville, New York, and the Tonawanda City school district in Buffalo, New York, describes how school planners should take the accessibility standards mandated by the Americans with Disabilities Act (ADA) into account when renovating. (EV)

  15. Sources of financial pressure and up coding behavior in French public hospitals.

    PubMed

    Georgescu, Irène; Hartmann, Frank G H

    2013-05-01

    Drawing upon role theory and the literature concerning unintended consequences of financial pressure, this study investigates the effects of health care decision pressure from the hospital's administration and from the professional peer group on physicians' inclination to engage in up coding. We explore two kinds of up coding, information-related and action-related, and develop hypotheses that connect these kinds of data manipulation to the sources of pressure via the intermediate effect of role conflict. Qualitative data from initial interviews with physicians and subsequent questionnaire evidence from 578 physicians in 14 French hospitals suggest that the source of pressure is a relevant predictor of physicians' inclination to engage in data manipulation. We further find that this effect is partly explained by the extent to which these pressures create role conflict. Given the concern about up coding in treatment-based reimbursement systems worldwide, our analysis adds to understanding how the design of a hospital's management control system may encourage this undesired type of behavior.

  16. Dosimetric characterization of an 192Ir brachytherapy source with the Monte Carlo code PENELOPE.

    PubMed

    Casado, Francisco Javier; García-Pareja, Salvador; Cenizo, Elena; Mateo, Beatriz; Bodineau, Coral; Galán, Pedro

    2010-01-01

    Monte Carlo calculation is a widespread and well-established practice for determining the dosimetric parameters of brachytherapy sources. In this study, the recommendations of the AAPM TG-43U1 report have been followed to characterize the Varisource VS2000 (192)Ir high-dose-rate source, provided by Varian Oncology Systems. To obtain the dosimetric parameters for this source, Monte Carlo calculations with the PENELOPE code have been carried out. The TG-43 formalism parameters are presented, i.e., air kerma strength, dose rate constant, radial dose function and anisotropy function. In addition, a 2D Cartesian-coordinate table of dose rate in water has been calculated. These quantities are compared to the reference data for this source, and the results are in good agreement with them. The data in the present study complement published data in the following respects: (i) TG-43U1 recommendations are followed with regard to phantom ambient conditions and to uncertainty analysis, including statistical (type A) and systematic (type B) contributions; (ii) the PENELOPE code is benchmarked for this source; (iii) the Monte Carlo calculation methodology differs from that usually published in the way absorbed dose is estimated, leaving out the track-length estimator; (iv) the results of the present work comply with the most recent AAPM and ESTRO physics committee recommendations on Monte Carlo techniques, with regard to dose-rate uncertainty values and the differences established between our results and reference data. The results stated in this paper provide a complete parameter collection, which can be used for dosimetric calculations as well as a means of comparison with other datasets for this source.
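    The TG-43U1 formalism referenced above has a standard closed form, D(r,θ) = S_K · Λ · [G_L(r,θ)/G_L(r₀,θ₀)] · g(r) · F(r,θ). A minimal sketch follows, assuming the line-source geometry function and treating the tabulated radial dose function g and anisotropy function F as callables; the active length L used here is an illustrative placeholder, not the VS2000 datum.

    ```python
    import math

    def tg43_dose_rate(sk, dose_rate_const, r, theta, g, F, L=0.5):
        """2D TG-43U1 dose rate at (r [cm], theta [rad]):
        D = S_K * Lambda * G_L(r, theta) / G_L(1 cm, pi/2) * g(r) * F(r, theta).

        g and F stand in for the tabulated radial-dose and anisotropy
        functions; L (active source length, cm) is an illustrative value.
        """
        def G_L(r, theta):
            # Line-source geometry function beta / (L * r * sin(theta)),
            # where beta is the angle the active line subtends at the point.
            if math.isclose(math.sin(theta), 0.0, abs_tol=1e-12):
                return 1.0 / (r * r - L * L / 4.0)  # on-axis limit
            beta = (math.atan2(r * math.cos(theta) + L / 2, r * math.sin(theta))
                    - math.atan2(r * math.cos(theta) - L / 2, r * math.sin(theta)))
            return beta / (L * r * math.sin(theta))

        return (sk * dose_rate_const * G_L(r, theta) / G_L(1.0, math.pi / 2)
                * g(r) * F(r, theta))
    ```

    With g and F set to unity, the dose rate at the reference point (1 cm, 90°) reduces to S_K·Λ, and at larger r the geometry factor approaches the familiar inverse-square falloff.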

  17. Interferential multi-spectral image compression based on distributed source coding

    NASA Astrophysics Data System (ADS)

    Wu, Xian-yun; Li, Yun-song; Wu, Cheng-ke; Kong, Fan-qiang

    2008-08-01

    Based on analyses of interferential multispectral imagery (IMI), a new compression algorithm based on distributed source coding is proposed. There are apparent push motions between the IMI sequences; the relative shift between two images is detected by a block-match algorithm at the encoder. Our algorithm estimates the rate of each bitplane with the estimated side-information frame. It then adopts an ROI coding algorithm in which a rate-distortion lifting procedure is carried out in the rate-allocation stage. Using our algorithm, the FBC can be removed from the traditional scheme. The compression algorithm developed in the paper can obtain up to 3 dB of gain compared with JPEG2000, and significantly reduces complexity and storage consumption compared with 3D-SPIHT, at the cost of a slight degradation in PSNR.
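    The encoder-side block match mentioned above can be illustrated with a toy 1-D version (real IMI frames are 2-D, and the paper's algorithm is not reproduced here): pick the shift that minimizes the mean absolute difference over the overlapping samples.

    ```python
    def detect_shift(ref, cur, max_shift=8):
        """Estimate the 1-D 'push' shift between two frames by block matching.

        Returns the shift s minimising the mean absolute difference between
        ref[i] and cur[i + s] over the overlap; degenerate overlaps shorter
        than half the frame are ignored.
        """
        best_shift, best_sad = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            pairs = [(ref[i], cur[i + s])
                     for i in range(len(ref)) if 0 <= i + s < len(cur)]
            if len(pairs) < len(ref) // 2:  # too little overlap to trust
                continue
            sad = sum(abs(a - b) for a, b in pairs) / len(pairs)
            if sad < best_sad:
                best_shift, best_sad = s, sad
        return best_shift
    ```

    In a distributed-source-coding setting, the detected shift lets the decoder-side information frame be aligned before the bitplane rates are estimated.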

  18. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    SciTech Connect

    Icenhour, A.S.; Tharp, M.L.

    1996-08-01

    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed.
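    As a heavily simplified illustration of what a "source term" is (not the SOURCE1/SOURCE2 models, which couple engineered-barrier degradation and leaching in detail), a first-order leach model with radioactive decay gives a release rate:

    ```python
    import math

    def release_rate(q0, leach_rate, decay_const, t):
        """Radionuclide release rate (activity per unit time) at time t for a
        toy source term: initial inventory q0 depleted by first-order leaching
        (leach_rate) and radioactive decay (decay_const = ln 2 / half-life).

        Illustrative only; the SOURCE codes use far more detailed models.
        """
        remaining = q0 * math.exp(-(leach_rate + decay_const) * t)
        return leach_rate * remaining
    ```

    The release rate starts at `leach_rate * q0` and decays exponentially, which is the qualitative shape a performance assessment feeds into its transport calculations.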

  19. Structuring the formal definition of Ada

    NASA Technical Reports Server (NTRS)

    Hansen, Kurt W.

    1986-01-01

    The structure of the formal definition of Ada is described. At present, a difficult subset of Ada has been defined, and the experience gained so far by this work is reported. Work continues toward a formal definition of the full Ada language.

  20. Ada--Programming Language of the Future.

    ERIC Educational Resources Information Center

    Rudd, David

    1983-01-01

    Ada is a programming language developed for the Department of Defense, with a registered trademark. It was named for Ada Augusta, coworker of Charles Babbage and the world's first programmer. The Department of Defense hopes to prevent variations and to establish Ada as a consistent, standardized language. (MNS)

  1. Documentation generator for VHDL and MatLab source codes for photonic and electronic systems

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    The UML, a complex system modeling and description technology, has recently been expanding its uses in the formalization and algorithmic description of systems such as multiprocessor photonic, optoelectronic and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes a new concept of software dedicated to documenting source code written in VHDL and MatLab. The work starts with an analysis of the available documentation generators for both programming languages, with an emphasis on open-source solutions. The authors present their own solutions, which are based on the Doxygen program, available under a free license with its source code. Supporting tools for parser building, such as Bison and Flex, were used. The documentation generator is applied in the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts, which describe the following components of the documentation generator for photonic and electronic systems: the concept, the MatLab application, and the VHDL application. This is part one, which describes the system concept. Part two describes the MatLab application; MatLab is used for description of the measured phenomena. Part three describes the VHDL application; VHDL is used for behavioral description of the optoelectronic system. The proposed approach and applications document large, complex software configurations for large systems.
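    As a toy illustration of the comment-extraction step such a generator needs (the authors' tool builds real parsers with Flex/Bison and feeds Doxygen; the `--!` convention and this extractor are assumptions, not their implementation):

    ```python
    def extract_doc_comments(vhdl_source):
        """Collect '--!' documentation comment blocks from VHDL text and pair
        each block with the first declaration line that follows it.

        A toy sketch of the parsing step a documentation generator performs.
        """
        docs, pending = {}, []
        for line in vhdl_source.splitlines():
            stripped = line.strip()
            if stripped.startswith("--!"):
                pending.append(stripped[3:].strip())  # doc comment: accumulate
            elif pending and stripped and not stripped.startswith("--"):
                docs[stripped] = " ".join(pending)    # attach to declaration
                pending = []
        return docs
    ```

    Running it over a small fragment pairs the two-line comment block with the `entity` declaration that follows, which is the association a documentation generator renders into its output pages.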

  2. LENSED: a code for the forward reconstruction of lenses and sources from strong lensing observations

    NASA Astrophysics Data System (ADS)

    Tessore, Nicolas; Bellagamba, Fabio; Metcalf, R. Benton

    2016-12-01

    Robust modelling of strong lensing systems is fundamental to exploit the information they contain about the distribution of matter in galaxies and clusters. In this work, we present LENSED, a new code which performs forward parametric modelling of strong lenses. LENSED takes advantage of a massively parallel ray-tracing kernel to perform the necessary calculations on a modern graphics processing unit (GPU). This makes the precise rendering of the background lensed sources much faster, and allows the simultaneous optimization of tens of parameters for the selected model. With a single run, the code is able to obtain the full posterior probability distribution for the lens light, the mass distribution and the background source at the same time. LENSED is first tested on mock images which reproduce realistic space-based observations of lensing systems. In this way, we show that it is able to recover unbiased estimates of the lens parameters, even when the sources do not follow exactly the assumed model. Then, we apply it to a subsample of the Sloan Lens ACS Survey lenses, in order to demonstrate its use on real data. The results generally agree with the literature, and highlight the flexibility and robustness of the algorithm.

  3. Ada 9X Project Report: Ada 9X Requirements Document

    DTIC Science & Technology

    1990-08-27

    Reviewers include E. Ploedereder - DR Chair (Tartan), G. Booch (Rational), B. Brosgol (Alsys), N. Cohen (IBM), R. Dewar (NYU), G. Dismukes...developing a report (the Ada Commentary Integration Document (ACID)) that reflects the impact of the approved commentaries by suggesting revised wording for appropriate sections of the standard. ACID would be a good starting point for satisfying this requirement. Of course, some of the approved

  4. Neutronic conceptual design of the ETRR-2 cold-neutron source using the MCNP code

    NASA Astrophysics Data System (ADS)

    Khalil, M. Y.; Shaat, M. K.; Abdelfattah, A. Y.

    2005-04-01

    A conceptual neutronic design of the cold-neutron source (CNS) for the Egyptian second research reactor (ETRR-2) was carried out using the MCNP code. A parametric analysis was performed to choose the type and geometry of the moderator and the CNS dimensions required to maximize cold-neutron production. The moderator cell has a spherical-annulus structure containing liquid hydrogen. The cold-neutron gain and cold-neutron brightness are calculated, together with the nuclear heat load of the CNS. The estimated performance of the CNS has been analyzed with regard to the effect of the void fraction in the moderator cell together with the ortho:para ratio.

  5. Source Listings for Computer Code SPIRALI Incompressible, Turbulent Spiral Grooved Cylindrical and Face Seals

    NASA Technical Reports Server (NTRS)

    Walowit, Jed A.; Shapiro, Wilbur

    2005-01-01

    This is the source listing of the computer code SPIRALI which predicts the performance characteristics of incompressible cylindrical and face seals with or without the inclusion of spiral grooves. Performance characteristics include load capacity (for face seals), leakage flow, power requirements and dynamic characteristics in the form of stiffness, damping and apparent mass coefficients in 4 degrees of freedom for cylindrical seals and 3 degrees of freedom for face seals. These performance characteristics are computed as functions of seal and groove geometry, load or film thickness, running and disturbance speeds, fluid viscosity, and boundary pressures.

  6. Health care under the ADA: a vision or a mirage?

    PubMed

    Mudrick, Nancy R; Schwartz, Michael A

    2010-10-01

    Problems in health care access are identified using recent studies documenting the health disparities experienced by people with disabilities. Some of these health care access barriers qualify as discrimination prohibited under the Americans with Disabilities Act. Focusing on the past decade of ADA enforcement, issues reported in the U.S. Department of Justice listing of resolved ADA complaints and settlements are compared to the profile of access problems. Key court case outcomes of the past decade also are presented. These sources indicate that the majority of resolved complaints and settlements involved failure to provide effective communication (often sign language interpretation). A smaller percentage of complaints and settlements addressed issues of refusal to provide treatment, physical access, equipment access, and provider procedures. Most of the key settlements involved hospitals and larger provider organizations, while many complaints also focused on individual physicians. Although the record indicates that the ADA can be, and has been, effectively used to increase access in many instances, other types of access problems have been lightly addressed through application of the ADA. This likely stems from enforcement choices made by the Department of Justice and the dynamics of the patient-doctor relationship. The broad challenge for the coming decade is to develop means to achieve effective communication and eliminate physical and programmatic barriers in more health care provider settings more consistently. The ADA can be a vigorous force in this effort as part of a multipronged strategy.

  7. Comparison of radiation spectra from selected source-term computer codes

    SciTech Connect

    Brady, M.C.; Hermann, O.W.; Wilson, W.B.

    1989-04-01

    This report compares the radiation spectra and intensities predicted by three radionuclide inventory/depletion codes, ORIGEN2, ORIGEN-S, and CINDER-2. The comparisons were made for a series of light-water reactor models (including three pressurized-water reactors (PWR) and two boiling-water reactors (BWR)) at cooling times ranging from 30 d to 100 years. The work presented here complements the results described in an earlier report that discusses in detail the three depletion codes, the various reactor models, and the comparison by nuclide of the inventories, activities, and decay heat predictions by nuclide for the three codes. In this report, the photon production rates from fission product nuclides and actinides were compared as well as the total photon production rates and energy spectra. Very good agreement was observed in the photon source terms predicted by ORIGEN2 and ORIGEN-S. The absence of bremsstrahlung radiation in the CINDER-2 calculations resulted in large differences in both the production rates and spectra in comparison with the ORIGEN2 and ORIGEN-S results. A comparison of the CINDER-2 photon production rates with an ORIGEN-S calculation neglecting bremsstrahlung radiation showed good agreement. An additional discrepancy was observed in the photon spectra predicted from the CINDER-2 calculations and has been attributed to the absence of spectral data for 144Pr in those calculations. 12 refs., 26 figs., 36 tabs.

  8. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    SciTech Connect

    Yu, Charley; Gnanapragasam, Emmanuel; Cheng, Jing-Jy; Kamboj, Sunita; Chen, Shih-Yew

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
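    The three release options described above map onto simple closed-form expressions. The sketch below illustrates them; the function names, arguments, and units are illustrative assumptions, not the RESRAD-OFFSITE implementation.

    ```python
    import math

    def first_order_release(inventory_bq, leach_rate_per_yr, t_yr):
        """'First order release with transport' option: the release rate is
        proportional to the remaining inventory, with the user-specified
        leach rate as the proportionality constant; cumulative release follows."""
        return inventory_bq * (1.0 - math.exp(-leach_rate_per_yr * t_yr))

    def equilibrium_desorption(total_conc, kd, bulk_density, moisture):
        """'Equilibrium desorption release' option: the distribution coefficient
        Kd partitions the radionuclide between solid and aqueous phases,
        C_total = rho_b*Kd*C_w + theta*C_w, solved here for the aqueous C_w."""
        return total_conc / (bulk_density * kd + moisture)

    def uniform_release(inventory_bq, duration_yr, t_yr):
        """'Uniform release' option: a constant fraction of the initially
        contaminated material is released during each time interval over
        the user-specified release duration."""
        return inventory_bq * min(t_yr / duration_yr, 1.0)
    ```

    Note how the three options differ in what the user supplies: a leach rate, a distribution coefficient, or a release duration.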

  9. Ada and the rapid development lifecycle

    NASA Technical Reports Server (NTRS)

    Deforrest, Lloyd; Gref, Lynn

    1991-01-01

    JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation-class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System using an incremental delivery methodology called the Rapid Development Methodology, with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Approach, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economy better than, projects developed under more traditional software-intensive system implementation methodologies.

  10. Transmitter data collection using Ada

    NASA Technical Reports Server (NTRS)

    Conroy, B. L.

    1988-01-01

    A data collection system installed on the 400 kilowatt X-band transmitter of the Goldstone Solar System Radar is described. The data collection system is built around the off-the-shelf IEEE 488 instrumentation, linked with fiber optics, controlled by an inexpensive computer, and uses software written in the Ada language. The speed and accuracy of the system is discussed, along with programming techniques used for both data collection and reduction.

  11. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  12. Effectiveness Evaluation of Skin Covers against Intravascular Brachytherapy Sources Using VARSKIN3 Code

    PubMed Central

    Baghani, H R; Nazempour, A R; Aghamiri, S M R; Hosseini Daghigh, S M; Mowlavi, A A

    2013-01-01

    Background and Objective: The most common intravascular brachytherapy sources include 32P, 188Re, 106Rh and 90Sr/90Y. In this research, the skin absorbed dose for different covering materials in dealing with these sources was evaluated, and the best covering material for skin protection and reduction of the dose absorbed by radiation staff was identified and recommended. Method: Four materials, including polyethylene, cotton and two different kinds of plastic, were proposed as skin covers, and the skin absorbed dose at different depths for each of the materials was calculated separately using the VARSKIN3 code. Results: The results suggested that for all sources, the skin absorbed dose was minimized when using polyethylene. With this material as skin cover, the maximum and minimum doses at the skin surface were associated with 90Sr/90Y and 106Rh, respectively. Conclusion: Polyethylene was found to be the most effective cover in reducing skin dose and protecting the skin. Furthermore, the good agreement between the results of VARSKIN3 and other experimental measurements indicates that VARSKIN3 is a powerful tool for skin dose calculations when working with beta-emitter sources. Therefore, it can be utilized in addressing radiation protection concerns. PMID:25505758

  13. QUEST/Ada (Query Utility Environment for Software Testing) of Ada: The development of a program analysis environment for Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1988-01-01

    A history of the Query Utility Environment for Software Testing (QUEST)/Ada is presented. A fairly comprehensive literature review which is targeted toward issues of Ada testing is given. The definition of the system structure and the high level interfaces are then presented. The design of the three major components is described. The QUEST/Ada IORL System Specifications to this point in time are included in the Appendix. A paper is also included in the appendix which gives statistical evidence of the validity of the test case generation approach which is being integrated into QUEST/Ada.

  14. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    NASA Astrophysics Data System (ADS)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

    The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the source code to RIES showed a significant lack of security-awareness among the programmers which - among other things - appears to have left RIES vulnerable to near-trivial attacks. Had it not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and that these aspects need to be examined much sooner than right before an election.

  15. COMPASS: An Ada based scheduler

    NASA Technical Reports Server (NTRS)

    Mcmahon, Mary Beth; Culbert, Chris

    1992-01-01

    COMPASS is a generic scheduling system developed by McDonnell Douglas and funded by the Software Technology Branch of NASA Johnson Space Center. The motivation behind COMPASS is to illustrate scheduling technology and provide a basis from which custom scheduling systems can be built. COMPASS was written in Ada to promote readability and to conform to DOD standards. COMPASS has some unique characteristics that distinguish it from commercial products. This paper discusses these characteristics and uses them to illustrate some differences between scheduling tools.

  16. MARE2DEM: a 2-D inversion code for controlled-source electromagnetic and magnetotelluric data

    NASA Astrophysics Data System (ADS)

    Key, Kerry

    2016-10-01

    This work presents MARE2DEM, a freely available code for 2-D anisotropic inversion of magnetotelluric (MT) data and frequency-domain controlled-source electromagnetic (CSEM) data from onshore and offshore surveys. MARE2DEM parametrizes the inverse model using a grid of arbitrarily shaped polygons, where unstructured triangular or quadrilateral grids are typically used due to their ease of construction. Unstructured grids provide significantly more geometric flexibility and parameter efficiency than the structured rectangular grids commonly used by most other inversion codes. Transmitter and receiver components located on topographic slopes can be tilted parallel to the boundary so that the simulated electromagnetic fields accurately reproduce the real survey geometry. The forward solution is implemented with a goal-oriented adaptive finite-element method that automatically generates and refines unstructured triangular element grids that conform to the inversion parameter grid, ensuring accurate responses as the model conductivity changes. This dual-grid approach is significantly more efficient than the conventional use of a single grid for both the forward and inverse meshes since the more detailed finite-element meshes required for accurate responses do not increase the memory requirements of the inverse problem. Forward solutions are computed in parallel with a highly efficient scaling by partitioning the data into smaller independent modeling tasks consisting of subsets of the input frequencies, transmitters and receivers. Non-linear inversion is carried out with a new Occam inversion approach that requires fewer forward calls. Dense matrix operations are optimized for memory and parallel scalability using the ScaLAPACK parallel library. Free parameters can be bounded using a new non-linear transformation that leaves the transformed parameters nearly the same as the original parameters within the bounds, thereby reducing non-linear smoothing effects. Data
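    One transform with the property described above (a smooth map onto (lo, hi) that is nearly the identity well inside the bounds, so regularization acts on almost-untransformed values) can be sketched as follows. The function name and the sharpness parameter `a` are illustrative assumptions, not necessarily the exact form used in MARE2DEM.

    ```python
    import math

    def bounded(m, lo, hi, a=20.0):
        """Map an unconstrained parameter m onto the interval (lo, hi).
        For m well inside the bounds, bounded(m) ~= m (nearly the identity);
        near the bounds the map saturates smoothly at lo or hi."""
        return lo + (math.log1p(math.exp(a * (m - lo)))
                     - math.log1p(math.exp(a * (m - hi)))) / a
    ```

    With lo=0, hi=1, bounded(0.5) is 0.5 to within a fraction of a percent, while bounded(-50) and bounded(5) pin to the bounds.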

  17. Ada in Introductory Computer Science Courses

    DTIC Science & Technology

    1993-01-01

    Sacred Heart University’s current computer science curriculum has been modified in the 1992-1993 school year after receiving an ARPA grant (Advanced...grant entitled Ada in Introductory Computer Science Course, allowed for the modification of both introductory programming courses to use Ada as the...introductory computer science courses CS050 (Introduction to Computer Science) and CS051 (Data Structures) were developed to include Ada and software

  18. An evaluation of Ada for AI applications

    NASA Technical Reports Server (NTRS)

    Wallace, David R.

    1986-01-01

    Expert system technology seems to be the most promising type of Artificial Intelligence (AI) application for Ada. An expert system implemented with an expert system shell provides a highly structured approach that fits well with the structured approach found in Ada systems. The current commercial expert system shells use Lisp. In this highly structured situation a shell could be built that used Ada just as well. On the other hand, if it is necessary to deal with some AI problems that are not suited to expert systems, the use of Ada becomes more problematic. Ada was not designed as an AI development language, and is not suited to that role. It is possible that an application developed in, say, Common Lisp could be translated to Ada for actual use in a particular application, but this could be difficult. Some standard Ada packages could be developed to make such a translation easier. If the most general AI programs need to be dealt with, a Common Lisp system integrated with the Ada environment is probably necessary. Aside from problems with language features, Ada, by itself, is not well suited to the prototyping and incremental development that is well supported by Lisp.

  19. ART/Ada design project, phase 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An Ada-Based Expert System Building Tool Design Research Project was conducted. The goal was to investigate various issues in the context of the design of an Ada-based expert system building tool. An attempt was made to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future projects. The current status of the project is described by introducing an operational prototype, ART/Ada. How the project was conducted is explained. The performance of the prototype is analyzed and compared with other related works. Future research directions are suggested.

  20. Neutrons Flux Distributions of the Pu-Be Source and its Simulation by the MCNP-4B Code

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Mehdizadeh, S.; Hadad, K.

    The neutron fluence rate of a low-intensity Pu-Be source is measured by Neutron Activation Analysis (NAA) of 197Au foils. In addition, the neutron fluence rate distribution versus energy is calculated using the MCNP-4B code based on the ENDF/B-V library. This combined simulation and measurement is a first such exercise for the Iranian group, carried out to establish confidence in the code for further research. In the theoretical investigation, an isotropic Pu-Be source with a cylindrical volume distribution is simulated and the relative neutron fluence rate versus energy is calculated using the MCNP-4B code. The variations of the fast and thermal neutron fluence rates, measured by the NAA method and calculated with the MCNP code, are compared.

  1. Ada training evaluation and recommendations from the Gamma Ray Observatory Ada Development Team

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Ada training experiences of the Gamma Ray Observatory Ada development team are related, and recommendations are made concerning future Ada training for software developers. Training methods are evaluated, deficiencies in the training program are noted, and a recommended approach, including course outline, time allocation, and reference materials, is offered.

  2. Ada training evaluation and recommendations from the Gamma Ray Observatory Ada Development Team

    SciTech Connect

    Not Available

    1985-10-01

    The Ada training experiences of the Gamma Ray Observatory Ada development team are related, and recommendations are made concerning future Ada training for software developers. Training methods are evaluated, deficiencies in the training program are noted, and a recommended approach, including course outline, time allocation, and reference materials, is offered.

  3. What Does It Take to Develop a Million Lines of Open Source Code?

    NASA Astrophysics Data System (ADS)

    Fernandez-Ramil, Juan; Izquierdo-Cortazar, Daniel; Mens, Tom

    This article presents a preliminary and exploratory study of the relationship between size, on the one hand, and effort, duration and team size, on the other, for 11 Free/Libre/Open Source Software (FLOSS) projects with current size ranging between 0.6 and 5.3 million lines of code (MLOC). Effort was operationalised based on the number of active committers per month. The extracted data did not fit well an early version of the COCOMO cost estimation model for proprietary software, overall suggesting that, at least to some extent, FLOSS communities are more productive than closed-source teams. This also motivated the need for FLOSS-specific effort models. As a first approximation, we evaluated 16 linear regression models involving different pairs of attributes. One of our experiments was to calculate the net size, that is, to remove any suspiciously large outliers or jumps in the growth trends. The best model we found involved effort against net size, accounting for 79 percent of the variance. This model was based on data excluding a possible outlier (Eclipse), the largest project in our sample. This suggests that different effort models may be needed for certain categories of FLOSS projects. Incidentally, for each of the 11 individual FLOSS projects we were able to model the net size trends with very high accuracy (R² ≥ 0.98). Of the 11 projects, 3 have grown superlinearly, 5 linearly and 3 sublinearly, suggesting that in the majority of cases accumulated complexity is either well controlled or does not constitute a growth-constraining factor.
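    The kind of single-predictor model evaluated here (effort regressed on net size, judged by explained variance) amounts to ordinary least squares. The sketch below shows the mechanics; the size/effort numbers are made-up illustrations, not the study's measurements.

    ```python
    def fit_linear(xs, ys):
        """Ordinary least-squares fit y = a + b*x; returns (a, b, r_squared),
        where r_squared is the fraction of variance in y explained by x."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx
        a = my - b * mx
        ss_res = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))
        ss_tot = sum((y - my) ** 2 for y in ys)
        return a, b, 1.0 - ss_res / ss_tot

    # Hypothetical data: net size in MLOC, effort in active committer-months.
    size = [0.6, 1.1, 2.0, 3.2, 5.3]
    effort = [210.0, 400.0, 690.0, 1150.0, 1870.0]
    a, b, r2 = fit_linear(size, effort)
    ```

    Dropping a suspected outlier before fitting, as done with Eclipse in the study, simply means excluding that (size, effort) pair and refitting.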

  4. Proceedings of the 2nd NASA Ada User's Symposium

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Several presentations, mostly in viewgraph form, on various topics relating to Ada applications are given. Topics covered include the use of Ada in NASA, Ada and the Space Station, the software support environment, Ada in the Software Engineering Laboratory, Ada at the Jet Propulsion Laboratory, the Flight Telerobotic Servicer, and lessons learned in prototyping the Space Station Remote Manipulator System control.

  5. Robust image transmission using a new joint source channel coding algorithm and dual adaptive OFDM

    NASA Astrophysics Data System (ADS)

    Farshchian, Masoud; Cho, Sungdae; Pearlman, William A.

    2004-01-01

    In this paper we consider the problem of robust image coding and packetization for the purpose of communications over slow-fading frequency-selective channels and channels with a shaped spectrum like those of digital subscriber lines (DSL). Towards this end, a novel and analytically based joint source channel coding (JSCC) algorithm to assign unequal error protection is presented. Under a block budget constraint, the image bitstream is de-multiplexed into two classes with different error responses. The algorithm assigns unequal error protection (UEP) in a way that minimizes the expected mean square error (MSE) at the receiver while minimizing the probability of catastrophic failure. In order to minimize the expected mean square error at the receiver, the algorithm assigns unequal protection to the value bit class (VBC) stream. In order to minimize the probability of catastrophic error, which is a characteristic of progressive image coders, the algorithm assigns more protection to the location bit class (LBC) stream than to the VBC stream. Besides having the advantage of being analytical and also numerically solvable, the algorithm is based on a new formula developed to estimate the distortion-rate (D-R) curve for the VBC portion of SPIHT. The major advantage of our technique is that the worst-case instantaneous minimum peak signal-to-noise ratio (PSNR) does not differ greatly from the average MSE, while this is not the case for the optimal single-stream UEP system. Although both the average PSNR of our method and that of the optimal single-stream UEP are about the same, our scheme does not suffer erratic behavior because we have made the probability of catastrophic error arbitrarily small. The coded image is sent via orthogonal frequency division multiplexing (OFDM), a known and increasingly popular modulation scheme to combat ISI (Inter Symbol Interference) and impulsive noise. Using dual adaptive energy OFDM, we use the minimum energy necessary to send each bit stream at a

  6. Slow Temporal Integration Enables Robust Neural Coding and Perception of a Cue to Sound Source Location

    PubMed Central

    Tollin, Daniel J.

    2016-01-01

    In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. SIGNIFICANCE STATEMENT In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise
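    The core idea, excitatory-inhibitory interaction integrated over a temporal window whose duration is the model's single free parameter, can be caricatured in a few lines. This is a toy coincidence sketch under assumed event-time inputs, not the authors' model.

    ```python
    def surviving_excitation(exc_ms, inh_ms, window_ms):
        """Toy coincidence model: an excitatory event drives output only if no
        inhibitory event arrived within the preceding `window_ms`. A longer
        excitatory-inhibitory interaction window lets inhibition cancel
        excitation even when interaural timing is jittered (decorrelated)."""
        return sum(1 for t in exc_ms
                   if not any(0.0 <= t - s <= window_ms for s in inh_ms))

    # Contralateral inhibition arrives 1.5 ms before each excitatory event:
    exc = [10.0, 20.0, 30.0]
    inh = [8.5, 18.5, 28.5]
    short = surviving_excitation(exc, inh, 1.0)  # 1 ms window: jitter defeats inhibition
    long_ = surviving_excitation(exc, inh, 3.0)  # ~3 ms window: inhibition still cancels
    ```

    The contrast between `short` and `long_` mirrors the paper's point: a ≳3 ms integration window makes ILD coding robust to temporal decorrelation that defeats millisecond-precise mechanisms.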

  7. CMS-2 to Ada Translator Evaluation.

    DTIC Science & Technology

    1997-09-01

    these translators, and to provide information to CMS-2 project managers to assist them in the evaluation of costs and risks of translating CMS-2 to Ada....The objective of this evaluation was to determine the maturity of the CMS-2 to Ada translators and associated tools, to determine the capabilities of

  8. General-Purpose Ada Software Packages

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.

    1991-01-01

    Collection of subprograms brings to Ada many features from other programming languages. All generic packages designed to be easily instantiated for types declared in user's facility. Most packages have widespread applicability, although some are oriented toward avionics applications. All designed to facilitate writing new software in Ada. Written on IBM/AT personal computer running under PC DOS, v.3.1.

  9. Ada (Trade Name) Bibliography. Volume 3.

    DTIC Science & Technology

    1986-02-01

    PROTOTYPING USING THE SETL PROGRAMMING LANGUAGE SCHULTZ, JAMES B., NONAFFILIATED 4642 -03 WEAPONS THAT THINK SCHULTZ, LENNART, CRONE&KOCH ORDRUPVEJ101, DK...AS A TARGET FOR ADA SUBRAHMANYAM, T CARTER, U. OF UTAH, SALT LAKE CITY, UTAH 84112 3455 -01 TRANSFORMATION OF ADA PROGRAMS INTO SILICON SVENSSON, GERT

  10. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PACS number(s): 87.56.bg.
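    The quantity being compared, the TG-43 radial dose function gL(r), is defined relative to the line-source geometry function, which removes purely geometric fall-off from the dose profile. A minimal sketch follows; the source length and the exponential dose profile in the usage note are illustrative assumptions.

    ```python
    import math

    def g_line(r, L):
        """Line-source geometry function on the transverse axis (theta = 90 deg):
        G_L(r) = 2*atan(L/(2r)) / (L*r), for source length L and distance r (cm)."""
        return 2.0 * math.atan(L / (2.0 * r)) / (L * r)

    def radial_dose_function(dose_at, r, r0=1.0, L=0.3):
        """TG-43 radial dose function gL(r) = [D(r)/D(r0)] * [G_L(r0)/G_L(r)],
        so gL isolates attenuation and scatter from geometric fall-off."""
        return (dose_at(r) / dose_at(r0)) * (g_line(r0, L) / g_line(r, L))
    ```

    As a sanity check, a dose profile that falls off exactly as the geometry function yields gL(r) = 1 at every distance, so any code-to-code differences in gL(r) reflect the transport physics and cross-section data rather than geometry.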

  11. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.

  12. Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic

    NASA Technical Reports Server (NTRS)

    Hjermstad, Chris

    1986-01-01

    Many essential software functions in the mission-critical computer resource application domain depend on floating point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating point facilities of Ada. Paranoia.Ada appears to be a valuable program to ensure that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating point computation.

  13. Understanding the Adoption of Ada: Results of an Industry Survey

    DTIC Science & Technology

    1990-05-01

    (R&D) expenditures of the participating firms were about $38 million with a standard error of $26 million. R&D expenditures represent an average of...5.3% of business unit revenues. An average of 28% of the R&D budget is spent on software development and 29% of the software research budget is...directed toward developing Ada capabilities. Business unit R&D funds used for software development come from the following sources: 'A standard error is a

  14. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).
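
    As a minimal illustration of the sparse-storage idea described above, the following sketch keeps only non-negligible matrix elements in dictionary-of-keys format. The screening threshold and data are assumed, illustrative values, not PyVCI's actual implementation.

    ```python
    # Dictionary-of-keys sparse storage with numerical screening: only
    # matrix elements above a threshold are kept.

    def screen_sparse(dense_rows, threshold=1e-8):
        """Return {(i, j): H_ij} for all elements with |H_ij| > threshold."""
        sparse = {}
        for i, row in enumerate(dense_rows):
            for j, h_ij in enumerate(row):
                if abs(h_ij) > threshold:
                    sparse[(i, j)] = h_ij
        return sparse

    # A tiny symmetric "Hamiltonian" with two negligible couplings
    h = [[1.0, 1e-12, 0.2],
         [1e-12, 2.0, 0.0],
         [0.2, 0.0, 3.0]]
    sp = screen_sparse(h)  # keeps 5 of the 9 entries
    ```

    Storing only the surviving elements is what makes diagonalizing large VCI matrices tractable in memory.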

  15. A Particle-In-Cell Gun Code for Surface-Converter H- Ion Source Modeling

    NASA Astrophysics Data System (ADS)

    Chacon-Golcher, Edwin; Bowers, Kevin J.

    2007-08-01

    We present the current status of a particle-in-cell with Monte Carlo collisions (PIC-MCC) gun code under development at Los Alamos for the study of surface-converter H- ion sources. The program preserves a first-principles approach to a significant extent and simulates the production processes without ad hoc models within the plasma region. Some of its features include: solution of arbitrary electrostatic and magnetostatic fields in an axisymmetric (r,z) geometry to describe the self-consistent time evolution of a plasma; simulation of a multi-species (e-,H+,H2+,H3+,H-) plasma discharge from a neutral hydrogen gas and filament-originated seed electrons; full 2-dimensional (r,z) 3-velocity (vr,vz,vφ) dynamics for all species with exact conservation of the canonical angular momentum pφ; detailed collision physics between charged particles and neutrals and the ability to represent multiple smooth (not stair-stepped) electrodes of arbitrary shape and voltage whose surfaces may be secondary-particle emitters (H- and e-). The status of this development is discussed in terms of its physics content and current implementation details.

  16. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions under which the properties hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  17. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 2; Scattering Plots

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This second volume of Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code provides the scattering plots referenced by Volume 1. There are 648 plots. Half are for the 8750 rpm "high speed" operating condition and the other half are for the 7031 rpm "mid speed" operating condition.

  18. Ada Compiler Validation Summary Report: New York University (NYU Ada/Ed) Compiler, Version 1.4 for VAX 11/780, Using VMS 3.5.

    DTIC Science & Technology

    1984-08-10

    AB.ADA P CT=9 EC=1 B55AOK-AB.ADA CT= EC= B55AO1K-AB.ADA P CT=10 EC=1 B55AO1L-AB.ADA P CT=10 EC=1 B55AO1M-AB.ADA P CT=15 EC=1 B55AO10-AB.ADA P CT=26 EC=1...AB.ADA CT= EC= B71001L-AB.ADA P CT=10 EC=4 B71001M-AB.ADA P CT=10 EC=1 B71001N-AB.ADA P CT=? EC=1 B71001O-AB.ADA P CT=6 EC=1 B71001P-AB.ADA P CT=8 EC=1

  19. VULCAN: An Open-source, Validated Chemical Kinetics Python Code for Exoplanetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Tsai, Shang-Min; Lyons, James R.; Grosheintz, Luc; Rimmer, Paul B.; Kitzmann, Daniel; Heng, Kevin

    2017-02-01

    We present an open-source and validated chemical kinetics code for studying hot exoplanetary atmospheres, which we name VULCAN. It is constructed for gaseous chemistry from 500 to 2500 K, using a reduced C–H–O chemical network with about 300 reactions. It uses eddy diffusion to mimic atmospheric dynamics and excludes photochemistry. We have provided a full description of the rate coefficients and thermodynamic data used. We validate VULCAN by reproducing chemical equilibrium and by comparing its output versus the disequilibrium-chemistry calculations of Moses et al. and Rimmer & Helling. It reproduces the models of HD 189733b and HD 209458b by Moses et al., which employ a network with nearly 1600 reactions. We also use VULCAN to examine the theoretical trends produced when the temperature–pressure profile and carbon-to-oxygen ratio are varied. Assisted by a sensitivity test designed to identify the key reactions responsible for producing a specific molecule, we revisit the quenching approximation and find that it is accurate for methane but breaks down for acetylene, because the disequilibrium abundance of acetylene is not directly determined by transport-induced quenching, but is rather indirectly controlled by the disequilibrium abundance of methane. Therefore we suggest that the quenching approximation should be used with caution and must always be checked against a chemical kinetics calculation. A one-dimensional model atmosphere with 100 layers, computed using VULCAN, typically takes several minutes to complete. VULCAN is part of the Exoclimes Simulation Platform (ESP; exoclime.net) and publicly available at https://github.com/exoclime/VULCAN.

  20. Experiences with Ada in an embedded system

    NASA Technical Reports Server (NTRS)

    Labaugh, Robert J.

    1988-01-01

    Recent experiences with using Ada in a real time environment are described. The application was the control system for an experimental robotic arm. The objectives of the effort were to experiment with developing embedded applications in Ada, evaluating the suitability of the language for the application, and determining the performance of the system. Additional objectives were to develop a control system based on the NASA/NBS Standard Reference Model for Telerobot Control System Architecture (NASREM) in Ada, and to experiment with the control laws and how to incorporate them into the NASREM architecture.

  1. Development of an Ada package library

    NASA Technical Reports Server (NTRS)

    Burton, Bruce; Broido, Michael

    1986-01-01

    A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.

  2. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    SciTech Connect

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate ground-level or elevated point, area, and windblown sources. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
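
    For orientation, here is a sketch of the ground-level centerline concentration of a straight-line Gaussian plume with total ground reflection, the textbook form of the dispersion model on which ANEMOS is based. The numerical inputs are hypothetical, and ANEMOS itself applies many additional corrections (deposition, wake effects, plume rise) not shown here.

    ```python
    import math

    def plume_centerline_conc(Q, u, sigma_y, sigma_z, H):
        """Ground-level centerline concentration of a straight-line Gaussian
        plume with total ground reflection.
        Q: emission rate (g/s); u: wind speed at release height (m/s);
        sigma_y, sigma_z: dispersion coefficients (m) at the downwind
        distance of interest; H: effective release height (m)."""
        return (Q / (math.pi * sigma_y * sigma_z * u)) * math.exp(-H**2 / (2.0 * sigma_z**2))

    # Hypothetical elevated release: 10 g/s, 5 m/s wind, 50 m effective
    # height, sigmas roughly representative of ~1 km downwind
    c = plume_centerline_conc(Q=10.0, u=5.0, sigma_y=80.0, sigma_z=40.0, H=50.0)
    ```

    Doubling the emission rate doubles the concentration, and a ground-level release (H = 0) gives the highest centerline value.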

  3. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models that recognizes waves crossing the interface in both directions has been derived. A preliminary version of the coupled code has been developed and used for an initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  4. Ada Compiler Validation Summary Report: Certificate Number: 880318W1. 09041, International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM 4381 under VM/HPO, Host and Target

    DTIC Science & Technology

    1988-03-28

    International Business Machines Corporation IBM Development System for the Ada Language, Version 2.1.0 IBM 4381 under VM/HPO, host and target DTIC...necessary and identify by block number) International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM...in the compiler listed in this declaration. I declare that International Business Machines Corporation is the owner of record of the object code of the

  5. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    O'Neill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach is examined, including the software engineering practices that guide the systematic design and development of software products and the management of the software process. The revised Ada design language adaptation is presented. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided, along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  6. Real-Time Ada Problem Study

    DTIC Science & Technology

    1989-03-24

    define this set of problems. The authors were chosen because of their proven expertise in real-time development with Ada. They could enrich the results of...for Real-Time Embedded Systems". LabTek Corporation, the author, had proven expertise in embedded system design utilizing Motorola MC680X0-based...processors. The second report is entitled Software Engineering Problems Using Ada in Computers Integral to Weapons Systems. Its author, Sonicraft, had

  7. A W-Grammar Description for ADA.

    DTIC Science & Technology

    1986-12-01

    Language Reference Manual. In my opinion, the W-grammars fall short of this goal since they are less readable than BNF for determining Ada's syntax, and...Ada Constructs Not Covered in W-Grammar...library unit. The problem with the Language Reference Manual description is not that BNF is too antiquated for language definition, but that English

  8. Ada developers' supplement to the recommended approach

    NASA Technical Reports Server (NTRS)

    Kester, Rush; Landis, Linda

    1993-01-01

    This document is a collection of guidelines for programmers and managers who are responsible for the development of flight dynamics applications in Ada. It is intended to be used in conjunction with the Recommended Approach to Software Development (SEL-81-305), which describes the software development life cycle, its products, reviews, methods, tools, and measures. The Ada Developers' Supplement provides additional detail on such topics as reuse, object-oriented analysis, and object-oriented design.

  9. Exo-Transmit: An Open-Source Code for Calculating Transmission Spectra for Exoplanet Atmospheres of Varied Composition

    NASA Astrophysics Data System (ADS)

    Kempton, Eliza M.-R.; Lupu, Roxana; Owusu-Asare, Albert; Slough, Patrick; Cale, Bryson

    2017-04-01

    We present Exo-Transmit, a software package to calculate exoplanet transmission spectra for planets of varied composition. The code is designed to generate spectra of planets with a wide range of atmospheric composition, temperature, surface gravity, and size, and is therefore applicable to exoplanets ranging in mass and size from hot Jupiters down to rocky super-Earths. Spectra can be generated with or without clouds or hazes with options to (1) include an optically thick cloud deck at a user-specified atmospheric pressure or (2) to augment the nominal Rayleigh scattering by a user-specified factor. The Exo-Transmit code is written in C and is extremely easy to use. Typically the user will only need to edit parameters in a single user input file in order to run the code for a planet of their choosing. Exo-Transmit is available publicly on Github with open-source licensing at https://github.com/elizakempton/Exo_Transmit.

  10. Ada Compiler Validation Summary Report: ROLM Ada Compiler, Version 4.52 V-003.

    DTIC Science & Technology

    1983-06-03

    00 00:06 1 B97101C-AB.ADA P 11 00:00 00:04 1 B97101D-AB.ADA P 11 00:00 00:04 1 B97101E-AB.ADA P 13 00:01 00:04 4 B97102A-AB.ADA P 29 00:01 00:05 13...AB.ADA P 42 00:13 01:13 03:00 4 D55A03D-AB.ADA P 34 00:14 01:18 02:57 4 D55A03E-AB.ADA P 53 00:22 01:25 02:36 3 D55A03F-AB.ADA P 56 00:23 01:33 02:56 3...17 3 CC2002A-AB.ADA P 22 00:04 01:08 02:19 3 CC3004A-B.ADA P 28 00:07 01:16 02:20 2 CC3007A-AB.ADA P 53 00:11 01:32 02:22 2 CC3011A-AB.ADA P 60 00

  11. A novel Multi-Agent Ada-Boost algorithm for predicting protein structural class with the information of protein secondary structure.

    PubMed

    Fan, Ming; Zheng, Bin; Li, Lihua

    2015-10-01

    Knowledge of the structural class of a given protein is important for understanding its folding patterns. Although many efforts have been made, predicting a protein's structural class solely from its sequence remains a challenging problem. Feature extraction and classification are the two main problems in prediction. In this research, we extended our earlier work on both aspects. For protein feature extraction, we proposed a scheme that calculates word frequency and word position from sequences of amino acids, reduced amino acids, and secondary structure. For accurate classification of protein structural class, we developed a novel Multi-Agent Ada-Boost (MA-Ada) method by integrating features of a Multi-Agent system into the Ada-Boost algorithm. Extensive experiments were conducted to test and compare the proposed method using four benchmark datasets of low homology. The results showed classification accuracies of 88.5%, 96.0%, 88.4%, and 85.5%, respectively, which are much better than those of existing methods. The source code and dataset are available on request.
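
    For reference, here is a sketch of one round of the classical AdaBoost weight update that MA-Ada builds on; the paper's multi-agent extension and protein-specific features are not shown.

    ```python
    import math

    def adaboost_round(weights, predictions, labels):
        """One round of the classical AdaBoost weight update.
        weights: current sample weights summing to 1; predictions and
        labels are +1/-1. Returns (alpha, new_weights)."""
        err = sum(w for w, p, y in zip(weights, predictions, labels) if p != y)
        alpha = 0.5 * math.log((1.0 - err) / err)   # weight of this weak learner
        new_w = [w * math.exp(-alpha * p * y)       # up-weight the mistakes
                 for w, p, y in zip(weights, predictions, labels)]
        z = sum(new_w)                              # normalization constant
        return alpha, [w / z for w in new_w]

    # Four samples, one misclassified by the current weak learner
    alpha, new_weights = adaboost_round([0.25] * 4, [1, 1, -1, 1], [1, 1, 1, 1])
    ```

    After the update, misclassified samples carry half the total weight, forcing the next weak learner to focus on them.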

  12. Parallel Ada benchmarks for the SVMS

    NASA Technical Reports Server (NTRS)

    Collard, Philippe E.

    1990-01-01

    The use of the parallel processing paradigm to design and develop faster and more reliable computers appears to clearly mark the future of information processing. NASA has started the development of such an architecture: the Spaceborne VHSIC Multi-processor System (SVMS). Ada will be one of the languages used to program the SVMS. One of the unique characteristics of Ada is that it supports parallel processing at the language level through its tasking constructs. It is important for the SVMS project team to assess how efficiently the SVMS architecture will be implemented, as well as how efficiently the Ada environment will be ported to the SVMS. AUTOCLASS II, a Bayesian classifier written in Common Lisp, was selected as one of the benchmarks for SVMS configurations. The purpose of the R&D effort was to provide the SVMS project team with a version of AUTOCLASS II, written in Ada, that would make use of Ada tasking constructs as much as possible so as to constitute a suitable benchmark. Additionally, a set of programs was developed to measure Ada tasking efficiency on parallel architectures and to determine the critical parameters influencing tasking efficiency. All this was designed to provide the SVMS project team with a set of suitable tools for the development of the SVMS architecture.

  13. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false ADA paratransit eligibility: Standards. 37.123... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.123 ADA... complementary paratransit service shall provide the service to the ADA paratransit eligible...

  14. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false ADA paratransit eligibility: Standards. 37.123... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.123 ADA... complementary paratransit service shall provide the service to the ADA paratransit eligible...

  15. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false ADA paratransit eligibility: Process. 37.125... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.125 ADA... § 37.121 of this part shall establish a process for determining ADA paratransit eligibility. (a)...

  16. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false ADA paratransit eligibility: Process. 37.125... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.125 ADA... § 37.121 of this part shall establish a process for determining ADA paratransit eligibility. (a)...

  17. Toward the efficient implementation of expert systems in Ada

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1990-01-01

    Here, the authors describe Ada language issues encountered during the development of ART-Ada, an expert system tool for Ada deployment. ART-Ada is being used to implement several expert system applications for the Space Station Freedom and the U.S. Air Force. Additional information is given on dynamic memory allocation.

  18. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false ADA paratransit eligibility: Standards. 37.123... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.123 ADA... complementary paratransit service shall provide the service to the ADA paratransit eligible...

  19. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false ADA paratransit eligibility: Process. 37.125... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.125 ADA... § 37.121 of this part shall establish a process for determining ADA paratransit eligibility. (a)...

  20. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false ADA paratransit eligibility: Process. 37.125... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.125 ADA... § 37.121 of this part shall establish a process for determining ADA paratransit eligibility. (a)...

  1. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false ADA paratransit eligibility: Standards. 37.123... INDIVIDUALS WITH DISABILITIES (ADA) Paratransit as a Complement to Fixed Route Service § 37.123 ADA... complementary paratransit service shall provide the service to the ADA paratransit eligible...

  2. Translating an AI application from Lisp to Ada: A case study

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    A set of benchmarks was developed to test the performance of a newly designed computer executing both Lisp and Ada. Among these was AutoClass II, a large Artificial Intelligence (AI) application written in Common Lisp. The extraction of a representative subset of this complex application was aided by a Lisp Code Analyzer (LCA). The LCA enabled rapid analysis of the code, putting it in a concise and functionally readable form. An equivalent benchmark was created in Ada through manual translation of the Lisp version. A comparison of the execution results of both programs across a variety of compiler-machine combinations indicates that line-by-line translation, coupled with analysis of the initial code, can produce relatively efficient and reusable target code.

  3. Comparing Ada and FORTRAN Lines of Code: Some Experimental Results

    DTIC Science & Technology

    1993-11-01


  4. User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E)

    DTIC Science & Technology

    2014-06-01

    The constant energy dissipative particle dynamics (DPD-E) method is implemented into the Large-Scale...User Manual and Source Code for a LAMMPS Implementation of Constant Energy Dissipative Particle Dynamics (DPD-E) by James P. Larentzos...Energy Dissipative Particle Dynamics (DPD-E) James P. Larentzos, Engility Corporation; John K. Brennan, Joshua D. Moore, and William D. Mattson

  5. Structured Hierarchical Ada Presentation Using Pictographs (SHARP) definition, Application and Automation

    DTIC Science & Technology

    1986-09-01

    exception if SS = 2000 (FIGURE 37. EXAMPLE OF ANNOTATED PSEUDO CODE) case COCOM_MODE is when ORGANIC => K = 2.4 E = 1.05 when...where KGEN = K EGEN = E exception handler Call Procedure MODULE_ESTIMATE end exception handler Package P10 End Procedure SOFT_DEV...else SS = SS*ADA_CALIBRATE end if raise exception if SS = 2000 (FIGURE 64. STEP 4 - ESTABLISH ANNOTATED PSEUDO CODE FOR PROGRAM UNIT)

  6. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity, usually on the basis of other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in a system after a cyber attack, authorship disputes, and proof of authorship in court. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure that is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the other with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language at no additional cost. The accuracy rates presented are much better than the best previously reported results for the same data sets.
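
    A minimal sketch of the byte-level n-gram idea follows. The n-gram length, profile size, and normalized-intersection similarity are illustrative choices, not necessarily the paper's exact simplified profile or similarity measure, and the code samples are hypothetical.

    ```python
    from collections import Counter

    def byte_ngram_profile(source_bytes, n=3, top=1500):
        """The `top` most frequent byte-level n-grams of a program's source."""
        grams = Counter(source_bytes[i:i + n]
                        for i in range(len(source_bytes) - n + 1))
        return {g for g, _ in grams.most_common(top)}

    def profile_similarity(profile_a, profile_b):
        """Similarity as the normalized size of the profile intersection."""
        return len(profile_a & profile_b) / max(len(profile_a | profile_b), 1)

    # Two stylistically similar code fragments (hypothetical samples)
    sample_a = b"for (int i = 0; i < n; i++) { sum += a[i]; }"
    sample_b = b"for (int j = 0; j < m; j++) { total += b[j]; }"
    sim = profile_similarity(byte_ngram_profile(sample_a),
                             byte_ngram_profile(sample_b))
    ```

    Because the profile is built from raw bytes rather than parsed tokens, the same pipeline works unchanged for C++, Java, or any other language.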

  7. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup: it allows domain-specific definition of tags and a hierarchical document structure. The ability to link, and thus combine, information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnosis-related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors assume that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach.
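
    A minimal sketch of such a hierarchical XML representation of an ICD-10 entry, built with Python's standard library. The element and attribute names are illustrative, not the authors' schema or the CEN/TC 251 standard.

    ```python
    import xml.etree.ElementTree as ET

    # Build a tiny hierarchical ICD-10 fragment: chapter -> block ->
    # category, with a preferred rubric carrying the category title.
    chapter = ET.Element("chapter", code="X")
    block = ET.SubElement(chapter, "block", code="J00-J06")
    category = ET.SubElement(block, "category", code="J06")
    rubric = ET.SubElement(category, "rubric", kind="preferred")
    rubric.text = "Acute upper respiratory infections of multiple and unspecified sites"

    xml_text = ET.tostring(chapter, encoding="unicode")
    ```

    Nesting mirrors the classification hierarchy directly, so generic XML tools can traverse, link, or transform the codes without any ICD-specific parser.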

  8. Interim storage of spent and disused sealed sources: optimisation of external dose distribution in waste grids using the MCNPX code.

    PubMed

    Paiva, I; Oliveira, C; Trindade, R; Portugal, L

    2005-01-01

    Radioactive sealed sources are in use worldwide in different fields of application. When no further use is foreseen for these sources, they become spent or disused sealed sources and are subject to a specific waste management scheme. Portugal has a Radioactive Waste Interim Storage Facility where spent or disused sealed sources are conditioned in a cement matrix inside concrete drums, following the geometrical disposition of a grid. The gamma dose values around each grid depend on each drum's enclosed activity and the radionuclides considered, as well as on the distribution of the drums in the various layers of the grid. This work proposes a method based on Monte Carlo simulation using the MCNPX code to estimate the best drum arrangement through optimisation of the dose distribution in a grid. Measured dose rate values at 1 m distance from the surface of the chosen optimised grid were used to validate the corresponding computational grid model.

  9. Arabidopsis thaliana transcriptional co-activators ADA2b and SGF29a are implicated in salt stress responses.

    PubMed

    Kaldis, Athanasios; Tsementzi, Despoina; Tanriverdi, Oznur; Vlachonasios, Konstantinos E

    2011-04-01

    The transcriptional co-activator ADA2b is a component of GCN5-containing complexes in eukaryotes. In Arabidopsis, ada2b mutants result in pleiotropic developmental defects and altered responses to low-temperature stress. SGF29 has recently been identified as another component of GCN5-containing complexes. In the Arabidopsis genome there are two orthologs of yeast SGF29, designated as SGF29a and SGF29b. We hypothesized that, in Arabidopsis, one or both SGF29 proteins may work in concert with ADA2b to regulate genes in response to abiotic stress, and we set out to explore the role of SGF29a and ADA2b in salt stress responses. In root growth and seed germination assays, sgf29a-1 mutants were more resistant to salt stress than their wild-type counterparts, whereas the ada2b-1 mutant was hypersensitive. The sgf29a;ada2b double mutant displayed phenotypes similar to the ada2b-1 mutant, with reduced salt sensitivity. The expression of several abiotic stress-responsive genes was reduced in ada2b-1 mutants after 3 h of salt stress in comparison with sgf29a-1 and wild-type plants. In the sgf29a-1;ada2b-1 double mutant, salt-induced gene expression was affected similarly to ada2b-1. These results suggest that under salt stress the function of SGF29a was masked by ADA2b and perhaps SGF29a could play an auxiliary role to ADA2b action. In chromatin immunoprecipitation assays, reduced levels of histone H3 and H4 acetylation in the promoter and coding region of COR6.6, RAB18, and RD29b genes were observed in ada2b-1 mutants relative to wild-type plants. In conclusion, ADA2b positively regulates salt-induced gene expression by maintaining the locus-specific acetylation of histones H4 and H3.

  10. Ada Run Time Support Environments and a common APSE Interface Set. [Ada Programming Support Environment

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The paper discusses the importance of linking Ada Run Time Support Environments to the Common Ada Programming Support Environment (APSE) Interface Set (CAIS). A non-stop network operating systems scenario is presented to serve as a forum for identifying the important issues. The network operating system exemplifies the issues involved in the NASA Space Station data management system.

  11. The Impact of Causality on Information-Theoretic Source and Channel Coding Problems

    ERIC Educational Resources Information Center

    Palaiyanur, Harikrishna R.

    2011-01-01

    This thesis studies several problems in information theory where the notion of causality comes into play. Causality in information theory refers to the timing of when information is available to parties in a coding system. The first part of the thesis studies the error exponent (or reliability function) for several communication problems over…

  12. Beacon- and Schema-Based Method for Recognizing Algorithms from Students' Source Code

    ERIC Educational Resources Information Center

    Taherkhani, Ahmad; Malmi, Lauri

    2013-01-01

    In this paper, we present a method for recognizing algorithms from students' programming submissions coded in Java. The method is based on the concept of "programming schemas" and "beacons". Schemas are high-level programming knowledge with detailed knowledge abstracted out, and beacons are statements that imply specific…

  13. Proceedings of the First NASA Ada Users' Symposium

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Ada has the potential to be a part of the most significant change in software engineering technology within NASA in the last twenty years. Thus, it is particularly important that all NASA centers be aware of Ada experience and plans at other centers. Ada activities across NASA are covered, with presenters representing five of the nine major NASA centers and the Space Station Freedom Program Office. Projects discussed included - Space Station Freedom Program Office: the implications of Ada on training, reuse, management and the software support environment; Johnson Space Center (JSC): early experience with the use of Ada, software engineering and Ada training and the evaluation of Ada compilers; Marshall Space Flight Center (MSFC): university research with Ada and the application of Ada to Space Station Freedom, the Orbital Maneuvering Vehicle, the Aero-Assist Flight Experiment and the Secure Shuttle Data System; Lewis Research Center (LeRC): the evolution of Ada software to support the Space Station Power Management and Distribution System; Jet Propulsion Laboratory (JPL): the creation of a centralized Ada development laboratory and current applications of Ada including the Real-time Weather Processor for the FAA; and Goddard Space Flight Center (GSFC): experiences with Ada in the Flight Dynamics Division and the Extreme Ultraviolet Explorer (EUVE) project and the implications of GSFC experience for Ada use in NASA. Despite the diversity of the presentations, several common themes emerged from the program: Methodology - NASA experience in general indicates that the effective use of Ada requires modern software engineering methodologies; Training - It is the software engineering principles and methods that surround Ada, rather than Ada itself, which requires the major training effort; Reuse - Due to training and transition costs, the use of Ada may initially actually decrease productivity, as was clearly found at GSFC; and real-time work at LeRC, JPL and GSFC shows…

  14. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1997-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
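The double-difference operation described in this record can be sketched in a few lines. The sketch below is an illustrative reconstruction from the abstract alone, not the patented implementation: a cross-delta between two correlated data sets followed by an adjacent-delta along the result, plus the inverse post-decoding step.

```python
def double_difference(a, b):
    """Pre-coding: cross-delta between the two sets, then adjacent-delta."""
    cross = [bi - ai for ai, bi in zip(a, b)]
    return [cross[0]] + [cross[i] - cross[i - 1] for i in range(1, len(cross))]

def recover_second(a, dd):
    """Inverse post-decoding: a cumulative sum undoes the adjacent-delta,
    then adding the first set undoes the cross-delta."""
    cross, acc = [], 0
    for d in dd:
        acc += d
        cross.append(acc)
    return [ai + ci for ai, ci in zip(a, cross)]

band1 = [10, 12, 15, 15, 14]   # e.g. one spectral band
band2 = [11, 14, 18, 17, 15]   # an adjacent, correlated band
dd = double_difference(band1, band2)
print(dd)                       # small residuals are cheaper to entropy-code
assert recover_second(band1, dd) == band2
```

Because the two inputs are correlated, the double-difference values cluster near zero, which is what makes the subsequent entropy (or lossy) coding stage more effective.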

  15. An Ada inference engine for expert systems

    NASA Technical Reports Server (NTRS)

    Lavallee, David B.

    1986-01-01

    The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production rules. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and creating a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.
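A rule graph of the kind described, with AND/OR/NOT nodes over an acyclic structure, can be evaluated by goal-directed (backward-chaining) recursion. The Ada prototype itself is not reproduced here; the sketch below (in Python, for brevity) is an assumed minimal structure with invented node names.

```python
def evaluate(node, facts, graph):
    """Backward-chain from a goal node of an acyclic rule graph.

    facts maps leaf names to booleans; graph maps internal node names to
    (operator, children) pairs, where operator is "AND", "OR", or "NOT".
    """
    if node in facts:                       # leaf: a known fact
        return facts[node]
    op, children = graph[node]
    vals = [evaluate(c, facts, graph) for c in children]
    if op == "AND":
        return all(vals)
    if op == "OR":
        return any(vals)
    if op == "NOT":
        return not vals[0]
    raise ValueError(f"unknown operator {op!r}")

# Hypothetical rule base: alarm fires when the sensor is hot and we are
# not in test mode.
graph = {
    "alarm": ("AND", ["sensor_hot", "not_test_mode"]),
    "not_test_mode": ("NOT", ["test_mode"]),
}
facts = {"sensor_hot": True, "test_mode": False}
print(evaluate("alarm", facts, graph))
```

Forward chaining would instead propagate newly established facts upward through the same graph; the acyclicity guarantees both directions terminate.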

  16. Applicability of the MCNPX particle transport code for determination of the source correction effect in positron lifetime measurements on thin polymer films

    SciTech Connect

    J.M. Urban-Klaehn

    2007-09-01

    The method presented herein uses the MCNPX Monte Carlo particle transport code to track individual positrons and other particles through geometry that accounts for the detectors, backing foils, samples and sources with their actual sizes, positions and material characteristics. Polymer material, polydimethylsiloxane (PDMS), with different thickness of films served as samples. The excellent agreement between the experimental results and the MCNPX simulation of source correction effects for varied positron sources and different film thicknesses validates the applicability of the MCNPX code.

  17. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    NASA Astrophysics Data System (ADS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameters databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization that would be specific to any particular application field, its use could also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations, or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements.

  18. Atomic Data and Modelling for Fusion: the ADAS Project

    SciTech Connect

    Summers, H. P.; O'Mullane, M. G.

    2011-05-11

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on 'lifting the fundamental data baseline' - a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  19. Atomic Data and Modelling for Fusion: the ADAS Project

    NASA Astrophysics Data System (ADS)

    Summers, H. P.; O'Mullane, M. G.

    2011-05-01

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on `lifting the fundamental data baseline'—a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  20. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  1. Ada (Trade Name) Bibliography. Volume 2.

    DTIC Science & Technology

    1984-03-01

    OCR-damaged abstract. Recoverable fragments describe the bibliography's entry format (journal information is given for every journal entry; publisher information appears if the document is a textbook) and a sample entry: Rakitin, Steven R., "The Ada Language System Project," 6th Int'l Conf. on Software Engineering: Poster Session, pp. 49-50, 09/16/82; availability (NTIS, 5285 Port Royal Rd, Springfield, VA) and sponsor (U.S. Army Communications R&D Command, Ft. Monmouth, NJ) fields are also visible.

  2. Knowledge representation into Ada parallel processing

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Babikyan, Carol; Harper, Richard

    1990-01-01

    The Knowledge Representation into Ada Parallel Processing project is a joint NASA and Air Force funded project to demonstrate the execution of intelligent systems in Ada on the Charles Stark Draper Laboratory fault-tolerant parallel processor (FTPP). Two applications were demonstrated - a portion of the adaptive tactical navigator and a real time controller. Both systems are implemented as Activation Framework Objects on the Activation Framework intelligent scheduling mechanism developed by Worcester Polytechnic Institute. The implementations, results of performance analyses showing speedup due to parallelism and initial efficiency improvements are detailed and further areas for performance improvements are suggested.

  3. Ada (Trade Name) Bibliography. Volume 1.

    DTIC Science & Technology

    1983-05-01

    OCR-damaged abstract. Recoverable fragments include report keywords (architecture, programming) and a passage mentioning the Language Structure Editor (GLSE) and the meta-tool called PEGASYS; GLSE operates on a class of languages that can be described with an abstract syntax (Ada and …).

  4. ADA Integrated Environment I. System Specification.

    DTIC Science & Technology

    1981-12-01

    OCR-damaged excerpt from the specification. Recoverable fragments describe requirements to build and develop (and maintain) Ada programs by linking (and maintaining) collections of separate Ada …, and to perform a program build by selecting a consistent set of library units for input to the Linker, with only those units actually used included.

  5. Gamma ray observatory dynamics simulator in Ada (GRODY)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This experiment involved the parallel development of dynamics simulators for the Gamma Ray Observatory in both FORTRAN and Ada for the purpose of evaluating the applicability of Ada to the NASA/Goddard Space Flight Center's flight dynamics environment. The experiment successfully demonstrated that Ada is a viable, valuable technology for use in this environment. In addition to building a simulator, the Ada team evaluated training approaches, developed an Ada methodology appropriate to the flight dynamics environment, and established a baseline for evaluating future Ada projects.

  6. Gamma ray observatory dynamics simulator in Ada (GRODY)

    SciTech Connect

    Not Available

    1990-09-01

    This experiment involved the parallel development of dynamics simulators for the Gamma Ray Observatory in both FORTRAN and Ada for the purpose of evaluating the applicability of Ada to the NASA/Goddard Space Flight Center's flight dynamics environment. The experiment successfully demonstrated that Ada is a viable, valuable technology for use in this environment. In addition to building a simulator, the Ada team evaluated training approaches, developed an Ada methodology appropriate to the flight dynamics environment, and established a baseline for evaluating future Ada projects.

  7. Ada in AI or AI in Ada. On developing a rationale for integration

    NASA Technical Reports Server (NTRS)

    Collard, Philippe E.; Goforth, Andre

    1988-01-01

    The use of Ada as an Artificial Intelligence (AI) language is gaining interest in the NASA Community, i.e., by parties who have a need to deploy Knowledge Based-Systems (KBS) compatible with the use of Ada as the software standard for the Space Station. A fair number of KBS and pseudo-KBS implementations in Ada exist today. Currently, no widely used guidelines exist to compare and evaluate these with one another. The lack of guidelines illustrates a fundamental problem inherent in trying to compare and evaluate implementations of any sort in languages that are procedural or imperative in style, such as Ada, with those in languages that are functional in style, such as Lisp. Discussed are the strengths and weaknesses of using Ada as an AI language, and a preliminary analysis is provided of factors needed for the development of criteria for the integration of these two families of languages and the environments in which they are implemented. The intent for developing such criteria is to have a logical rationale that may be used to guide the development of Ada tools and methodology to support KBS requirements, and to identify those AI technology components that may most readily and effectively be deployed in Ada.

  8. BIOTC: An open-source CFD code for simulating biomass fast pyrolysis

    NASA Astrophysics Data System (ADS)

    Xiong, Qingang; Aramideh, Soroush; Passalacqua, Alberto; Kong, Song-Charng

    2014-06-01

    The BIOTC code is a computer program that combines a multi-fluid model for multiphase hydrodynamics and global chemical kinetics for chemical reactions to simulate fast pyrolysis of biomass at reactor scale. The object-oriented characteristic of BIOTC makes it easy for researchers to insert their own sub-models, while the user-friendly interface provides users a friendly environment as in commercial software. A laboratory-scale bubbling fluidized bed reactor for biomass fast pyrolysis was simulated using BIOTC to demonstrate its capability.

  9. Two Drosophila Ada2 Homologues Function in Different Multiprotein Complexes

    PubMed Central

    Kusch, Thomas; Guelman, Sebastián; Abmayr, Susan M.; Workman, Jerry L.

    2003-01-01

    The reversible acetylation of the N-terminal tails of histones is crucial for transcription, DNA repair, and replication. The enzymatic reaction is catalyzed by large multiprotein complexes, of which the best characterized are the Gcn5-containing N-acetyltransferase (GNAT) complexes. GNAT complexes from yeast to humans share several conserved subunits, such as Ada2, Ada3, Spt3, and Tra1/TRRAP. We have characterized these factors in Drosophila and found that the flies have two distinct Ada2 variants (dAda2a and dAda2b). Using a combination of biochemical and cell biological approaches we demonstrate that only one of the two Drosophila Ada2 homologues, dAda2b, is a component of Spt-Ada-Gcn5-acetyltransferase (SAGA) complexes. The other Ada2 variant, dAda2a, can associate with dGcn5 but is not incorporated into dSAGA-type complexes. This is the first example of a complex-specific association of the Ada-type transcriptional adapter proteins with GNATs. In addition, dAda2a is part of Gcn5-independent complexes, which are concentrated at transcriptionally active regions on polytene chromosomes. This implicates novel functions for dAda2a in transcription. Humans and mice also possess two Ada2 variants with high homology to dAda2a and dAda2b, respectively. This suggests that the mammalian and fly homologues of the transcriptional adapter Ada2 form two functionally distinct subgroups with unique characteristics. PMID:12697829

  10. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.
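The stride-minimization rule of thumb mentioned above can be illustrated with a loop-interchange example. This sketch (in Python, where the cache effect itself is not observable; in a compiled language the unit-stride order is typically the faster one) shows the transformation, not a benchmark: both traversals compute the same result, but only the first accesses a row-major array with stride 1.

```python
def sum_unit_stride(a):
    """Row-major traversal: consecutive accesses are adjacent in memory."""
    total = 0
    for row in a:
        for x in row:
            total += x
    return total

def sum_large_stride(a):
    """Column-first traversal: each access jumps a full row ahead."""
    total = 0
    rows, cols = len(a), len(a[0])
    for j in range(cols):
        for i in range(rows):
            total += a[i][j]
    return total

a = [[i * 10 + j for j in range(4)] for i in range(3)]
assert sum_unit_stride(a) == sum_large_stride(a)
```

The paper's point is precisely that such textbook transformations, while sound in principle, have inconsistent payoffs across real cache hierarchies.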

  11. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  12. The ADA Mandate for Social Change.

    ERIC Educational Resources Information Center

    Wehman, Paul, Ed.

    This book analyzes the effectiveness and implications for social change of the Americans with Disabilities Act (ADA). It outlines several issues--legal implications, physical accessibility, transportation options, employment opportunities, and recreation--that stimulate community action for full inclusion. Part I, titled "Definitions and…

  13. Is Your Queuing System ADA-Compliant?

    ERIC Educational Resources Information Center

    Lawrence, David

    2002-01-01

    Discusses the Americans with Disabilities (ADA) and Uniform Federal Accessibility Standards (UFAS) regulations regarding public facilities' crowd control stanchions and queuing systems. The major elements are protruding objects and wheelchair accessibility. Describes how to maintain compliance with the regulations and offers a list of additional…

  14. AdaNET prototype library administration manual

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The functions of the AdaNET Prototype Library of Reusable Software Parts are described. Adapted from the Navy Research Laboratory's Reusability Guidebook (V.5.0), this is a working document, customized for use by the AdaNET Project. Within this document, the term part is used to denote the smallest unit controlled by a library and retrievable from it. A part may have several constituents, which may not be individually tracked. Presented are the types of parts which may be stored in the library and the relationships among those parts; a concept of trust indicators which provide measures of confidence that a user of a previously developed part may reasonably apply to a part for a new application; search and retrieval, configuration management, and communications among those who interact with the AdaNET Prototype Library; and the AdaNET Prototype, described from the perspective of its three major users: the part reuser and retriever, the part submitter, and the librarian and/or administrator.

  15. SMILEI: A collaborative, open-source, multi-purpose PIC code for the next generation of super-computers

    NASA Astrophysics Data System (ADS)

    Grech, Mickael; Derouillat, J.; Beck, A.; Chiaramello, M.; Grassi, A.; Niel, F.; Perez, F.; Vinci, T.; Fle, M.; Aunai, N.; Dargent, J.; Plotnikov, I.; Bouchard, G.; Savoini, P.; Riconda, C.

    2016-10-01

    Over the last decades, Particle-In-Cell (PIC) codes have been central tools for plasma simulations. Today, new trends in High-Performance Computing (HPC) are emerging, dramatically changing HPC-relevant software design and putting some - if not most - legacy codes far beyond the level of performance expected on the new and future massively-parallel supercomputers. SMILEI is a new open-source PIC code co-developed by both plasma physicists and HPC specialists, and applied to a wide range of physics-related studies: from laser-plasma interaction to astrophysical plasmas. It benefits from an innovative parallelization strategy that relies on a super-domain-decomposition allowing for enhanced cache use and efficient dynamic load balancing. Beyond these HPC-related developments, SMILEI also benefits from additional physics modules allowing it to deal with binary collisions, field and collisional ionization, and radiation back-reaction. This poster presents the SMILEI project, its HPC capabilities, and illustrates some of the physics problems tackled with SMILEI.

  16. Semi-device-independent randomness expansion with partially free random sources using 3→1 quantum random access code

    NASA Astrophysics Data System (ADS)

    Zhou, Yu-Qian; Gao, Fei; Li, Dan-Dan; Li, Xin-Hui; Wen, Qiao-Yan

    2016-09-01

    We have proved that new randomness can be certified by partially free sources using 2→1 quantum random access code (QRAC) in the framework of semi-device-independent (SDI) protocols [Y.-Q. Zhou, H.-W. Li, Y.-K. Wang, D.-D. Li, F. Gao, and Q.-Y. Wen, Phys. Rev. A 92, 022331 (2015), 10.1103/PhysRevA.92.022331]. To improve the effectiveness of the randomness generation, here we propose the SDI randomness expansion using 3→1 QRAC and obtain the corresponding classical and quantum bounds of the two-dimensional quantum witness. Moreover, we get the condition which should be satisfied by the partially free sources to successfully certify new randomness, and the analytic relationship between the certified randomness and the two-dimensional quantum witness violation.
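As background to the classical bound mentioned above: in a 3→1 random access code, Alice encodes three bits into a single bit and Bob must guess one uniformly chosen input bit. The brute-force check below (an illustrative aside, not taken from the paper) enumerates every deterministic encoding/decoding pair and recovers the known classical optimum of 3/4, achieved e.g. by sending the majority bit; shared randomness cannot improve on the best deterministic strategy for the average success probability.

```python
from itertools import product

inputs = list(product([0, 1], repeat=3))         # Alice's 8 possible inputs
best = 0.0
for enc in product([0, 1], repeat=8):            # encoding: 3 bits -> 1 bit
    for dec in product([0, 1], repeat=6):        # decoding: (message, index) -> guess
        correct = 0
        for k, x in enumerate(inputs):
            m = enc[k]                           # the single transmitted bit
            for y in range(3):                   # Bob's requested bit index
                if dec[m * 3 + y] == x[y]:
                    correct += 1
        best = max(best, correct / (8 * 3))      # average over inputs and indices
print(best)  # 0.75
```

The quantum strategy using a single qubit exceeds this bound, and it is that gap which the two-dimensional quantum witness in the SDI protocol exploits.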

  17. Evaluation of ADA gene expression and transduction efficiency in ADA/SCID patients undergoing gene therapy.

    PubMed

    Carlucci, F; Tabucchi, A; Aiuti, A; Rosi, F; Floccari, F; Pagani, R; Marinello, E

    2004-10-01

    A capillary electrophoresis (CE) method was developed for ADA/SCID diagnosis and monitoring of enzyme replacement therapy, as well as for exploring the transfection efficiency for different retroviral vectors in gene therapy.

  18. Rehosting and retargeting an Ada compiler: A design study

    NASA Technical Reports Server (NTRS)

    Robinson, Ray

    1986-01-01

    The goal of this study was to develop a plan for rehosting and retargeting the Air Force Armaments Lab. Ada cross compiler. This compiler was validated in Sept. 1985 using ACVC 1.6, is written in Pascal, is hosted on a CDC Cyber 170, and is targeted to an embedded Zilog Z8002. The study was performed to determine the feasibility, cost, time, and tasks required to retarget the compiler to a DEC VAX 11/78x and rehost it to an embedded U.S. Navy AN/UYK-44 computer. Major tasks identified were rehosting the compiler front end, rewriting the back end (code generator), translating the run time environment from Z8002 assembly language to AN/UYK-44 assembly language, and developing a library manager.

  19. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    The Geant4 Monte Carlo code has been used to conceive and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model issued from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low energy bremsstrahlung photons with beryllium material. A benchmark test showed that a good agreement was achieved when comparing the emitted neutron flux spectra predicted by the Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two-stage Monte Carlo simulation. In the first stage, the distributions of the seven phase space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage, events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 1010 neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 109 neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.

  20. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked against the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations are included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  1. Ada (Trade Name) Compiler Validation Summary Report: Intermetrics, Inc. I2Ada Compiler, Version 17.08 for the IBM 370 Architecture under UTS 2.3.

    DTIC Science & Technology

    1985-12-10

    SPEC-AE.ADA P REPORTBODY-B.ADA P *CHECK FILE-B.ADA P CZ110A-AB.ADA *CZ1101A-AB.ADA P CZ1102A-B.ADA P *CZ121A-B.ADA P CZ1201A-AB.ADA P CZ1201B-AB.ADA P...B55A01E-AB.ADA P B59001H-AB.ADA P B52003B-AB.ADA P B55A01F-AB.ADA P B59001I-AB.ADA P B52003C-AB.ADA P B55A01G-AB.ADA P C51002A-AB.ADA P *B52004A...AB.ADA P C95012A-B.ADA P C96005B-B.TST P C97203E-AB.ADA P *C95013A-B.ADA P C96005C-B.TST P C97204A-B.ADA P C95021A-B.ADA P C96005D-B.ADA P C97303A-AB.ADA

  2. Preliminary Version: Ada (Trade Name)/SQL: A Standard, Portable Ada-DBMS Interface.

    DTIC Science & Technology

    1987-04-01

    process the SQL functions of SELECT, UPDATE, INSERT, and DELETE. 2.4. Application and Tool Portability Concerns Ada/SQL is more than just an interface...database to the underlying DBMS. Once the database has been defined, the application programs may use Ada/SQL statements to process the data stored...totally transportable. Output can also be targeted for bulk load of a database, if warranted by the data volumes and processing speed. As already noted

  3. The computerization of programming: Ada (R) lessons learned

    NASA Technical Reports Server (NTRS)

    Struble, Dennis D.

    1986-01-01

    One of the largest systems yet written in Ada has been constructed. This system is the Intermetrics Ada compiler. Many lessons have been learned during the implementation of this Ada compiler. Some of these lessons, concentrating on those relevant to large system implementations, are described. The characteristics of the Ada compiler implementation project at Intermetrics are also described. Some specific experiences during the implementation are pointed out.

  4. Immunologic reconstitution during PEG-ADA therapy in an unusual mosaic ADA deficient patient.

    PubMed

    Liu, Ping; Santisteban, Ines; Burroughs, Lauri M; Ochs, Hans D; Torgerson, Troy R; Hershfield, Michael S; Rawlings, David J; Scharenberg, Andrew M

    2009-02-01

    We report detailed genetic and immunologic studies in a patient diagnosed with adenosine deaminase (ADA) deficiency and combined immune deficiency at age 5 years. At the time of diagnosis, although all other lymphocyte subsets were depleted, circulating CD8(+) T cells with a terminally differentiated phenotype were abundant and expressed normal ADA activity due to a reversion mutation in a CD8(+) T cell or precursor. Over the first 9 months of replacement therapy with PEG-ADA, the patient steadily accumulated mature naïve CD4(+) and CD8(+) T cells, as well as CD4(+)/FOXP3(+) regulatory T cells, consistent with restoration of a functional cellular immune system. While CD19(+) naïve B cells also accumulated in response to PEG-ADA therapy, a high proportion of these B cells exhibited an immature surface marker phenotype even after 9 months, and immunization with the neoantigen bacteriophage φX174 demonstrated a markedly subnormal humoral immune response. Our observations in this single patient have important implications for gene therapy of human ADA deficiency, as they indicate that ADA expression within even a large circulating lymphocyte population may not be sufficient to support adequate immune reconstitution. They also suggest that an immature surface marker phenotype of the peripheral B cell compartment may be a useful surrogate marker for incomplete humoral immune reconstitution during enzyme replacement, and possibly other forms of hematopoietic cell therapies.

  5. Knowledge, programming, and programming cultures: LISP, C, and Ada

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    The results of research 'Ada as an implementation language for knowledge based systems' are presented. The purpose of the research was to compare Ada to other programming languages. The report focuses on the programming languages Ada, C, and Lisp, the programming cultures that surround them, and the programming paradigms they support.

  6. Homozygosity for a novel adenosine deaminase (ADA) nonsense mutation (Q3>X) in a child with severe combined immunodeficiency (SCID)

    SciTech Connect

    Santisteban, I.; Arredondo-Vega, F.X.; Kelly, S.

    1994-09-01

    A Somali girl was diagnosed with ADA-deficient SCID at 7 mo; she responded well to PEG-ADA replacement and is now 3.3 yr old. ADA mRNA was undetectable (Northern) in her cultured T cells, but was present in T cells of her parents and two sibs. All PCR-amplified exon 1 genomic clones from the patient had a C>T transition at bp 7 relative to the start of translation, replacing Gln at codon 3 (CAG) with a termination codon (TAG, Q3>X). Patient cDNA (prepared by RT-PCR with a 5′ primer that covered codons 1-7) had a previously described polymorphism, K80>R, but was otherwise normal, indicating that no other coding mutations were present. A predicted new genomic BfaI restriction site was used to establish her homozygosity for Q3>X and to analyze genotypes of family members. We also analyzed the segregation of a variable Alu polyA-associated TAAA repeat (AluVpA) situated 5′ of the ADA gene. Three different AluVpA alleles were found, one of which was only present in the father and was not associated with his Q3>X allele. Because the father's RBCs had only ~15% of normal ADA activity, we analyzed his ADA cDNA. We found a G>A transition at bp 425 that substitutes Gln for Arg142, a solvent-accessible residue, and eliminates a BsmAI site in exon 5. ADA activity of the R142>Q in vitro translation product was 20-25% of wild type ADA translation product, suggesting that R142>Q is a new 'partial' ADA deficiency mutation. As expected, Q3>X mRNA did not yield a detectable in vitro translation product. We conclude that the patient's father is a compound heterozygote carrying the ADA Q3>X/R142>Q genotype. 'Partial' ADA deficiency unassociated with immunodeficiency is relatively common in individuals of African descent. The present findings and previous observations suggest that 'partial' ADA deficiency may have had an evolutionary advantage.
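
    The genotyping logic above rests on two standard facts: Gln codons are CAA/CAG, so a C>T transition at the first codon position yields a TAA/TAG stop, and BfaI recognizes the site CTAG. A minimal Python sketch, using a hypothetical nine-base stand-in for the start of the coding sequence (the real ADA sequence is not reproduced here):

```python
# Minimal codon handling for a Q3>X-style analysis (illustrative only).
STOP_CODONS = {"TAA", "TAG", "TGA"}

def first_stop(seq):
    """Return the 1-based number of the first in-frame stop codon, or None."""
    for i in range(0, len(seq) - 2, 3):
        if seq[i:i + 3] in STOP_CODONS:
            return i // 3 + 1
    return None

def has_bfai_site(seq):
    """BfaI recognizes CTAG."""
    return "CTAG" in seq

# Hypothetical first nine coding bases: Met-Ala-Gln.
wild_type = "ATGGCCCAG"
mutant = wild_type[:6] + "T" + wild_type[7:]   # C>T at bp 7 -> ATGGCCTAG

assert first_stop(wild_type) is None
assert first_stop(mutant) == 3                  # codon 3 becomes the TAG stop
assert not has_bfai_site(wild_type) and has_bfai_site(mutant)
```

    The same transition that creates the stop codon also creates the CTAG site, which is why a single restriction digest can screen family members for the allele.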

  7. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 1; Analysis and Results

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides a study of rotor and stator scattering using the SOURCE3D Rotor Wake/Stator Interaction Code. SOURCE3D is a quasi-three-dimensional computer program that uses three-dimensional acoustics and two-dimensional cascade load response theory to calculate rotor and stator modal reflection and transmission (scattering) coefficients. SOURCE3D is at the core of the TFaNS (Theoretical Fan Noise Design/Prediction System), developed for NASA, which provides complete fully coupled (inlet, rotor, stator, exit) noise solutions for turbofan engines. The reason for studying scattering is that we must first understand the behavior of the individual scattering coefficients provided by SOURCE3D, before eventually understanding the more complicated predictions from TFaNS. To study scattering, we have derived a large number of scattering curves for vane and blade rows. The curves are plots of output wave power divided by input wave power (in dB units) versus vane/blade ratio. Some of these plots are shown in this report. All of the plots are provided in a separate volume. To assist in understanding the plots, formulas have been derived for special vane/blade ratios for which wavefronts are either parallel or normal to rotor or stator chords. From the plots, we have found that, for the most part, there was strong transmission and weak reflection over most of the vane/blade ratio range for the stator. For the rotor, there was little transmission loss.
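
    The quantity plotted in the scattering curves is simply the output-to-input wave power ratio expressed in decibels. A minimal Python helper, with illustrative numbers rather than values from the report:

```python
import math

def scattering_db(p_out, p_in):
    """Scattering coefficient as plotted: output wave power over
    input wave power, expressed in dB (negative means attenuation)."""
    return 10.0 * math.log10(p_out / p_in)

# Illustrative numbers only: strong transmission, weak reflection.
assert round(scattering_db(0.9, 1.0), 2) == -0.46    # near-lossless transmission
assert round(scattering_db(0.01, 1.0), 6) == -20.0   # weak reflection
```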

  8. Retroposition as a source of antisense long non-coding RNAs with possible regulatory functions.

    PubMed

    Bryzghalov, Oleksii; Szcześniak, Michał Wojciech; Makałowska, Izabela

    2016-01-01

    Long non-coding RNAs (lncRNAs) are a class of intensely studied, yet enigmatic molecules that make up a substantial portion of the human transcriptome. In this work, we link the origins and functions of some lncRNAs to retroposition, a process resulting in the creation of intronless copies (retrocopies) of the so-called parental genes. We found 35 human retrocopies transcribed in antisense and giving rise to 58 lncRNA transcripts. These lncRNAs share sequence similarity with the corresponding parental genes but in the sense/antisense orientation, meaning they have the potential to interact with each other and to form RNA:RNA duplexes. We took a closer look at these duplexes and found that 10 of the lncRNAs might regulate parental gene expression and processing at the pre-mRNA and mRNA levels. Further analysis of the co-expression and expression correlation provided support for the existence of functional coupling between lncRNAs and their mate parental gene transcripts.
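
    The sense/antisense relationship described above can be tested computationally by asking whether a stretch of the lncRNA is the reverse complement of part of the parental transcript. A toy Python sketch with invented sequences, not the 58 transcripts of the study:

```python
COMPLEMENT = str.maketrans("ACGU", "UGCA")

def revcomp(rna):
    """Reverse complement of an RNA sequence."""
    return rna.translate(COMPLEMENT)[::-1]

def can_duplex(lncrna, mrna, min_len=8):
    """True if the antisense lncRNA carries a stretch whose reverse
    complement appears in the parental transcript (duplex potential)."""
    for i in range(len(lncrna) - min_len + 1):
        if revcomp(lncrna[i:i + min_len]) in mrna:
            return True
    return False

# Toy parental mRNA fragment and an antisense-derived lncRNA:
mrna = "AUGGCUAGCUAGGUACGGAU"
lnc = revcomp(mrna[4:16])            # antisense copy of an internal stretch
assert can_duplex(lnc, mrna)
assert not can_duplex("AAAAAAAAAA", mrna)
```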

  9. Ada (Trade Name) Compiler Validation Summary Report: Harris Corporation. Harris Ada Compiler, Version 3.1. Harris H1200.

    DTIC Science & Technology

    1987-06-03

    DATE Ada Joint Program Office 3 June 1987 United States Department of Defense 13. NUMBER OF PAGES Washington, DC 20301-3081 37 14. MONITORING...Prepared By: Ada Validation Facility ASD/SCOL Wright-Patterson AFB OH 45433-6503 Prepared For: Ada Joint Program Office United States Department of Defense Washington, D.C. Ada is a registered trademark of the United States Government (Ada Joint Program Office).

  10. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness that brings marked disability among elderly individuals. Patients living with dementia also experience non-cognitive symptoms at times, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the use of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. To date, however, there has been a paucity of M-health innovations in this area, and most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they made use of an open source code repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers build smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to innovate further, as less time is spent in the iterative process.

  11. Languages for artificial intelligence: Implementing a scheduler in LISP and in Ada

    NASA Technical Reports Server (NTRS)

    Hays, Dan

    1988-01-01

    A prototype scheduler for space experiments, originally programmed in a dialect of LISP using some of the more traditional techniques of that language, was recast using an object-oriented LISP, Common LISP with Flavors on the Symbolics. This object-structured version was in turn partially implemented in Ada. The Flavors version showed a decided improvement in both speed of execution and readability of code. The recasting into Ada involved various practical problems of implementation as well as certain challenges of reconceptualization in going from one language to the other. Advantages were realized, however, in greater clarity of the code, especially where more standard flow of control was used. This exercise raised issues about the influence of programming language on the design of flexible and sensitive programs such as schedule planners, and called attention to the importance of factors external to the languages themselves, such as system embeddedness, hardware context, and programmer practice.

  12. Self characterization of a coded aperture array for neutron source imaging.

    PubMed

    Volegov, P L; Danly, C R; Fittinghoff, D N; Guler, N; Merrill, F E; Wilde, C H

    2014-12-01

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (∼100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF.

  13. Self characterization of a coded aperture array for neutron source imaging

    NASA Astrophysics Data System (ADS)

    Volegov, P. L.; Danly, C. R.; Fittinghoff, D. N.; Guler, N.; Merrill, F. E.; Wilde, C. H.

    2014-12-01

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (˜100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF.

  14. Self characterization of a coded aperture array for neutron source imaging

    SciTech Connect

    Volegov, P. L.; Danly, C. R.; Guler, N.; Merrill, F. E.; Wilde, C. H.; Fittinghoff, D. N.

    2014-12-15

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (∼100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF.

  15. Ada and software management in NASA: Assessment and recommendations

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Recent NASA missions have required software systems that are larger, more complex, and more critical than NASA software systems of the past. The Ada programming language and the software methods and support environments associated with it are seen as potential breakthroughs in meeting NASA's software requirements. The findings of a study by the Ada and Software Management Assessment Working Group (ASMAWG) are presented. The study was chartered to perform three tasks: (1) assess the agency's ongoing and planned Ada activities; (2) assess the infrastructure (standards, policies, and internal organizations) supporting software management and the Ada activities; and (3) present an Ada implementation and use strategy appropriate for NASA over the next 5 years.

  16. Ada (Trade Name)/SQL (Structured Query Language) Binding Specification

    DTIC Science & Technology

    1988-06-01

    which it is contained. 6) A library package is an <authorization package>; a <schema package>; a <global variable package>; the Ada/SQL definition...Ada/SQL DML unit>s, however, may reference arbitrary Ada library units. SRld is designed to enable Ada/SQL automated tools to readily determine...restriction is designed to minimize confusion about what is being referenced, as well as to simplify the development of Ada/SQL automated tools. 3) A non

  17. Ada Compiler Validation Summary Report, Dansk Datamatik Center, VAX 11 Compiler Version 1.1.

    DTIC Science & Technology

    1984-11-06

    B37303A.ADA P B37307B-AB.ADA P B37309B-AB.ADA P B37310B-B.ADA P B37311A-AB.ADA P B38001A.ADA P B38003A-AB.ADA P B38008A-B.ADA P B38008B-AB.ADA P B38101A-B.ADA...B45208C-B.ADA P B45208G-AB.ADA P B45208H-B.ADA P B45208I-B.ADA P B45208M-AB.ADA P B45208N-AB.ADA P B45208S-AB.ADA P B45208T-AB.ADA P B45261A-AB.ADA P...B55A01C-AB.ADA P B55A01D-AB.ADA P B55A01E-AB.ADA P B55A01F-AB.ADA P B55A01G-AB.ADA P B55A01H-AB.ADA P B55A01I-AB.ADA P B55A01J-AB.ADA P B55A01K-AB.ADA P

  18. Efficient Ada multitasking on a RISC register window architecture

    NASA Technical Reports Server (NTRS)

    Kearns, J. P.; Quammen, D.

    1987-01-01

    This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.

  19. Ada for Embedded Systems: Issues and Questions.

    DTIC Science & Technology

    1987-12-01

    appropriate order based on the task dependency hierarchy, which the ART maintains and adds to when tasks are created [Reino 86]. The master-slave, parent...termination are needed [Reino 86]. Tasks provide a considerable overhead in the runtime system because the status must be maintained for context...Practitioners Approach. McGraw-Hill International, 1982. [Reino 86] Kurki-Suonio, R. An Operational Model for Ada Tasking. Technical Report 1/1986, Tampere

  20. Ada education in a software life-cycle context

    NASA Technical Reports Server (NTRS)

    Clough, Anne J.

    1986-01-01

    Some of the experience gained from a comprehensive educational program undertaken at The Charles Stark Draper Lab. to introduce the Ada language and to transition modern software engineering technology into the development of Ada and non-Ada applications is described. Initially, a core group, which included managers, engineers, and programmers, received training in Ada. An Ada Office was established to assume the major responsibility for training, evaluation, acquisition and benchmarking of tools, and consultation on Ada projects. As a first step in this process, an in-house educational program was undertaken to introduce Ada to the Laboratory. Later, a software engineering course was added to the educational program as the need to address issues spanning the entire software life cycle became evident. Educational efforts to date are summarized, with an emphasis on the educational approach adopted. Finally, lessons learned in administering this program are addressed.

  1. The Katydid system for compiling KEE applications to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  2. Ada (trademark) projects at NASA. Runtime environment issues and recommendations

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.; Wilke, Randall W.

    1988-01-01

    Ada practitioners should use this document to discuss and establish common short term requirements for Ada runtime environments. The major current Ada runtime environment issues are identified through the analysis of some of the Ada efforts at NASA and other research centers. The runtime environment characteristics of major compilers are compared while alternate runtime implementations are reviewed. Modifications and extensions to the Ada Language Reference Manual to address some of these runtime issues are proposed. Three classes of projects focusing on the most critical runtime features of Ada are recommended, including a range of immediately feasible full scale Ada development projects. Also, a list of runtime features and procurement issues is proposed for consideration by the vendors, contractors and the government.

  3. Connectivity Reveals Sources of Predictive Coding Signals in Early Visual Cortex During Processing of Visual Optic Flow.

    PubMed

    Schindler, Andreas; Bartels, Andreas

    2016-05-24

    Superimposed on the visual feed-forward pathway, feedback connections convey higher level information to cortical areas lower in the hierarchy. A prominent framework for these connections is the theory of predictive coding, where high-level areas send stimulus interpretations to lower level areas that compare them with sensory input. Along these lines, a growing body of neuroimaging studies shows that predictable stimuli lead to reduced blood oxygen level-dependent (BOLD) responses compared with matched nonpredictable counterparts, especially in early visual cortex (EVC) including areas V1-V3. The sources of these modulatory feedback signals are largely unknown. Here, we re-examined the robust finding of relative BOLD suppression in EVC evident during processing of coherent compared with random motion. Using functional connectivity analysis, we show an optic flow-dependent increase of functional connectivity between BOLD suppressed EVC and a network of visual motion areas including MST, V3A, V6, the cingulate sulcus visual area (CSv), and precuneus (Pc). Connectivity decreased between EVC and 2 areas known to encode heading direction: entorhinal cortex (EC) and retrosplenial cortex (RSC). Our results provide the first evidence that BOLD suppression in EVC for predictable stimuli is indeed mediated by specific high-level areas, in accord with the theory of predictive coding.

  4. SEL Ada reuse analysis and representations

    NASA Technical Reports Server (NTRS)

    Kester, Rush

    1990-01-01

    Overall, it was revealed that the pattern of Ada reuse has evolved from initial reuse of utility components into reuse of generalized application architectures. Utility components were both domain-independent utilities, such as queues and stacks, and domain-specific utilities, such as those that implement spacecraft orbit and attitude mathematical functions and physics or astronomical models. The level of reuse was significantly increased with the development of a generalized telemetry simulator architecture. The use of Ada generics significantly increased the level of verbatim reuse, owing to the ability to parameterize the aspects of a design that are configurable during reuse. A key factor in implementing generalized architectures was the ability to use generic subprogram parameters to tailor parts of the algorithm embedded within the architecture. The use of object oriented design (in which objects model real world entities) significantly improved the modularity for reuse. Encapsulating into packages the data and operations associated with common real world entities creates natural building blocks for reuse.
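
    Ada generic formal parameters, including generic subprogram parameters that tailor part of an embedded algorithm, have a rough Python analogue in a class parameterized by an element type and a callable. The sketch below is an invented illustration of that reuse pattern, not SEL code; the names are hypothetical.

```python
from typing import Callable, Generic, TypeVar

T = TypeVar("T")

class BoundedQueue(Generic[T]):
    """Reusable utility component, parameterized (as an Ada generic
    would be) by element type and by a subprogram that tailors part
    of the embedded algorithm -- here, the overflow policy."""

    def __init__(self, capacity: int, on_overflow: Callable[[T], None]):
        self._items: list[T] = []
        self._capacity = capacity
        self._on_overflow = on_overflow

    def put(self, item: T) -> None:
        if len(self._items) >= self._capacity:
            self._on_overflow(item)     # the configurable aspect of the design
        else:
            self._items.append(item)

    def get(self) -> T:
        return self._items.pop(0)

# "Instantiating the generic": int elements, drop-and-record overflow policy.
dropped: list[int] = []
q: BoundedQueue[int] = BoundedQueue(2, dropped.append)
for sample in (10, 20, 30):
    q.put(sample)
assert q.get() == 10 and dropped == [30]
```

    Each instantiation reuses the queue verbatim; only the type and the supplied subprogram change, which is the property the report credits for the increased level of verbatim reuse.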

  5. ADA interpretative system for image algebra

    NASA Astrophysics Data System (ADS)

    Murillo, Juan J.; Wilson, Joseph N.

    1992-06-01

    An important research problem in image processing is to find appropriate tools to support algorithm development. There have been efforts to build algorithm development support systems for image algebra in several languages, but these systems still have the disadvantage of the time-consuming algorithm development style associated with compilation-oriented programming. This paper starts with a description of the Run-Time Support Library (RTSL), which serves as the base for executing programs on both the Image Algebra Ada Translator (IAAT) and Image Algebra Ada Interpreter (IAAI). A presentation on the current status of IAAT and its capabilities is followed by a brief introduction to the utilization of the Image Display Manager (IDM) for image manipulation and analysis. We then discuss in detail the current development stage of IAAI and its relation with RTSL and IDM. The last section describes the design of a syntax-directed graphical user interface for IAAI. We close with an analysis of the current performance of IAAI, and future trends are discussed. Appendix A gives a brief introduction to Image Algebra (IA), and in Appendix B the reader is introduced to the Image Algebra Ada (IAA) grammar.

  6. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104)

    SciTech Connect

    Kress, T. S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  7. Comparison of Orbiter PRCS Plume Flow Fields Using CFD and Modified Source Flow Codes

    NASA Technical Reports Server (NTRS)

    Rochelle, Wm. C.; Kinsey, Robin E.; Reid, Ethan A.; Stuart, Phillip C.; Lumpkin, Forrest E.

    1997-01-01

    The Space Shuttle Orbiter will use Reaction Control System (RCS) jets for docking with the planned International Space Station (ISS). During approach and backout maneuvers, plumes from these jets could cause high pressure, heating, and thermal loads on ISS components. The object of this paper is to present comparisons of RCS plume flow fields used to calculate these ISS environments. Because of the complexities of 3-D plumes with variable scarf-angle and multi-jet combinations, NASA/JSC developed a plume flow-field methodology for all of these Orbiter jets. The RCS Plume Model (RPM), which includes effects of scarfed nozzles and dual jets, was developed as a modified source-flow engineering tool to rapidly generate plume properties and impingement environments on ISS components. This paper presents flow-field properties from four PRCS jets: F3U low scarf-angle single jet, F3F high scarf-angle single jet, DTU zero scarf-angle dual jet, and F1F/F2F high scarf-angle dual jet. The RPM results compared well with plume flow fields using four CFD programs: General Aerodynamic Simulation Program (GASP), Cartesian (CART), Unified Solution Algorithm (USA), and Reacting and Multi-phase Program (RAMP). Good comparisons of predicted pressures are shown with STS 64 Shuttle Plume Impingement Flight Experiment (SPIFEX) data.

  8. Designing HTS Roebel cables for low-field applications with open-source code

    NASA Astrophysics Data System (ADS)

    Grilli, Francesco; Zermeño, Victor M. R.; Kario, Anna

    2016-11-01

    In HTS Roebel cables for low-field applications the tightly packed strands produce a substantial self-field, which can be comparable to the background field the cable operates in. As a result, the self-field critical current (Ic) of a cable is lower than the value obtained by multiplying the self-field Ic of the strands by the number of strands. In addition, the in-field reduction of the critical current of the superconducting material plays an important role for a cable's Ic. Numerical models can accurately calculate the magnetic field distribution inside a cable and estimate its Ic. In this contribution we employ a recently developed open-source model to estimate the Ic of Roebel cables. In particular, we investigate the influence of the spacing between the superconducting layers, of the Jc(B, θ) dependence, and of the presence of an external magnetic field. For the present analysis, we consider tapes from two manufacturers, whose properties were characterized in our group.
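
    The self-field penalty described above can be illustrated with a toy fixed-point calculation (not the paper's numerical model): assume a Kim-like in-field dependence Jc(B) = Jc0/(1 + B/B0) and a self-field proportional to the transport current; the self-consistent cable Ic then falls below the naive N x strand-Ic estimate. All parameter values below are hypothetical.

```python
def jc_kim(b_tesla, jc0=1.0, b0=0.04):
    """Toy Kim-model in-field dependence: Jc(B) = Jc0 / (1 + B/B0)."""
    return jc0 / (1.0 + b_tesla / b0)

def cable_ic(n_strands, strand_ic0, tesla_per_amp):
    """Self-consistent cable Ic: the self-field grows with transport
    current, so iterate Ic = N * Ic0 * Jc(B_self(Ic)) / Jc(0) to its
    fixed point, with damped updates for stability."""
    ic = n_strands * strand_ic0              # naive estimate, no self-field
    for _ in range(200):
        ic_new = n_strands * strand_ic0 * jc_kim(tesla_per_amp * ic) / jc_kim(0.0)
        if abs(ic_new - ic) < 1e-9:
            break
        ic = 0.5 * (ic + ic_new)             # damped update
    return ic

naive = 10 * 100.0                   # 10 strands x 100 A self-field Ic each
coupled = cable_ic(10, 100.0, 5e-5)  # 0.05 mT per ampere of self-field (assumed)
```

With these invented numbers the coupled estimate settles near 580 A, well below the 1000 A the strand count alone would suggest, which is the qualitative effect the abstract describes.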

  9. Object-oriented programming with mixins in Ada

    NASA Technical Reports Server (NTRS)

    Seidewitz, Ed

    1992-01-01

    Recently, I wrote a paper discussing the lack of 'true' object-oriented programming language features in Ada 83, why one might desire them in Ada, and how they might be added in Ada 9X. The approach I took in this paper was to build the new object-oriented features of Ada 9X as much as possible on the basic constructs and philosophy of Ada 83. The object-oriented features proposed for Ada 9X, while different in detail, are based on the same kind of approach. Further consideration of this approach led me on a long reflection on the nature of object-oriented programming and its application to Ada. The results of this reflection, presented in this paper, show how a fairly natural object-oriented style can indeed be developed even in Ada 83. The exercise of developing this style is useful for at least three reasons: (1) it provides a useful style for programming object-oriented applications in Ada 83 until new features become available with Ada 9X; (2) it demystifies many of the mechanisms that seem to be 'magic' in most object-oriented programming languages by making them explicit; and (3) it points out areas that are and are not in need of change in Ada 83 to make object-oriented programming more natural in Ada 9X. In the next four sections I will address in turn the issues of object-oriented classes, mixins, self-reference and supertyping. The presentation is through a sequence of examples. This results in some overlap with that paper, but all the examples in the present paper are written entirely in Ada 83. I will return to considerations for Ada 9X in the last section of the paper.
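
    The mixin mechanism the paper builds explicitly in Ada 83 is easiest to see in a language with direct support. A minimal Python sketch of the general idea (illustrative only; not the paper's Ada constructs):

```python
class LoggingMixin:
    """A mixin adds behavior without being a standalone base class."""
    def log(self, msg):
        print(f"[{type(self).__name__}] {msg}")

class Stack:
    """The concrete class the mixin is combined with."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()

class LoggedStack(LoggingMixin, Stack):
    """Composition by inheritance: the mixin supplies log(), Stack the data."""
    def push(self, item):
        self.log(f"push {item!r}")
        super().push(item)

s = LoggedStack()
s.push(42)               # prints "[LoggedStack] push 42"
assert s.pop() == 42
```

The point of the paper is that this combination step, which Python's multiple inheritance performs implicitly, can be spelled out by hand with generics and derived types in Ada 83, demystifying the "magic".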

  10. Digital data sets that describe aquifer characteristics of the Vamoosa-Ada aquifer in east-central Oklahoma

    USGS Publications Warehouse

    Abbott, Marvin M.; Runkle, D.L.; Rea, Alan

    1997-01-01

    Nonproprietary format files. This diskette contains digitized aquifer boundaries and maps of hydraulic conductivity, recharge, and ground-water level elevation contours for the Vamoosa-Ada aquifer in east-central Oklahoma. The Vamoosa-Ada aquifer is an important source of water that underlies about 2,320 square miles of parts of Osage, Pawnee, Payne, Creek, Lincoln, Okfuskee, and Seminole Counties. Approximately 75 percent of the water withdrawn from the Vamoosa-Ada aquifer is for municipal use. Rural domestic use and water for stock animals account for most of the remaining water withdrawn. The Vamoosa-Ada aquifer is defined in a ground-water report as consisting principally of the rocks of the Late Pennsylvanian-age Vamoosa Formation and overlying Ada Group. The Vamoosa-Ada aquifer consists of a complex sequence of fine- to very fine-grained sandstone, siltstone, shale, and conglomerate interbedded with very thin limestones. The water-yielding capabilities of the aquifer are generally controlled by lateral and vertical distribution of the sandstone beds and their physical characteristics. The Vamoosa-Ada aquifer is unconfined where it outcrops in about a 1,700-square-mile area. Most of the lines in the aquifer boundary, hydraulic conductivity, and recharge data sets were extracted from published digital surficial geology data sets based on a scale of 1:250,000, and represent geologic contacts. Some of the lines in the data sets were interpolated in areas where the Vamoosa-Ada aquifer is overlain by alluvial and terrace deposits near streams and rivers. These data sets include only the outcrop area of the Vamoosa-Ada aquifer and where the aquifer is overlain by alluvial and terrace deposits. The hydraulic conductivity value and recharge rate are from a ground-water report about the Vamoosa-Ada aquifer. The water-level elevation contours were digitized from a mylar map, at a scale of 1:250,000, used to publish a plate in a ground-water report about the Vamoosa-Ada

  11. GRODY - GAMMA RAY OBSERVATORY DYNAMICS SIMULATOR IN ADA

    NASA Technical Reports Server (NTRS)

    Stark, M.

    1994-01-01

    The analyst can send results output in graphical or tabular form to a terminal, disk, or hardcopy device, and can choose to have any or all items plotted against time or against each other. Goddard researchers developed GRODY on a VAX 8600 running VMS version 4.0. For near real time performance, GRODY requires a VAX at least as powerful as a model 8600 running VMS 4.0 or a later version. To use GRODY, the VAX needs an Ada Compilation System (ACS), Code Management System (CMS), and 1200K memory. GRODY is written in Ada and FORTRAN.

  12. An open-source, massively parallel code for non-LTE synthesis and inversion of spectral lines and Zeeman-induced Stokes profiles

    NASA Astrophysics Data System (ADS)

    Socas-Navarro, H.; de la Cruz Rodríguez, J.; Asensio Ramos, A.; Trujillo Bueno, J.; Ruiz Cobo, B.

    2015-05-01

    With the advent of a new generation of solar telescopes and instrumentation, interpreting chromospheric observations (in particular, spectropolarimetry) requires new, suitable diagnostic tools. This paper describes a new code, NICOLE, that has been designed for Stokes non-LTE radiative transfer, for synthesis and inversion of spectral lines and Zeeman-induced polarization profiles, spanning a wide range of atmospheric heights from the photosphere to the chromosphere. The code offers a number of unique capabilities and has been built from scratch with a powerful parallelization scheme that makes it suitable for application on massive datasets using large supercomputers. The source code is written entirely in Fortran 90/2003 and complies strictly with the ANSI standards to ensure maximum compatibility and portability. It is being publicly released, with the idea of facilitating future branching by other groups to augment its capabilities. The source code is currently hosted at the following repository: https://github.com/hsocasnavarro/NICOLE

  13. QUEST/Ada: Query utility environment for software testing of Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1989-01-01

    Results of research and development efforts are presented for Task 1, Phase 2 of a general project entitled, The Development of a Program Analysis Environment for Ada. A prototype of the QUEST/Ada system was developed to collect data to determine the effectiveness of the rule-based testing paradigm. The prototype consists of five parts: the test data generator, the parser/scanner, the test coverage analyzer, a symbolic evaluator, and a data management facility, known as the Librarian. These components are discussed at length. Also presented is an experimental design for the evaluations, an overview of the project, and a schedule for its completion.

  14. HELIOS: An Open-source, GPU-accelerated Radiative Transfer Code for Self-consistent Exoplanetary Atmospheres

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Mendonça, João M.; Grimm, Simon L.; Lavie, Baptiste; Kitzmann, Daniel; Tsai, Shang-Min; Burrows, Adam; Kreidberg, Laura; Bedell, Megan; Bean, Jacob L.; Stevenson, Kevin B.; Heng, Kevin

    2017-02-01

    We present the open-source radiative transfer code named HELIOS, which is constructed for studying exoplanetary atmospheres. In its initial version, the model atmospheres of HELIOS are one-dimensional and plane-parallel, and the equation of radiative transfer is solved in the two-stream approximation with nonisotropic scattering. A small set of the main infrared absorbers is employed, computed with the opacity calculator HELIOS-K and combined using a correlated-k approximation. The molecular abundances originate from validated analytical formulae for equilibrium chemistry. We compare HELIOS with the work of Miller-Ricci & Fortney using a model of GJ 1214b, and perform several tests, where we find: model atmospheres with single-temperature layers struggle to converge to radiative equilibrium; k-distribution tables constructed with ≳ 0.01 cm‑1 resolution in the opacity function (≲ {10}3 points per wavenumber bin) may result in errors ≳ 1%–10% in the synthetic spectra; and a diffusivity factor of 2 approximates well the exact radiative transfer solution in the limit of pure absorption. We construct “null-hypothesis” models (chemical equilibrium, radiative equilibrium, and solar elemental abundances) for six hot Jupiters. We find that the dayside emission spectra of HD 189733b and WASP-43b are consistent with the null hypothesis, while the latter consistently underpredicts the observed fluxes of WASP-8b, WASP-12b, WASP-14b, and WASP-33b. We demonstrate that our results are somewhat insensitive to the choice of stellar models (blackbody, Kurucz, or PHOENIX) and metallicity, but are strongly affected by higher carbon-to-oxygen ratios. The code is publicly available as part of the Exoclimes Simulation Platform (exoclime.net).
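
    The diffusivity-factor result quoted above can be checked in miniature. For pure absorption, the exact flux transmittance of an isotropically illuminated layer is 2E₃(τ); a diffusivity factor D collapses the angular integral into exp(−Dτ). A small numerical sketch (simple midpoint quadrature; not HELIOS code):

```python
import math

def exact_flux_transmittance(tau, n=20000):
    """Exact pure-absorption flux transmittance of isotropic incident
    radiation: 2*E3(tau) = 2 * integral_0^1 mu * exp(-tau/mu) dmu,
    evaluated with a midpoint rule over mu."""
    total = 0.0
    for i in range(1, n + 1):
        mu = (i - 0.5) / n
        total += mu * math.exp(-tau / mu)
    return 2.0 * total / n

def diffusivity_transmittance(tau, d=2.0):
    """Diffusivity-factor shortcut: one effective slant path, exp(-d*tau)."""
    return math.exp(-d * tau)

tau = 0.1
exact = exact_flux_transmittance(tau)
approx = diffusivity_transmittance(tau)
```

At τ = 0.1 the two agree to within about 2%, illustrating why a single effective path with D = 2 can stand in for the full angular integral in the pure-absorption limit.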

  15. Joint source/channel coding for prioritized wireless transmission of multiple 3-D regions of interest in 3-D medical imaging data.

    PubMed

    Sanchez, V

    2013-02-01

    This paper presents a 3-D medical image coding method featuring two major improvements to previous work on 3-D region of interest (RoI) coding for telemedicine applications. Namely, 1) a data prioritization scheme that allows coding of multiple 3-D-RoIs; and 2) a joint source/channel coding scheme that allows prioritized transmission of multiple 3-D-RoIs over wireless channels. The method, which is based on the 3-D integer wavelet transform and embedded block coding with optimized truncation with 3-D context modeling, generates scalable and error-resilient bit streams with 3-D-RoI decoding capabilities. Coding of multiple 3-D-RoIs is attained by prioritizing the wavelet-transformed data according to a Gaussian mixed distribution, whereas error resiliency is attained by employing the error correction capabilities of rate-compatible punctured turbo codes. The robustness of the proposed method is evaluated for transmission of real 3-D medical images over Rayleigh-fading channels with a priori knowledge of the channel condition. Evaluation results show that the proposed coding method provides a superior performance compared to equal error protection and unequal error protection techniques.
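
    The unequal-error-protection idea in this abstract, stronger channel codes for higher-priority data, can be sketched as a simple rate-assignment step. The layer names and code rates below are illustrative, not the paper's turbo-code puncturing schedule:

```python
def assign_code_rates(layers, rates=(1/3, 1/2, 2/3, 5/6)):
    """Unequal error protection sketch: sort layers by priority (highest
    first) and map the most important one to the strongest, i.e.
    lowest-rate, channel code. Transmitted bits = source bits / rate."""
    ordered = sorted(layers, key=lambda l: l["priority"], reverse=True)
    out = []
    for i, layer in enumerate(ordered):
        rate = rates[min(i, len(rates) - 1)]      # run out of rates -> weakest
        out.append({**layer, "code_rate": rate,
                    "tx_bits": round(layer["bits"] / rate)})
    return out

plan = assign_code_rates([
    {"name": "background", "priority": 0, "bits": 8000},
    {"name": "RoI-1", "priority": 2, "bits": 2000},
    {"name": "RoI-2", "priority": 1, "bits": 3000},
])
```

The highest-priority RoI pays the largest redundancy overhead (rate 1/3 triples its bits), while the background is sent with the weakest protection; this is the trade-off prioritized transmission exploits.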

  16. Association of G22A and A4223C ADA1 gene polymorphisms and ADA activity with PCOS.

    PubMed

    Salehabadi, Mahshid; Farimani, Marzieh; Tavilani, Heidar; Ghorbani, Marzieh; Poormonsefi, Faranak; Poorolajal, Jalal; Shafiei, Gholamreza; Ghasemkhani, Neda; Khodadadi, Iraj

    2016-06-01

    Adenosine deaminase-1 (ADA1) regulates the concentration of adenosine as the main modulator of oocyte maturation. There is compelling evidence for the association of ADA1 gene polymorphisms with many diseases but the importance of ADA1 polymorphisms in polycystic ovary syndrome (PCOS) has not been studied before. This study investigates serum total ADA activity (tADA), ADA1 and ADA2 isoenzyme activities, and genotype and allele frequencies of G22A and A4223C polymorphisms in healthy and PCOS women. In this case-control study, 200 PCOS patients and 200 healthy women were enrolled. Genomic DNA was extracted from whole blood and the PCR-RFLP technique was used to determine the G22A and A4223C variants. The genotype frequencies were calculated and the associations between polymorphic genotypes and enzyme activities were determined. tADA activity was significantly lower in the PCOS group compared with the control group (27.76±6.0 vs. 39.63±7.48, respectively). PCOS patients also showed reduced activity of ADA1 and ADA2. PCOS was not associated with the G22A polymorphism, whereas the AA, AC, and CC genotypes of the A4223C polymorphism were distributed differently between the control and the PCOS women, where the C allele showed a strong protective role for PCOS (odds ratio=1.876, p=0.033). The present study for the first time showed that lower ADA activity may be involved in the pathogenesis of PCOS by maintaining a higher concentration of adenosine, which affects follicular growth. As a novel finding, we also showed great differences in genotype distribution and allele frequencies of the A4223C polymorphism between groups, indicating a protective role for the C allele against PCOS. Abbreviations: ADA: adenosine deaminase; PCOS: polycystic ovary syndrome; PCR-RFLP: polymerase chain reaction-restriction fragment length polymorphism; tADA: total adenosine deaminase.
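
    For readers unfamiliar with the statistic reported here, an allele odds ratio is computed from a 2×2 table of allele counts in cases and controls. A brief sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio(case_exposed, case_unexposed, control_exposed, control_unexposed):
    """Odds ratio for a 2x2 table: (a/b) / (c/d) = a*d / (b*c)."""
    return (case_exposed * control_unexposed) / (case_unexposed * control_exposed)

# Hypothetical allele counts (NOT the study's data): C vs. A allele among
# 400 case alleles and 400 control alleles.
or_ca = odds_ratio(120, 280, 90, 310)

# Approximate 95% CI on the log scale: log(OR) +/- 1.96 * SE,
# where SE = sqrt(1/a + 1/b + 1/c + 1/d).
se = math.sqrt(1/120 + 1/280 + 1/90 + 1/310)
lo = math.exp(math.log(or_ca) - 1.96 * se)
hi = math.exp(math.log(or_ca) + 1.96 * se)
```

An odds ratio above 1 indicates the allele is more common among cases; a confidence interval excluding 1 is the usual significance criterion accompanying a reported p-value.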

  17. Real-Time Ada Demonstration Project

    DTIC Science & Technology

    1989-05-31

    Center for Software Engineering, Advanced Software Technology. Subject: Final Report - Real-Time Ada Demonstration Project, 31 May 1989. Prepared for: U.S. Army HQ CECOM, Center for Software Engineering, Advanced Software Technology, Fort Monmouth, NJ 07703-5000. Prepared by: LabTek Corporation

  18. Restoring balance to B cells in ADA deficiency.

    PubMed

    Luning Prak, Eline T

    2012-06-01

    It is paradoxical that immunodeficiency disorders are associated with autoimmunity. Adenosine deaminase (ADA) deficiency, a cause of severe combined immunodeficiency (SCID), is a case in point. In this issue of the JCI, Sauer and colleagues investigate the B cell defects in ADA-deficient patients. They demonstrate that ADA patients receiving enzyme replacement therapy had B cell tolerance checkpoint defects. Remarkably, gene therapy with a retrovirus that expresses ADA resulted in the apparent correction of these defects, with normalization of peripheral B cell autoantibody frequencies. In vitro, agents that either block ADA or overexpress adenosine resulted in altered B cell receptor and TLR signaling. Collectively, these data implicate a B cell-intrinsic mechanism for alterations in B cell tolerance in the setting of partial ADA deficiency that is corrected by gene therapy.

  19. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

    The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed. The application of the goals and principles of software engineering is applied. An introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crises; the mandate of the Space Station Program; software life cycle model; software engineering; and Ada under the software engineering umbrella.

  20. Towards a formal semantics for Ada 9X

    NASA Technical Reports Server (NTRS)

    Guaspari, David; Mchugh, John; Wolfgang, Polak; Saaltink, Mark

    1995-01-01

    The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.

  1. Ada (Tradename) Compiler Validation Summary Report. Symbolics, Inc., Symbolics Ada, Version 1.0. Symbolics 3600.

    DTIC Science & Technology

    1986-06-11

    foo.a.b.c ........ An illegal external file name that either contains invalid characters or is too long. $FILE NAME WITH WILD CARD CHAR "eno: >testing...ada>-tests>c*.." An external file name that either contains a wild card character or is too long. $GREATER THAN DURATION 86401.0 A universal real value

  2. Assessment of the Ada (Trade Name) Validation Process.

    DTIC Science & Technology

    1985-12-01

    Sincerely, Warren Berger, Intermetrics, Inc., Ada Systems Division, 733 Concord Ave, Cambridge, MA 02146, (617) 661-1840. Arpanet: ima!inmet!ada-uts!wkb@CCA-UNIX.ARPA -- CA5004B-B.ADA -- CHECK THAT PRAGMA ELABORATE IS ACCEPTED AND OBEYED EVEN IF

  3. ADA (adenosine deaminase) gene therapy enters the competition

    SciTech Connect

    Culliton, B.J.

    1990-08-31

    Around the world, some 70 children are members of a select and deadly club. Born with an immune deficiency so severe that they will die of infection unless their immune systems can be repaired, they have captured the attention of would-be gene therapists who believe that a handful of these kids--the 15 or 20 who lack functioning levels of the enzyme adenosine deaminase (ADA)--could be saved by a healthy ADA gene. A team of gene therapists is ready to put the theory to the test. In April 1987, a team of NIH researchers headed by R. Michael Blaese and W. French Anderson came up with the first formal protocol to introduce a healthy ADA gene into an unhealthy human. After 3 years of line-by-line scrutiny by five review committees, they have permission to go ahead. Two or three children will be treated in the next year, and will be infused with T lymphocytes carrying the gene for ADA. If the experiment works, the ADA gene will begin producing normal amounts of ADA. An interesting feature of ADA deficiency that makes it ideal for initial gene studies is that the amount of ADA one needs for a healthy immune system is quite variable. Hence, once inside a patient's T cells, the new ADA gene needs only to express the enzyme in moderate amounts. No precise gene regulation is necessary.

  4. Implementation of a production Ada project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Godfrey, Sara; Brophy, Carolyn Elizabeth

    1989-01-01

    The use of the Ada language and design methodologies that encourage full use of its capabilities have a strong impact on all phases of the software development project life cycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The differences observed during the implementation, unit testing, and integration phases of the two projects are described and the lessons learned during the implementation phase of the Ada development are outlined. Included are recommendations for future Ada development projects.

  5. Transferring data objects: A focused Ada investigation

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1988-01-01

    The use of the Ada language does not guarantee that data objects will be in the same form or have the same value after they have been stored or transferred to another system. There are too many possible variables in such things as the formats used and other protocol conditions. Differences may occur at many different levels of support. These include program level, object level, application level, and system level. A standard language is only one aspect of making a complex system completely homogeneous. Many components must be standardized and the various standards must be integrated. The principal issues in providing for interaction between systems are the exchange of files and data objects between systems which may not be compatible in terms of their host computer, operating system, or other factors. A typical resolution of the problem of invalidating data involves at least a common external form, for data objects and for representing the relationships and attributes of data collections. Some of the issues dealing with the transfer of data are listed and consideration is given to how these issues may be handled in the Ada language.

  6. Modeling of a three-source perfusion and blood oxygenation sensor for transplant monitoring using multilayer Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Ibey, Bennett L.; Lee, Seungjoon; Ericson, M. Nance; Wilson, Mark A.; Cote, Gerard L.

    2004-06-01

    A Multi-Layer Monte Carlo (MLMC) model was developed to predict the results of in vivo blood perfusion and oxygenation measurement of transplanted organs as measured by an indwelling optical sensor. A sensor has been developed which uses three-source excitation in the red and infrared ranges (660, 810, 940 nm). In vitro data was taken using this sensor by changing the oxygenation state of whole blood and passing it through a single-tube pump system wrapped in bovine liver tissue. The collected data showed that the red signal increased as blood oxygenation increased and the infrared signal decreased. The center wavelength of 810 nanometers was shown to be largely insensitive to changes in blood oxygenation. A model was developed using MLMC code that sampled the wavelength range from 600-1000 nanometers every 6 nanometers. Using scattering and absorption data for blood and liver tissue within this wavelength range, a five-layer model was developed (tissue, clear tubing, blood, clear tubing, tissue). The theoretical data generated from this model was compared to the in vitro data and showed good correlation with changing blood oxygenation.
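
    The layered Monte Carlo approach described above rests on a simple random-walk kernel: sample an exponential free path, decide absorption versus scattering, repeat. A deliberately minimal 1-D sketch (a single homogeneous slab, not the five-layer MLMC model; all coefficients illustrative):

```python
import math
import random

def simulate_photons(n, mu_a, mu_s, slab_thickness):
    """Minimal 1-D Monte Carlo photon transport through a slab.
    Free paths are exponentially distributed with total attenuation
    mu_t = mu_a + mu_s; each interaction absorbs with probability
    mu_a/mu_t, otherwise the photon is redirected (crude 1-D isotropic
    scatter). Returns the transmitted fraction."""
    mu_t = mu_a + mu_s
    transmitted = 0
    for _ in range(n):
        z, uz = 0.0, 1.0                           # start at surface, heading in
        while True:
            step = -math.log(1.0 - random.random()) / mu_t
            z += uz * step
            if z >= slab_thickness:
                transmitted += 1                   # exits far side
                break
            if z < 0.0:
                break                              # back-scattered out
            if random.random() < mu_a / mu_t:
                break                              # absorbed
            uz = random.uniform(-1.0, 1.0)         # crude isotropic redirect
    return transmitted / n

random.seed(1)
t = simulate_photons(20000, mu_a=0.1, mu_s=1.0, slab_thickness=1.0)
```

A useful sanity check is the scatter-free limit: with mu_s = 0 the transmitted fraction must approach Beer-Lambert attenuation exp(-mu_a * L).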

  7. Ada (Trade Name) Complier Validation Summary Report: Verdix Ada Development System, Version 5.2 for the Sequent Balance under DYNIX, Release 1.3.2.

    DTIC Science & Technology

    1985-11-15

    Excerpt (OCR-garbled): a listing of ACVC validation test files with results, e.g., B91001B-AB.ADA P, B96002A-B.ADA P, C92003A.ADA P, where P marks a passed test.

  8. Feasibility of Leveraging Crowd Sourcing for the Creation of a Large Scale Annotated Resource for Hindi English Code Switched Data: A Pilot Annotation

    DTIC Science & Technology

    2011-11-01

    Crowd Sourcing for the Creation of a Large Scale Annotated Resource for Hindi English Code Switched Data: A Pilot Annotation Mona Diab Center for...We target Hindi English (Hinglish) LCS. We investigate the feasibility of leveraging crowd sourcing as a means for annotating the data on the...countries like India where Hindi is a common first language (L1) and English acts as a second language (L2) among native Hindi speakers. For

  9. Implementation of a double Gaussian source model for the BEAMnrc Monte Carlo code and its influence on small fields dose distributions.

    PubMed

    Doerner, Edgardo; Caprile, Paola

    2016-09-01

    The shape of the radiation source of a linac has a direct impact on the delivered dose distributions, especially in the case of small radiation fields. Traditionally, a single Gaussian source model is used to describe the electron beam hitting the target, although different studies have shown that the shape of the electron source can be better described by a mixed distribution consisting of two Gaussian components. Therefore, this study presents the implementation of a double Gaussian source model into the BEAMnrc Monte Carlo code. The impact of the double Gaussian source model for a 6 MV beam is assessed through the comparison of different dosimetric parameters calculated using a single Gaussian source, previously commissioned, the new double Gaussian source model and measurements, performed with a diode detector in a water phantom. It was found that the new source can be easily implemented into the BEAMnrc code and that it improves the agreement between measurements and simulations for small radiation fields. The impact of the change in source shape becomes less important as the field size increases and for increasing distance of the collimators to the source, as expected. In particular, for radiation fields delivered using stereotactic collimators located at a distance of 59 cm from the source, it was found that the effect of the double Gaussian source on the calculated dose distributions is negligible, even for radiation fields smaller than 5 mm in diameter. Accurate determination of the shape of the radiation source allows us to improve the Monte Carlo modeling of the linac, especially for treatment modalities such as IMRT, where the radiation beams used can be very narrow, becoming more sensitive to the shape of the source. PACS number(s): 87.53.Bn, 87.55.K, 87.56.B-, 87.56.jf.
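
    The double Gaussian source amounts to sampling electron positions from a two-component Gaussian mixture: a narrow core plus a broad halo. A hedged sketch with illustrative weights and widths (not the commissioned 6 MV parameters):

```python
import random

def sample_double_gaussian(n, w1=0.8, s1=0.6, s2=1.8, seed=42):
    """Draw transverse (x, y) positions from a two-component Gaussian
    mixture: fraction w1 from a narrow core (sigma s1, in mm), the rest
    from a broad halo (sigma s2). All parameter values are illustrative."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        s = s1 if rng.random() < w1 else s2     # pick the component first
        pts.append((rng.gauss(0.0, s), rng.gauss(0.0, s)))
    return pts

pts = sample_double_gaussian(10000)
# Per-axis variance of the mixture should approach w1*s1^2 + (1-w1)*s2^2.
```

The halo component, although a small fraction of the particles, dominates the tails of the distribution, which is why it matters most for the smallest fields.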

  10. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also promote the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  11. The circulating transcriptome as a source of non-invasive cancer biomarkers: concepts and controversies of non-coding and coding RNA in body fluids

    PubMed Central

    Fernandez-Mercado, Marta; Manterola, Lorea; Larrea, Erika; Goicoechea, Ibai; Arestin, María; Armesto, María; Otaegui, David; Lawrie, Charles H

    2015-01-01

    The gold standard for cancer diagnosis remains the histological examination of affected tissue, obtained either by surgical excision, or radiologically guided biopsy. Such procedures, however, are expensive, not without risk to the patient, and require consistent evaluation by expert pathologists. Consequently, the search for non-invasive tools for the diagnosis and management of cancer has led to great interest in the field of circulating nucleic acids in plasma and serum. An additional benefit of blood-based testing is the ability to carry out screening and repeat sampling on patients undergoing therapy, or monitoring disease progression, allowing for the development of a personalized approach to cancer patient management. Although circulating nucleic acids were discovered over 60 years ago, their clear clinical potential, with the notable exception of prenatal diagnostic testing, has yet to translate into the clinic. The recent discovery of non-coding (nc) RNA (in particular micro(mi)RNAs) in the blood has provided fresh impetus for the field. In this review, we discuss the potential of the circulating transcriptome (coding and ncRNA) as novel cancer biomarkers, the controversy surrounding their origin and biology, and most importantly the hurdles that remain to be overcome if they are really to become part of future clinical practice. PMID:26119132

  12. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false ADA paratransit eligibility: Process. 37.125... paratransit eligibility: Process. Each public entity required to provide complementary paratransit service by § 37.121 of this part shall establish a process for determining ADA paratransit eligibility. (a)...

  13. Artificial Intelligence in ADA: Pattern-Directed Processing. Final Report.

    ERIC Educational Resources Information Center

    Reeker, Larry H.; And Others

    To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is…

  14. Alma Flor Ada and the Quest for Change

    ERIC Educational Resources Information Center

    Manna, Anthony L.; Hill, Janet; Kellogg, Kathy

    2004-01-01

    Alma Flor Ada, a folklorist, novelist, scholar, teacher, and children's book author, has a passionate dedication to education for social justice, equality, and peace. As a faculty member at the University of San Francisco, Ada has developed programs that help students and others transform their lives and has written several bilingual legends and…

  15. A Practical Guide to the ADA and Visual Impairment.

    ERIC Educational Resources Information Center

    Joffee, Elga

    Designed to be used as a companion to the Americans with Disabilities Act (ADA), this guide provides information on how the law applies to individuals with visual impairments. Section 1, "The ADA and Visual Impairment," gives an overview of the Americans with Disabilities Act and discusses visual impairment and accessibility. Examples of…

  16. Fine-Tuning ADAS Algorithm Parameters for Optimizing Traffic ...

    EPA Pesticide Factsheets

    With the development of the Connected Vehicle technology that facilitates wireless communication among vehicles and road-side infrastructure, the Advanced Driver Assistance Systems (ADAS) can be adopted as an effective tool for accelerating traffic safety and mobility optimization at various highway facilities. To this end, the traffic management centers identify the optimal ADAS algorithm parameter set that enables the maximum improvement of the traffic safety and mobility performance, and broadcast the optimal parameter set wirelessly to individual ADAS-equipped vehicles. After adopting the optimal parameter set, the ADAS-equipped drivers become active agents in the traffic stream that work collectively and consistently to prevent traffic conflicts, lower the intensity of traffic disturbances, and suppress the development of traffic oscillations into heavy traffic jams. Successful implementation of this objective requires the analysis capability of capturing the impact of the ADAS on driving behaviors, and measuring traffic safety and mobility performance under the influence of the ADAS. To address this challenge, this research proposes a synthetic methodology that incorporates the ADAS-affected driving behavior modeling and state-of-the-art microscopic traffic flow modeling into a virtually simulated environment. Building on such an environment, the optimal ADAS algorithm parameter set is identified through an optimization programming framework to enable th

  17. 59 FR- Realty Action; Ada and Owyhee, ID

    Federal Register 2010, 2011, 2012, 2013, 2014

    1994-05-03

    ...-332A-02; IDI-29516] Realty Action; Ada and Owyhee, ID AGENCY: Bureau of Land Management, Interior. ACTION: Notice of Realty Action--IDI-29516; Exchange of public and private lands in Ada and Owyhee.... Containing 640 acres, more or less, in Owyhee County. The purpose of this exchange is to dispose of...

  18. The ADA and IDEA Basics: Inclusion of Children with Disabilities

    ERIC Educational Resources Information Center

    Motwani, Mona

    2007-01-01

This article discusses the Americans with Disabilities Act (ADA) and the Individuals with Disabilities Education Act (IDEA). The ADA is a federal civil rights law that was passed in 1990 with the aim of securing equal rights for persons with disabilities in the employment, housing, government, transportation, and public accommodation contexts. It…

  19. Validation of the MCNP-DSP Monte Carlo code for calculating source-driven noise parameters of subcritical systems

    SciTech Connect

    Valentine, T.E.; Mihalczo, J.T.

    1995-12-31

This paper describes calculations performed to validate MCNP-DSP, a modified version of the MCNP code, with respect to: the neutron and photon spectra of the spontaneous fission of californium-252; the representation of the detection processes for scattering detectors; the timing of the detection process; and the calculation of the frequency analysis parameters for the MCNP-DSP code.

  20. A design for a reusable Ada library

    NASA Technical Reports Server (NTRS)

    Litke, John D.

    1986-01-01

A goal of the Ada language standardization effort is to promote reuse of software, implying the existence of substantial software libraries and the storage/retrieval mechanisms to support them. A searching/cataloging mechanism is proposed that permits full or partial distribution of the database, adapts to a variety of searching mechanisms, permits a changing taxonomy with minimal disruption, and minimizes the requirement for specialized cataloger/indexer skills. The important observation is that keywords serve not only as an indexing mechanism, but also as an identification mechanism, especially via concatenation, and as support for a searching mechanism. By deliberately separating these multiple uses, the modifiability and ease of growth that current libraries require are achieved.
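The keyword scheme this abstract describes, in which keywords double as a unit identifier via concatenation and also drive retrieval, can be sketched in Python (class and method names below are invented for illustration; the original design targets Ada libraries):

```python
class KeywordCatalog:
    """Toy sketch of a keyword-based library catalog in which the
    concatenated, sorted keyword list also serves as the unit identifier."""

    def __init__(self):
        self._index = {}  # keyword -> set of unit identifiers

    @staticmethod
    def identifier(keywords):
        # Concatenating the sorted keywords yields a stable identifier,
        # so identification and indexing share one vocabulary.
        return ".".join(sorted(k.lower() for k in keywords))

    def add(self, keywords):
        uid = self.identifier(keywords)
        for k in keywords:
            self._index.setdefault(k.lower(), set()).add(uid)
        return uid

    def search(self, *keywords):
        # Retrieval intersects the posting sets; a taxonomy change only
        # touches the affected keyword entries, not the identifiers.
        sets = [self._index.get(k.lower(), set()) for k in keywords]
        return set.intersection(*sets) if sets else set()

catalog = KeywordCatalog()
catalog.add(["queue", "bounded", "generic"])
print(catalog.search("queue", "generic"))  # {'bounded.generic.queue'}
```

Because the identifier is derived from the keywords rather than assigned by a cataloger, no specialized indexing skill is needed to register a unit.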

  1. Service dogs, psychiatric hospitalization, and the ADA.

    PubMed

    Muramatsu, Russ S; Thomas, Kelly Jones; Leong, Stephanie L; Ragukonis, Frank

    2015-01-01

    A service dog is defined as "any dog that is individually trained to do work or perform tasks for the benefit of an individual with a disability, including a physical, sensory, psychiatric, intellectual, or other mental disability." Some psychiatric patients may depend on a service dog for day-to-day functioning. The Americans with Disabilities Act (ADA) established certain rights and responsibilities for individuals with disabilities and health care providers. Psychiatric hospitalization of a patient with a service dog may pose a problem and involves balancing the requirement to provide safe and appropriate psychiatric care with the rights of individuals with disabilities. This Open Forum examines issues that arise in such circumstances, reviews the literature, and provides a foundation for the development of policies and procedures.

  2. Sources of Signal in 62 Protein-Coding Nuclear Genes for Higher-Level Phylogenetics of Arthropods

    PubMed Central

    Regier, Jerome C.; Zwick, Andreas

    2011-01-01

    Background This study aims to investigate the strength of various sources of phylogenetic information that led to recent seemingly robust conclusions about higher-level arthropod phylogeny and to assess the role of excluding or downweighting synonymous change for arriving at those conclusions. Methodology/Principal Findings The current study analyzes DNA sequences from 68 gene segments of 62 distinct protein-coding nuclear genes for 80 species. Gene segments analyzed individually support numerous nodes recovered in combined-gene analyses, but few of the higher-level nodes of greatest current interest. However, neither is there support for conflicting alternatives to these higher-level nodes. Gene segments with higher rates of nonsynonymous change tend to be more informative overall, but those with lower rates tend to provide stronger support for deeper nodes. Higher-level nodes with bootstrap values in the 80% – 99% range for the complete data matrix are markedly more sensitive to substantial drops in their bootstrap percentages after character subsampling than those with 100% bootstrap, suggesting that these nodes are likely not to have been strongly supported with many fewer data than in the full matrix. Data set partitioning of total data by (mostly) synonymous and (mostly) nonsynonymous change improves overall node support, but the result remains much inferior to analysis of (unpartitioned) nonsynonymous change alone. Clusters of genes with similar nonsynonymous rate properties (e.g., faster vs. slower) show some distinct patterns of node support but few conflicts. Synonymous change is shown to contribute little, if any, phylogenetic signal to the support of higher-level nodes, but it does contribute nonphylogenetic signal, probably through its underlying heterogeneous nucleotide composition. Analysis of seemingly conservative indels does not prove useful. Conclusions Generating a robust molecular higher-level phylogeny of Arthropoda is currently possible

  3. Advances in gene therapy for ADA-deficient SCID.

    PubMed

    Aiuti, Alessandro

    2002-10-01

Adenosine deaminase (ADA)-deficient severe combined immunodeficiency (SCID) was the first inherited disease treated with gene therapy. The pilot gene therapy studies demonstrated the safety, therapeutic potential, and limitations of ADA gene transfer into hematopoietic cells using retroviral vectors. This review describes the latest progress in ADA-SCID clinical trials using peripheral blood lymphocytes (PBLs) and hematopoietic stem cells (HSCs). PBL gene therapy was able to restore T-cell functions after discontinuation of ADA enzyme replacement therapy, but only partially corrected the purine metabolic defect. The development of improved HSC gene transfer protocols, combined with low-intensity conditioning, allowed full correction of the immunological and metabolic ADA defects, with clinical benefit. These results have important implications for future applications of gene therapy in other disorders involving the hematopoietic system.

  4. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    SciTech Connect

    Yidong Xia; Mitch Plummer; Robert Podgorney; Ahmad Ghassemi

    2016-02-01

Performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desirable output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis on some design constraints and operation parameters indicates that 1) the fracture horizontal spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle for the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are under reasonable conditions. To conduct the reservoir modeling and simulations, an open-source, finite element based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercially available, this new open-source code has demonstrated a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  5. Ada (Tradename) Compiler Validation Summary Report. Digital Equipment Corporation VAX Ada V1.3.

    DTIC Science & Technology

    1986-11-07

Summary Report: 7 Nov 1986 to 7 Nov 1987. Digital Equipment Corp. VAX Ada V1.3. ...the identifiers TIME or SPACE as the single argument. This pragma is only allowed within a declarative part and it applies to the block or body

  6. ADA 9X project report: ADA 9X requirements document. Draft report

    SciTech Connect

    Not Available

    1990-08-27

    This document contains a distillation of requests for language changes submitted by the general public and from special workshops held to identify potential areas for revision. The purpose of this document is to specify needs that are considered to be the appropriate focus of the Ada 9X revision effort and to identify revision requirements that are to be satisfied by the Mapping/Revision Team.

  7. Translation and execution of distributed Ada programs - Is it still Ada?

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Mudge, Trevor N.; Buzzard, Gregory D.; Krishnan, Padmanabhan

    1987-01-01

    Some of the fundamental issues and tradeoffs for distributed execution systems for the Ada language are examined. Steps that need to be taken to deal with heterogeneity of addressing program objects, of processing resources, and of the individual processor environment are considered. The ways in which program elements can be assigned are examined in the context of four issues: implied remote object access, object visibility and recursive execution, task termination problems, and distributed types.

  8. Ada (Trade Name) Compiler Validation Summary Report: Verdix Corporation Verdix Ada Development System, Version 5.2 for the Tektronix 6130 under UTek, Release 2.1.1.

    DTIC Science & Technology

    1985-11-16

    AB.ADA P C64103A-B.ADA P C67003B-B.ADA P D- 12 %7I COMPLETE LIST OF TESTS AND RESULTS C67003C- ABADA P D64OO5FOM C D64005GD C V C67003D-BADA P D64005FA C...P C100 - ABADA P CB003-AB.DA BB2002A-AB.ADA P CB1003A-AB.ADA P CB1IOO3A-B.ADA P BB2003A-AB.ADA P CB2004A-B.ADA P CB4005A-B.ADA P BB2003A-AB.ADA P

  9. First International Conference on Ada (R) Programming Language Applications for the NASA Space Station, volume 1

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L. (Editor)

    1986-01-01

    Topics discussed include: test and verification; environment issues; distributed Ada issues; life cycle issues; Ada in Europe; management/training issues; common Ada interface set; and run time issues.

  10. Ada Compiler Validation Summary Report: Certificate Number: 890919W1. 10156 R. R. Software, Inc., Janus/ADA, Version 2.1.3 Zenith Z-386/25.

    DTIC Science & Technology

    1989-09-19

Compiler Validation Summary Report, Ada Compiler Validation Capability, ACVC, Validation Testing, Ada Validation Office, AVO, Ada Validation Facility...results of the validation testing performed on an Ada compiler. Testing was carried out for the following purposes: • To attempt to identify any language...the Ada Validation Organization (AVO). On-site testing was completed 19 September 1989 at Madison, WI. 1.2 USE OF THIS VALIDATION SUMMARY REPORT

  11. ART/Ada design project, phase 1. Task 1 report: Overall design

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The design methodology for the ART/Ada project is introduced, and the selected design for ART/Ada is described in detail. The following topics are included: object-oriented design, reusable software, documentation techniques, impact of Ada, design approach, and differences between ART-IM 1.5 and ART/Ada 1.0 prototype. Also, Ada generator and ART/Ada runtime systems are discussed.

  12. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  13. Coded aperture imaging of fusion source in a plasma focus operated with pure D₂ and a D₂-Kr gas admixture

    SciTech Connect

    Springham, S. V.; Talebitaher, A.; Shutler, P. M. E.; Rawat, R. S.; Lee, P.; Lee, S.

    2012-09-10

The coded aperture imaging (CAI) technique has been used to investigate the spatial distribution of DD fusion in a 1.6 kJ plasma focus (PF) device operated in, alternatively, pure deuterium or a deuterium-krypton admixture. The coded mask pattern is based on a Singer cyclic difference set with 25% open fraction and positioned close to 90° to the plasma focus axis, with CR-39 detectors used to register tracks of protons from the D(d, p)T reaction. Comparing the CAI proton images for pure D₂ and D₂-Kr admixture operation reveals clear differences in size, density, and shape between the fusion sources for these two cases.
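The mask property this abstract relies on is that of a cyclic difference set: every nonzero residue occurs as a difference of hole positions equally often, which gives the aperture flat cyclic autocorrelation sidelobes. A minimal check of that property, using the small (7, 3, 1) planar set for illustration rather than the 25%-open Singer set used in the experiment, might look like:

```python
from collections import Counter

def is_cyclic_difference_set(D, v):
    """True if D is a (v, k, lambda) cyclic difference set: every nonzero
    residue mod v occurs as a difference d1 - d2 the same number of times."""
    diffs = Counter((a - b) % v for a in D for b in D if a != b)
    return len(diffs) == v - 1 and len(set(diffs.values())) == 1

# {0, 1, 3} mod 7 is the classic (7, 3, 1) planar difference set; a mask
# with holes at these positions has a uniform cyclic autocorrelation.
print(is_cyclic_difference_set({0, 1, 3}, 7))   # True
print(is_cyclic_difference_set({0, 1, 2}, 7))   # False
```

Larger sets of the same kind, such as the (13, 4, 1) set {0, 1, 3, 9}, pass the same test; the experiment's Singer set is simply a much larger member of this family.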

  14. Ada Compiler Validation Summary Report: Certificate Number 880318W1. 09042, International Business Machines Corporation, IBM Development System for the Ada Language, Version 2.1.0, IBM 4381 under MVS/XA, Host and Target

    DTIC Science & Technology

    1988-03-28

    International Business Machines Corporation IBM Development System for the Ada Language, Version 2.1.0 IBM 4381 under MVS/XA, host and target Completion...Joint Program Office, AJPO 20. ABSTRACT (Continue on reverse side if necessary and identify by block number) International Business Machines Corporation...in the compiler listed in this declaration. I declare that International Business Machines Corporation is the owner of record of the object code of

  15. Ada and software management in NASA: Symposium/forum

    NASA Technical Reports Server (NTRS)

    1989-01-01

The promises of Ada to improve software productivity and quality, and the claims that a transition to Ada would require significant changes in NASA's training programs and ways of doing business, were investigated. The study assesses the agency's ongoing and planned Ada activities. A series of industry representatives (Computer Sciences Corporation, General Electric Aerospace, McDonnell Douglas Space Systems Company, TRW, Lockheed, and Boeing) reviewed the recommendations and assessed their impact from each company's perspective. The potential effects on NASA programs were then discussed.

  16. Applications of an architecture design and assessment system (ADAS)

    NASA Technical Reports Server (NTRS)

    Gray, F. Gail; Debrunner, Linda S.; White, Tennis S.

    1988-01-01

    A new Architecture Design and Assessment System (ADAS) tool package is introduced, and a range of possible applications is illustrated. ADAS was used to evaluate the performance of an advanced fault-tolerant computer architecture in a modern flight control application. Bottlenecks were identified and possible solutions suggested. The tool was also used to inject faults into the architecture and evaluate the synchronization algorithm, and improvements are suggested. Finally, ADAS was used as a front end research tool to aid in the design of reconfiguration algorithms in a distributed array architecture.

  17. Software Engineering Laboratory (SEL) Ada performance study report

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1991-01-01

    The goals of the Ada Performance Study are described. The methods used are explained. Guidelines for future Ada development efforts are given. The goals and scope of the study are detailed, and the background of Ada development in the Flight Dynamics Division (FDD) is presented. The organization and overall purpose of each test are discussed. The purpose, methods, and results of each test and analyses of these results are given. Guidelines for future development efforts based on the analysis of results from this study are provided. The approach used on the performance tests is discussed.

  18. Description of a MIL-STD-1553B Data Bus Ada Driver for the LeRC EPS Testbed

    NASA Technical Reports Server (NTRS)

    Mackin, Michael A.

    1995-01-01

This document describes the software designed to provide communication between control computers in the NASA Lewis Research Center Electrical Power System Testbed using MIL-STD-1553B. The software drivers are coded in the Ada programming language and were developed on an MSDOS-based computer workstation. The Electrical Power System (EPS) Testbed is a reduced-scale prototype space station electrical power system. The power system manages and distributes electrical power from the sources (batteries or photovoltaic arrays) to the end-user loads. The primary electrical system operates at 120 volts DC, and the secondary system operates at 28 volts DC. The devices which direct the flow of electrical power are controlled by a network of six control computers. Data and control messages are passed between the computers using the MIL-STD-1553B network. One of the computers, the Power Management Controller (PMC), controls the primary power distribution and another, the Load Management Controller (LMC), controls the secondary power distribution. Each of these computers communicates with two other computers which act as subsidiary controllers. These subsidiary controllers are, in turn, connected to the devices which directly control the flow of electrical power.

  19. DNA sequence-based "bar codes" for tracking the origins of expressed sequence tags from a maize cDNA library constructed using multiple mRNA sources.

    PubMed

    Qiu, Fang; Guo, Ling; Wen, Tsui-Jung; Liu, Feng; Ashlock, Daniel A; Schnable, Patrick S

    2003-10-01

    To enhance gene discovery, expressed sequence tag (EST) projects often make use of cDNA libraries produced using diverse mixtures of mRNAs. As such, expression data are lost because the origins of the resulting ESTs cannot be determined. Alternatively, multiple libraries can be prepared, each from a more restricted source of mRNAs. Although this approach allows the origins of ESTs to be determined, it requires the production of multiple libraries. A hybrid approach is reported here. A cDNA library was prepared using 21 different pools of maize (Zea mays) mRNAs. DNA sequence "bar codes" were added during first-strand cDNA synthesis to uniquely identify the mRNA source pool from which individual cDNAs were derived. Using a decoding algorithm that included error correction, it was possible to identify the source mRNA pool of more than 97% of the ESTs. The frequency at which a bar code is represented in an EST contig should be proportional to the abundance of the corresponding mRNA in the source pool. Consistent with this, all ESTs derived from several genes (zein and adh1) that are known to be exclusively expressed in kernels or preferentially expressed under anaerobic conditions, respectively, were exclusively tagged with bar codes associated with mRNA pools prepared from kernel and anaerobically treated seedlings, respectively. Hence, by allowing for the retention of expression data, the bar coding of cDNA libraries can enhance the value of EST projects.
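The decoding step this abstract describes, assigning each EST to its source mRNA pool even when the bar code carries a sequencing error, can be sketched as nearest-neighbor decoding under Hamming distance (the 4-base bar codes below are invented for illustration; the study's actual codes and decoding algorithm may differ):

```python
def hamming(a, b):
    """Number of positions at which two equal-length sequences differ."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical bar codes chosen with pairwise Hamming distance >= 3,
# so any single-base substitution error is unambiguously correctable.
BARCODES = {"AAAA": "kernel", "CCCC": "anaerobic seedling", "GGGG": "root"}

def decode(tag, max_errors=1):
    """Return the source pool for a (possibly corrupted) bar code,
    or None if no bar code lies within max_errors substitutions."""
    best = min(BARCODES, key=lambda bc: hamming(bc, tag))
    return BARCODES[best] if hamming(best, tag) <= max_errors else None

print(decode("AAAA"))  # exact match -> 'kernel'
print(decode("ACAA"))  # one error corrected -> 'kernel'
print(decode("ACGT"))  # too far from every bar code -> None
```

With such a scheme, counting decoded tags within an EST contig estimates the relative abundance of the transcript in each source pool, as the abstract notes for the zein and adh1 genes.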

  20. Ada (trademark) Compiler Validation Summary Report. Verdix Ada Compiler, VADS, Version V03.06 for VAX-11/785, Using ULTRIX 1.0.

    DTIC Science & Technology

    1985-06-14

    CCA207AOM C L3 04- ABADA /A A300B1 CA2007AOM C LA3004A-ABD N/A LA3007B2 C CA2007A1 C LA3004AO-AB N/A LA3007B2 C CA2007A2 C LA3004A2-AB N/A LA3007B4 C...B.ADA P CC3120B-B.ADA P CC3504A-B.ADA PI.BC3503B-B.ADA P CC32A-B.ADA P CC3504B-B.ADA P BC3503D-B.ADA P CC3203A- ABADA P CC3504C-B.ADA P BC3503D-B.ADA P

  1. Molecular basis for paradoxical carriers of adenosine deaminase (ADA) deficiency that show extremely low levels of ADA activity in peripheral blood cells without immunodeficiency.

    PubMed

    Ariga, T; Oda, N; Sanstisteban, I; Arredondo-Vega, F X; Shioda, M; Ueno, H; Terada, K; Kobayashi, K; Hershfield, M S; Sakiyama, Y

    2001-02-01

    Adenosine deaminase (ADA) deficiency causes an autosomal recessive form of severe combined immunodeficiency and also less severe phenotypes, depending to a large degree on genotype. In general, ADA activity in cells of carriers is approximately half-normal. Unexpectedly, healthy first-degree relatives of two unrelated ADA-deficient severe combined immunodeficient patients (mother and brother in family I; mother in family II) had only 1-2% of normal ADA activity in PBMC, lower than has previously been found in PBMC of healthy individuals with so-called "partial ADA deficiency." The level of deoxyadenosine nucleotides in erythrocytes of these paradoxical carriers was slightly elevated, but much lower than levels found in immunodeficient patients with ADA deficiency. ADA activity in EBV-lymphoblastoid cell lines (LCL) and T cell lines established from these carriers was 10-20% of normal. Each of these carriers possessed two mutated ADA alleles. Expression of cloned mutant ADA cDNAs in an ADA-deletion strain of Escherichia coli indicated that the novel mutations G239S and M310T were responsible for the residual ADA activity. ADA activity in EBV-LCL extracts of the paradoxical carriers was much more labile than ADA from normal EBV-LCL. Immunoblotting suggested that this lability was due to denaturation rather than to degradation of the mutant protein. These results further define the threshold level of ADA activity necessary for sustaining immune function.

  2. Ada (trade name) Compiler Validation Summary Report. Verdix Corporation Verdix Ada Development System, Version 5.2 for the VAX-11/750 under VMS V4.1. Completion of On-Site Validation: 17 Novemember 1985.

    DTIC Science & Technology

    1985-11-17

    AB.ADA P *C52104L- ABADA P C55B04A-AB.ADA P C58006B-AB.ADA P *C52104KABAD P C55B05A -ADA P C59001B-A.D P C521OLIP-AB.ADA P C55B06A-AB.ADA P C59002A-AB.ADA...P B97103D-AB.ADA P C9�B-B.ADA P B91OACA-B.ADA P B97103E- ABADA P C94003A-B.ADA P B910AEA-B.ADA P B9710’IA-AB.ADA P C94IOO4A-B.ADA P *B91OBCA-B.ADA

  3. Microevolution of the Chromosomal Region of Acute Disease Antigen A (adaA) in the Query (Q) Fever Agent Coxiella burnetii

    PubMed Central

    Frangoulidis, Dimitrios; Splettstoesser, Wolf D.; Landt, Olfert; Dehnhardt, Jasmin; Henning, Klaus; Hilbert, Angela; Bauer, Tilman; Antwerpen, Markus; Meyer, Hermann

    2013-01-01

The acute disease antigen A (adaA) gene is believed to be associated with Coxiella burnetii strains causing acute Q fever. The detailed analysis of the adaA genomic region of 23 human- and 86 animal-derived C. burnetii isolates presented in this study reveals a much more polymorphic appearance and distribution of the adaA gene, allowing a finer differentiation of C. burnetii strains than previously anticipated. Three different genomic variants of the adaA gene were identified which could be detected in isolates from acute and chronic patients, rendering the association of adaA-positive strains with acute Q fever disease disputable. In addition, all adaA-positive strains in humans and animals showed the occurrence of the QpH1 plasmid. All adaA-positive isolates of acute human patients except one showed a distinct SNP variation at position 431, also predominant in sheep strains, which correlates well with the observation that sheep are a major source of human infection. Furthermore, the phylogenetic analysis of the adaA gene revealed three deletion events and supported the hypothesis that strain Dugway 5J108-111 might be the ancestor of all known C. burnetii strains. Based on our findings, we could confirm the QpDV group and we were able to define a new genotypic cluster. The adaA gene polymorphisms shown here improve molecular typing of Q fever, and give new insights into microevolutionary adaptation processes in C. burnetii. PMID:23301072

  4. A report on NASA software engineering and Ada training requirements

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn B.; Svabek, L.

    1987-01-01

    NASA's software engineering and Ada skill base are assessed and information that may result in new models for software engineering, Ada training plans, and curricula are provided. A quantitative assessment which reflects the requirements for software engineering and Ada training across NASA is provided. A recommended implementation plan including a suggested curriculum with associated duration per course and suggested means of delivery is also provided. The distinction between education and training is made. Although it was directed to focus on NASA's need for the latter, the key relationships to software engineering education are also identified. A rationale and strategy for implementing a life cycle education and training program are detailed in support of improved software engineering practices and the transition to Ada.

  5. The development of a program analysis environment for Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.; Carlisle, Homer W.; Chang, Kai-Hsiung; Cross, James H.; Deason, William H.; Haga, Kevin D.; Huggins, John R.; Keleher, William R. A.; Starke, Benjamin B.; Weyrich, Orville R.

    1989-01-01

A unit level, Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner, the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high level interfaces, descriptions of the prototype scope, the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.

  6. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed autonomously or remotely via commands from the ground. In a manned Space Station some maintenance work can be performed by service personnel on board, but there are still many applications which require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects which have emerged from the analysis of Ada characteristics, together with experience of the requirements for embedded on-board computers in space applications, are examined.

  7. The Adam language: Ada extended with support for multiway activities

    NASA Technical Reports Server (NTRS)

    Charlesworth, Arthur

    1993-01-01

    The Adam language is an extension of Ada that supports multiway activities, which are cooperative activities involving two or more processes. This support is provided by three new constructs: diva procedures, meet statements, and multiway accept statements. Diva procedures are recursive generic procedures having a particular restrictive syntax that facilitates translation for parallel computers. Meet statements and multiway accept statements provide two ways to express a multiway rendezvous, which is an n-way rendezvous generalizing Ada's 2-way rendezvous. While meet statements tend to have simpler rules than multiway accept statements, the latter approach is a more straightforward extension of Ada. The only nonnull statements permitted within meet statements and multiway accept statements are calls on instantiated diva procedures. A call on an instantiated diva procedure is also permitted outside a multiway rendezvous; thus sequential Adam programs using diva procedures can be written. Adam programs are translated into Ada programs appropriate for use on parallel computers.
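The multiway rendezvous that Adam's meet statement expresses, n processes blocking until all have arrived and then performing a joint action, can be approximated with a barrier. Python stands in here for Adam/Ada, and all names are illustrative; this is a sketch of the synchronization pattern, not of Adam's semantics:

```python
import threading

N = 3                       # number of participating processes
results = []                # record that the joint action ran

def joint_action():
    # Executed exactly once per rendezvous, by the last arriving thread;
    # loosely analogous to the body of a meet statement.
    results.append("rendezvous complete")

barrier = threading.Barrier(N, action=joint_action)

def worker(i):
    # ...each process does its local work, then enters the n-way rendezvous...
    barrier.wait()

threads = [threading.Thread(target=worker, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # ['rendezvous complete']
```

Ada's own 2-way rendezvous pairs exactly one caller with one accept statement; the barrier above generalizes that pairing to n participants, which is the essence of what the meet and multiway accept constructs add.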

  8. In vivo transduction by intravenous injection of a lentiviral vector expressing human ADA into neonatal ADA gene knockout mice: a novel form of enzyme replacement therapy for ADA deficiency.

    PubMed

    Carbonaro, Denise A; Jin, Xiangyang; Petersen, Denise; Wang, Xingchao; Dorey, Fred; Kil, Ki Soo; Aldrich, Melissa; Blackburn, Michael R; Kellems, Rodney E; Kohn, Donald B

    2006-06-01

    Using a mouse model of adenosine deaminase-deficient severe combined immune deficiency syndrome (ADA-deficient SCID), we have developed a noninvasive method of gene transfer for the sustained systemic expression of human ADA as enzyme replacement therapy. The method of delivery is a human immunodeficiency virus 1-based lentiviral vector given systemically by intravenous injection on day 1 to 2 of life. In this article we characterize the biodistribution of the integrated vector, the expression levels of ADA enzyme activity in various tissues, as well as the efficacy of systemic ADA expression to correct the ADA-deficient phenotype in this mouse model. The long-term expression of enzymatically active ADA achieved by this method, primarily from transduction of liver and lung, restored immunologic function and significantly extended survival. These studies illustrate the potential for sustained in vivo production of enzymatically active ADA, as an alternative to therapy by frequent injection of exogenous ADA protein.

  9. Benchmarking Ada tasking on tightly coupled multiprocessor architectures

    NASA Technical Reports Server (NTRS)

    Collard, Philippe; Goforth, Andre; Marquardt, Matthew

    1989-01-01

    The development of benchmarks and performance measures for parallel Ada tasking is reported with emphasis on the macroscopic behavior of the benchmark across a set of load parameters. The application chosen for the study was the NASREM model for telerobot control, relevant to many NASA missions. The results of the study demonstrate the potential of parallel Ada in accomplishing the task of developing a control system for a system such as the Flight Telerobotic Servicer using the NASREM framework.

  10. Compiling knowledge-based systems specified in KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Feldman, Roy D.

    1991-01-01

    The first year of the PrKAda project is recounted. The primary goal was to develop a system for delivering Artificial Intelligence applications developed in the ProKappa system in a pure-Ada environment. The following areas are discussed: the ProKappa core and ProTalk programming language; the current status of the implementation; the limitations and restrictions of the current system; and the development of Ada-language message handlers in the ProKappa environment.

  11. Evaluation of Ada as a Communications Programming Language.

    DTIC Science & Technology

    1981-03-31

drastically, altered or circumvented to provide a communication system with sufficient resources to operate at a required level of performance. 3.3.1.2...Command (CORADCOM) and standard Digital Equipment Corporation VAX 11/780 system software. Plans include the hosting of Ada/ED on the VAX 11/780 at the... Corporation, Bedford, MA, August, 1978. b. /LOGl79a/ LOGICON, "Formal Specification of GUARD Trusted Software (Draft)," ARPA-78C032303, September, 1979

  12. Ada Compiler Validation Summary Report: Certificate Number 89020W1. 10073: International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, IBM 3083 (Host and Target)

    DTIC Science & Technology

    1989-04-20

International Business Machines Corporation, IBM Development System for the Ada Language, VM/CMS Ada Compiler, Version 2.1.1, Wright-Patterson AFB, IBM 3083...890420W1.10073 International Business Machines Corporation IBM Development System for the Ada Language VM/CMS Ada Compiler Version 2.1.1 IBM 3083... International Business Machines Corporation and reviewed by the validation team. The compiler was tested using all default option settings except for the

  13. Ada Compiler Validation Summary Report: Certificate Number: 890420W1. 10066 International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, IBM RT PC 6150-125

    DTIC Science & Technology

    1989-04-20

International Business Machines Corporation, IBM Development System for the Ada Language, AIX/RT Ada Compiler, Version 1.1.1, Wright-Patterson AFB...Certificate Number: 890420W1.10066 International Business Machines Corporation IBM Development System for the Ada Language AIX/RT Ada Compiler, Version 1.1.1...TEST INFORMATION The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation

  14. Hydraulic Capacity of an ADA Compliant Street Drain Grate

    SciTech Connect

    Lottes, Steven A.; Bojanowski, Cezary

    2015-09-01

Resurfacing of urban roads with concurrent repairs and replacement of sections of curb and sidewalk may require pedestrian ramps that are compliant with the Americans with Disabilities Act (ADA), and when street drains are in close proximity to the walkway, ADA compliant street grates may also be required. The Minnesota Department of Transportation ADA Operations Unit identified a foundry with an available grate that meets ADA requirements. Argonne National Laboratory’s Transportation Research and Analysis Computing Center used full scale three dimensional computational fluid dynamics to determine the performance of the ADA compliant grate and compared it to that of a standard vane grate. Analysis of a parametric set of cases was carried out, including variation in longitudinal, gutter, and cross street slopes and the water spread from the curb. The performance of the grates was characterized by the fraction of the total volume flow approaching the grate from upstream that was captured by the grate and diverted into the catch basin. The fraction of the total flow entering over the grate from the side and the fraction of flow directly over a grate diverted into the catch basin were also quantities of interest that aid in understanding the differences in performance of the grates. The ADA compliant grate performance lagged that of the vane grate, increasingly so as upstream Reynolds number increased. The major factor leading to the performance difference between the two grates was the fraction of flow directly over the grates that is captured by the grates.

  15. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1991-01-01

Studies of the proposed Ada 9X constructs for distribution, now referred to as AdaPT, are reported. The goals for this period were to revise the chosen example scenario and to begin studying how the proposed constructs might be implemented. The example scenario chosen is the Submarine Combat Information Center (CIC) developed by IBM for the Navy. The specification provided by IBM was preliminary and had several deficiencies. To address these problems, some changes to the scenario specification were made. Some of the more important changes include: (1) addition of a system database management function; (2) addition of a fourth processing unit to the standard resources; (3) addition of an operator console interface function; and (4) removal of the time synchronization function. The strategy decided upon for implementing the CIC scenario in AdaPT was to use publics, partitions, and nodes. The principal purpose of implementing the CIC scenario was to demonstrate how the AdaPT constructs interact with the program structure. While considering ways that the AdaPT constructs might be translated to Ada 83, it was observed that the partition construct could reasonably be modeled as an abstract data type. Although this gives a useful method of modeling partitions, it does not at all address the configuration aspects of the node construct.

  16. Examining the reliability of ADAS-Cog change scores.

    PubMed

    Grochowalski, Joseph H; Liu, Ying; Siedlecki, Karen L

    2016-09-01

    The purpose of this study was to estimate and examine ways to improve the reliability of change scores on the Alzheimer's Disease Assessment Scale, Cognitive Subtest (ADAS-Cog). The sample, provided by the Alzheimer's Disease Neuroimaging Initiative, included individuals with Alzheimer's disease (AD) (n = 153) and individuals with mild cognitive impairment (MCI) (n = 352). All participants were administered the ADAS-Cog at baseline and 1 year, and change scores were calculated as the difference in scores over the 1-year period. Three types of change score reliabilities were estimated using multivariate generalizability. Two methods to increase change score reliability were evaluated: reweighting the subtests of the scale and adding more subtests. Reliability of ADAS-Cog change scores over 1 year was low for both the AD sample (ranging from .53 to .64) and the MCI sample (.39 to .61). Reweighting the change scores from the AD sample improved reliability (.68 to .76), but lengthening provided no useful improvement for either sample. The MCI change scores had low reliability, even with reweighting and adding additional subtests. The ADAS-Cog scores had low reliability for measuring change. Researchers using the ADAS-Cog should estimate and report reliability for their use of the change scores. The ADAS-Cog change scores are not recommended for assessment of meaningful clinical change.
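The reliability problem the abstract describes can be illustrated with the classical difference-score formula, a simpler model than the study's multivariate generalizability approach; the numbers below are hypothetical and not taken from the study:

```python
def difference_score_reliability(sd1, sd2, rel1, rel2, r12):
    """Classical reliability of a difference (change) score D = X2 - X1.

    sd1, sd2   : standard deviations of the two measurements
    rel1, rel2 : reliabilities of the individual measurements
    r12        : correlation between the two measurements
    """
    true_var = sd1**2 * rel1 + sd2**2 * rel2 - 2 * sd1 * sd2 * r12
    total_var = sd1**2 + sd2**2 - 2 * sd1 * sd2 * r12
    return true_var / total_var

# Two equally reliable (0.90) testings that correlate 0.80 yield a far less
# reliable change score, which is why change scores need separate checking:
print(round(difference_score_reliability(10, 10, 0.9, 0.9, 0.8), 2))  # 0.5
```

The higher the correlation between baseline and follow-up, the more the shared true-score variance cancels out of the difference, leaving mostly error.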

  17. Proceedings of the third international IEEE conference on Ada applications and environments

    SciTech Connect

    Not Available

    1988-01-01

    These proceedings collect papers on software applications. Topics include: An interleaving symbolic execution approach for the formal verification of Ada programs with tasking, fault-tolerant Ada software, object-oriented frameworks for Ada, and generating multitasking Ada programs from high-level specifications.

  18. Ada and knowledge-based systems: A prototype combining the best of both worlds

    NASA Technical Reports Server (NTRS)

    Brauer, David C.

    1986-01-01

    A software architecture is described which facilitates the construction of distributed expert systems using Ada and selected knowledge based systems. This architecture was utilized in the development of a Knowledge-based Maintenance Expert System (KNOMES) prototype for the Space Station Mobile Service Center (MSC). The KNOMES prototype monitors a simulated data stream from MSC sensors and built-in test equipment. It detects anomalies in the data and performs diagnosis to determine the cause. The software architecture which supports the KNOMES prototype allows for the monitoring and diagnosis tasks to be performed concurrently. The basic concept of this software architecture is named ACTOR (Ada Cognitive Task ORganization Scheme). An individual ACTOR is a modular software unit which contains both standard data processing and artificial intelligence components. A generic ACTOR module contains Ada packages for communicating with other ACTORs and accessing various data sources. The knowledge based component of an ACTOR determines the role it will play in a system. In this prototype, an ACTOR will monitor the MSC data stream.

  19. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    NASA Technical Reports Server (NTRS)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (1) mode blockage, (2) liner insertion loss, (3) short ducts, and (4) mode reflection.

  1. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    NASA Astrophysics Data System (ADS)

    Zaghi, S.

    2014-07-01

OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc…) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files needed); these comments are parsed by the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred to as git

  2. Ada Compiler Validation Summary Report: Certificate Number: 880524I1. 09118 Systeam KG Systeam Ada Compiler VAX/VMS Version 1.8 VAX 8530 (Host and Target)

    DTIC Science & Technology

    1988-05-24

Validation Summary Report, Ada Compiler Validation Capability, ACVC, Validation Testing, Ada Validation Office, AVO, Ada Validation Facility, AVF, ANSI...administered by the Ada Validation Organization (AVO). On-site testing was completed 88-05-24 at SYSTEAM KG at Karlsruhe. 1.2 USE OF THIS VALIDATION...SUMMARY REPORT Consistent with the national laws of the originating country, the AVO may make full and free public disclosure of this report. In the United

  3. Observations of X-ray transient source GS2023+338 with the TTM coded mask telescope

    NASA Astrophysics Data System (ADS)

    Pan, H. C.; in 't Zand, J. J. M.; Skinner, G. K.; Borozdin, K. N.; Gil'Fanov, M. R.; Siuniaev, R. A.

    1993-01-01

TTM observations of the bright X-ray transient source GS2023+338 (=V404 Cyg) during the period June-August 1989 are reported. The observed spectral structure can be modeled as a power-law source with a photon index of about 1.5, surrounded by partially ionized material. The observed X-rays consist of a component coming directly from the power-law source and a component reflected (down-scattered) by the partially ionized material. Varying the clumpy structure or changing the ionization state of the circumstellar matter will cause the low-energy absorption to fluctuate.

  4. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    NASA Astrophysics Data System (ADS)

    Tornga, Shawn R.

The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detection, identification and localization of weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile truck-based, hybrid gamma-ray imaging system able to quickly detect, identify and localize radiation sources at standoff distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals 5x5x2 in3 each, arranged in a random coded aperture mask array (CA), followed by 30 position-sensitive NaI bars each 24x2.5x3 in3, called the detection array (DA). The CA array acts as both a coded aperture mask and a scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton scattered events and coded aperture events. In this thesis, the coded aperture, Compton and hybrid imaging algorithms developed will be described along with their performance. It will be shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as from a Global Positioning System (GPS) and Inertial Navigation System (INS), must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data. Results of image reconstruction algorithms at various speeds and distances will be presented as well as

  5. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. Purpose of this type of coding is to achieve data compression in the sense that coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. Routines are universal because they apply to virtually any "real-world" data source.
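A Rice-style split, of the kind used in this family of universal noiseless coders, encodes each integer as a unary quotient plus a k-bit binary remainder. The following Python sketch illustrates the idea only; the function names and parameter k are not the actual FORTRAN subroutine interface:

```python
def rice_encode(values, k):
    """Encode non-negative integers: unary quotient, then k-bit binary remainder."""
    bits = []
    for v in values:
        q, r = v >> k, v & ((1 << k) - 1)
        bits.append("1" * q + "0")         # quotient in unary, 0-terminated
        bits.append(format(r, f"0{k}b"))   # remainder in exactly k binary digits
    return "".join(bits)

def rice_decode(bitstring, k, count):
    """Decode `count` integers from a Rice-coded bit string."""
    out, i = [], 0
    for _ in range(count):
        q = 0
        while bitstring[i] == "1":         # count unary 1s for the quotient
            q += 1
            i += 1
        i += 1                             # skip the terminating 0
        r = int(bitstring[i:i + k], 2)     # read the k-bit remainder
        i += k
        out.append((q << k) | r)
    return out
```

Small values cost few bits and large values remain representable, which is the noiseless-compression property the abstract describes.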

  6. Dosimetric comparison between the microSelectron HDR 192Ir v2 source and the BEBIG 60Co source for HDR brachytherapy using the EGSnrc Monte Carlo transport code

    PubMed Central

    Islam, M. Anwarul; Akramuzzaman, M. M.; Zakaria, G. A.

    2012-01-01

Manufacturing of miniaturized high-activity 192Ir sources has made them a market preference in modern brachytherapy. The smaller dimensions of the sources are flexible for smaller-diameter applicators and are also suitable for interstitial implants. Presently, miniaturized 60Co HDR sources have been made available with dimensions identical to those of 192Ir sources. 60Co sources have the advantage of a longer half-life compared with 192Ir sources. High dose rate brachytherapy sources with longer half-lives are a pragmatic solution for developing countries from an economic point of view. This study is aimed at comparing the TG-43U1 dosimetric parameters for the new BEBIG 60Co HDR and new microSelectron 192Ir HDR sources. Dosimetric parameters are calculated using the EGSnrc-based Monte Carlo simulation code in accordance with the AAPM TG-43 formalism for the microSelectron HDR 192Ir v2 and new BEBIG 60Co HDR sources. Air-kerma strength per unit source activity, calculated in dry air, is 9.698×10-8 ± 0.55% U Bq-1 and 3.039×10-7 ± 0.41% U Bq-1 for the two sources, respectively. The calculated dose rate constants per unit air-kerma strength in water are 1.116±0.12% cGy h-1U-1 and 1.097±0.12% cGy h-1U-1, respectively. The values of the radial dose function for distances up to 1 cm and beyond 22 cm are higher for the BEBIG 60Co HDR source than for the other source. The anisotropy values increase sharply toward the longitudinal sides of the BEBIG 60Co source, and the rise is comparatively sharper than that of the other source. Tissue dependence of the absorbed dose has been investigated with a vacuum phantom for breast, compact bone, blood, lung, thyroid, soft tissue, testis, and muscle. No significant variation is noted at 5 cm radial distance in this regard while comparing the two sources, except for lung tissue. The true dose rates are calculated considering photon as well as electron transport using appropriate cut

  7. Maximum Likelihood Expectation-Maximization Algorithms Applied to Localization and Identification of Radioactive Sources with Recent Coded Mask Gamma Cameras

    SciTech Connect

    Lemaire, H.; Barat, E.; Carrel, F.; Dautremer, T.; Dubos, S.; Limousin, O.; Montagu, T.; Normand, S.; Schoepff, V.; Amgarou, K.; Menaa, N.; Angelique, J.-C.; Patoz, A.

    2015-07-01

In this work, we tested maximum-likelihood expectation-maximization (MLEM) algorithms optimized for gamma imaging applications on two recent coded mask gamma cameras. We took advantage of the respective characteristics of the GAMPIX and Caliste HD-based gamma cameras: noise reduction thanks to a mask/anti-mask procedure but limited energy resolution for GAMPIX, and high energy resolution for Caliste HD. One of our short-term perspectives is the test of MAPEM algorithms integrating prior values, adapted to gamma imaging, for the data to be reconstructed. (authors)
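The MLEM update at the heart of such reconstructions is a multiplicative fixed-point iteration on a linear forward model. This is a generic textbook sketch in Python, not the authors' GAMPIX/Caliste implementation:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum-likelihood expectation-maximization for y ≈ A @ x with x >= 0.

    A : (n_detectors, n_pixels) system matrix of detection probabilities
    y : (n_detectors,) measured counts
    """
    x = np.ones(A.shape[1])                  # flat (uniform) initial image
    sens = A.sum(axis=0)                     # sensitivity of each image pixel
    for _ in range(n_iter):
        proj = A @ x                         # forward projection of estimate
        ratio = y / np.maximum(proj, 1e-12)  # measured / estimated counts
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative update
    return x
```

The multiplicative form keeps the image non-negative at every iteration, one reason MLEM is well suited to count-limited gamma imaging.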

  8. TIDY, a complete code for renumbering and editing FORTRAN source programs. User's manual for IBM 360/67

    NASA Technical Reports Server (NTRS)

    Barlow, A. V.; Vanderplaats, G. N.

    1973-01-01

TIDY, a computer code which edits and renumbers FORTRAN decks which have become difficult to read because of many patches and revisions, is described. The old program is reorganized so that statement numbers are added sequentially, and extraneous FORTRAN statements are deleted. General instructions for using TIDY on the IBM 360/67 Tymeshare System, and specific instructions for use on the NASA/AMES IBM 360/67 TSS system, are included, as well as specific instructions on how to run TIDY in conversational and in batch modes. TIDY may be adapted for use on other computers.
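The kind of resequencing TIDY performs can be sketched as a two-pass rewrite: first map old statement labels to new sequential ones, then patch references. This toy Python version handles only GO TO references and is purely illustrative of the idea, not TIDY itself:

```python
import re

def renumber_fortran(lines, start=10, step=10):
    """Resequence fixed-form FORTRAN statement labels (columns 1-5) and patch
    GO TO targets. A real tool must also handle DO, FORMAT, arithmetic IF, etc."""
    mapping = {}
    next_label = start
    # Pass 1: assign a new sequential label to every labeled statement.
    for line in lines:
        old = line[:5].strip()
        if old.isdigit():
            mapping[old] = str(next_label)
            next_label += step
    # Pass 2: rewrite the label field and any GO TO references.
    out = []
    for line in lines:
        old = line[:5].strip()
        if old.isdigit():
            line = mapping[old].ljust(5) + line[5:]
        line = re.sub(r"(GO\s*TO\s+)(\d+)",
                      lambda m: m.group(1) + mapping.get(m.group(2), m.group(2)),
                      line)
        out.append(line)
    return out
```

The two-pass structure is what lets references that appear before their target label still be renumbered consistently.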

  9. Bacteria-induced natural product formation in the fungus Aspergillus nidulans requires Saga/Ada-mediated histone acetylation.

    PubMed

    Nützmann, Hans-Wilhelm; Reyes-Dominguez, Yazmid; Scherlach, Kirstin; Schroeckh, Volker; Horn, Fabian; Gacek, Agnieszka; Schümann, Julia; Hertweck, Christian; Strauss, Joseph; Brakhage, Axel A

    2011-08-23

    Sequence analyses of fungal genomes have revealed that the potential of fungi to produce secondary metabolites is greatly underestimated. In fact, most gene clusters coding for the biosynthesis of antibiotics, toxins, or pigments are silent under standard laboratory conditions. Hence, it is one of the major challenges in microbiology to uncover the mechanisms required for pathway activation. Recently, we discovered that intimate physical interaction of the important model fungus Aspergillus nidulans with the soil-dwelling bacterium Streptomyces rapamycinicus specifically activated silent fungal secondary metabolism genes, resulting in the production of the archetypal polyketide orsellinic acid and its derivatives. Here, we report that the streptomycete triggers modification of fungal histones. Deletion analysis of 36 of 40 acetyltransferases, including histone acetyltransferases (HATs) of A. nidulans, demonstrated that the Saga/Ada complex containing the HAT GcnE and the AdaB protein is required for induction of the orsellinic acid gene cluster by the bacterium. We also showed that Saga/Ada plays a major role for specific induction of other biosynthesis gene clusters, such as sterigmatocystin, terrequinone, and penicillin. Chromatin immunoprecipitation showed that the Saga/Ada-dependent increase of histone 3 acetylation at lysine 9 and 14 occurs during interaction of fungus and bacterium. Furthermore, the production of secondary metabolites in A. nidulans is accompanied by a global increase in H3K14 acetylation. Increased H3K9 acetylation, however, was only found within gene clusters. This report provides previously undescribed evidence of Saga/Ada dependent histone acetylation triggered by prokaryotes.

  10. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  11. Ada (Trade Name) Compiler Validation Summary Report: Verdix Corporation Verdix Ada Development System, Version 5.2 for the CCI Power 6/32 under Power 6 UNIX, Release 1.11.

    DTIC Science & Technology

    1986-11-16

(Garbled excerpt from the report's complete list of tests and results: test file names with pass codes, e.g. B54A20A.ADA P, B91001A-AB.ADA P, C92002A-AB.ADA P.)

  12. A practical implementation of the 2010 IPEM high dose rate brachytherapy code of practice for the calibration of 192Ir sources

    NASA Astrophysics Data System (ADS)

    Awunor, O. A.; Lecomber, A. R.; Richmond, N.; Walker, C.

    2011-08-01

    This paper details a practical method for deriving the reference air kerma rate calibration coefficient for Farmer NE2571 chambers using the UK Institute of Physics and Engineering in Medicine (IPEM) code of practice for the determination of the reference air kerma rate for HDR 192Ir brachytherapy sources based on the National Physical Laboratory (NPL) air kerma standard. The reference air kerma rate calibration coefficient was derived using pressure, temperature and source decay corrected ionization chamber response measurements over three successive 192Ir source clinical cycles. A secondary standard instrument (a Standard Imaging 1000 Plus well chamber) and four tertiary standard instruments (one additional Standard Imaging 1000 Plus well chamber and three Farmer NE2571 chambers housed in a perspex phantom) were used to provide traceability to the NPL primary standard and enable comparison of performance between the chambers. Conservative and optimized estimates on the expanded uncertainties (k = 2) associated with chamber response, ion recombination and reference air kerma rate calibration coefficient were determined. This was seen to be 2.3% and 0.4% respectively for chamber response, 0.2% and 0.08% respectively for ion recombination and 2.6% and 1.2% respectively for the calibration coefficient. No significant change in ion recombination with source decay was observed over the duration of clinical use of the respective 192Ir sources.
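The pressure, temperature and source decay corrections mentioned in the abstract follow standard forms. A Python sketch, assuming typical reference conditions of 20 °C and 101.325 kPa and an approximate 192Ir half-life of 73.83 days (values illustrative of common practice, not taken from the paper):

```python
import math

IR192_HALF_LIFE_DAYS = 73.83  # approximate 192Ir half-life (assumption)

def temp_pressure_factor(temp_c, pressure_kpa,
                         ref_temp_c=20.0, ref_pressure_kpa=101.325):
    """Correct a vented ionization chamber reading to reference air density."""
    return ((273.15 + temp_c) / (273.15 + ref_temp_c)) * \
           (ref_pressure_kpa / pressure_kpa)

def decay_factor(days_elapsed, half_life_days=IR192_HALF_LIFE_DAYS):
    """Fraction of the source strength remaining after days_elapsed."""
    return math.exp(-math.log(2) * days_elapsed / half_life_days)
```

A warmer or lower-pressure cavity contains less air, so the reading is scaled up; the decay factor lets readings taken across a clinical cycle be referred back to a common calibration date.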

  13. A practical implementation of the 2010 IPEM high dose rate brachytherapy code of practice for the calibration of 192Ir sources.

    PubMed

    Awunor, O A; Lecomber, A R; Richmond, N; Walker, C

    2011-08-21

    This paper details a practical method for deriving the reference air kerma rate calibration coefficient for Farmer NE2571 chambers using the U.K. Institute of Physics and Engineering in Medicine (IPEM) code of practice for the determination of the reference air kerma rate for HDR (192)Ir brachytherapy sources based on the National Physical Laboratory (NPL) air kerma standard. The reference air kerma rate calibration coefficient was derived using pressure, temperature and source decay corrected ionization chamber response measurements over three successive (192)Ir source clinical cycles. A secondary standard instrument (a Standard Imaging 1000 Plus well chamber) and four tertiary standard instruments (one additional Standard Imaging 1000 Plus well chamber and three Farmer NE2571 chambers housed in a perspex phantom) were used to provide traceability to the NPL primary standard and enable comparison of performance between the chambers. Conservative and optimized estimates on the expanded uncertainties (k = 2) associated with chamber response, ion recombination and reference air kerma rate calibration coefficient were determined. This was seen to be 2.3% and 0.4% respectively for chamber response, 0.2% and 0.08% respectively for ion recombination and 2.6% and 1.2% respectively for the calibration coefficient. No significant change in ion recombination with source decay was observed over the duration of clinical use of the respective 192Ir sources.

  14. A seismic field test with a Low-level Acoustic Combustion Source and Pseudo-Noise codes

    NASA Astrophysics Data System (ADS)

    Askeland, Bjørn; Ruud, Bent Ole; Hobæk, Halvor; Mjelde, Rolf

    2009-01-01

    The Low-level Acoustic Combustion Source (LACS) which can fire its pulses at a high rate, has been tested successfully as a seismic marine source on shallow ice-age sediments in Byfjorden at Bergen, Norway. Pseudo-Noise pulsed signals with spiky autocorrelation functions were used to detect the sediments. Each transmitted sequence lasted 10 s and contained 43 pulses. While correlation gave a blurry result, deconvolution between the near-field recordings and the streamer recordings gave a clear seismic section. Compared to the section acquired with single air-gun shots along the same profile, the LACS gave a more clear presentation of the sediments and basement.
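The spiky autocorrelation of pseudo-noise sequences mentioned in the abstract can be demonstrated with a maximal-length sequence from a linear-feedback shift register. A generic Python sketch, not the actual LACS drive sequence:

```python
import numpy as np

def lfsr_msequence(taps, length, seed=1):
    """Fibonacci LFSR output stream; with primitive feedback taps this is a
    maximal-length (m-)sequence, e.g. taps [3, 1] for x^3 + x + 1."""
    nbits = max(taps)
    state = seed
    out = []
    for _ in range(length):
        out.append(state & 1)                       # emit the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1            # XOR the tapped bits
        state = (state >> 1) | (fb << (nbits - 1))  # shift, feed back into MSB
    return np.array(out)

# Map {0,1} -> {+1,-1}: an m-sequence's periodic autocorrelation is N at zero
# lag and -1 at every other lag -- the "spiky" property exploited here.
seq = 1 - 2 * lfsr_msequence([3, 1], 7)
autocorr = [int(np.sum(seq * np.roll(seq, lag))) for lag in range(7)]
```

Because all nonzero lags correlate equally weakly, the source's overlapping pulses can be untangled by correlation or deconvolution against the known sequence.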

  15. Impact of Ada in the Flight Dynamics Division: Excitement and frustration

    NASA Technical Reports Server (NTRS)

    Bailey, John; Waligora, Sharon; Stark, Mike

    1993-01-01

    In 1985, NASA Goddard's Flight Dynamics Division (FDD) began investigating how the Ada language might apply to their software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, they would fully transition their entire development organization from FORTRAN to Ada within 10 years. However, nearly 9 years later, the FDD still produces 80 percent of its software in FORTRAN, despite positive results on Ada projects. This paper reports preliminary results of an ongoing study, commissioned by the FDD, to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. This paper is the first public report on the Ada assessment study, which will conclude with a comprehensive final report in mid 1994.

  16. Neutron sources in the Varian Clinac 2100C/2300C medical accelerator calculated by the EGS4 code.

    PubMed

    Mao, X S; Kase, K R; Liu, J C; Nelson, W R; Kleck, J H; Johnsen, S

    1997-04-01

    The photoneutron yields produced in different components of the medical accelerator heads evaluated in these studies (24-MV Clinac 2500 and a Clinac 2100C/2300C running in the 10-MV, 15-MV, 18-MV and 20-MV modes) were calculated by the EGS4 Monte Carlo code using a modified version of the Combinatorial Geometry of MORSE-CG. Actual component dimensions and materials (i.e., targets, collimators, flattening filters, jaws and shielding for specific accelerator heads) were used in the geometric simulations. Calculated relative neutron yields in different components of a 24-MV Clinac 2500 were compared with the published measured data, and were found to agree to within +/-30%. Total neutron yields produced in the Clinac 2100/2300, as a function of primary electron energy and field size, are presented. A simplified Clinac 2100/2300C geometry is presented to calculate neutron yields, which were compared with those calculated by using the fully-described geometry.

  17. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named ``XSOR``. The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  18. Toward real-time performance benchmarks for Ada

    NASA Technical Reports Server (NTRS)

    Clapp, Russell M.; Duchesneau, Louis; Volz, Richard A.; Mudge, Trevor N.; Schultze, Timothy

    1986-01-01

    The issue of real-time performance measurement for the Ada programming language through the use of benchmarks is addressed. First, the Ada notion of time is examined and a set of basic measurement techniques is developed. Then a set of Ada language features believed to be important for real-time performance is presented and specific measurement methods are discussed. In addition, other important time-related features which are not explicitly part of the language but are part of the run-time system are also identified and measurement techniques developed. The measurement techniques are applied to the language and run-time system features and the results are presented.
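    The dual-loop benchmark technique implied above (time a loop containing the feature, subtract an empty loop's time, and divide by the iteration count) can be sketched in Python; the measured feature here is an arbitrary placeholder, not one of the Ada features the paper studies:

    ```python
    import time

    def time_loop(body, n=100_000):
        """Time n iterations of body() and return total elapsed seconds."""
        t0 = time.perf_counter()
        for _ in range(n):
            body()
        return time.perf_counter() - t0

    def per_call_cost(feature, n=100_000):
        """Dual-loop method: subtract empty-loop overhead, average per call."""
        empty = time_loop(lambda: None, n)
        loaded = time_loop(feature, n)
        return (loaded - empty) / n

    cost = per_call_cost(lambda: sorted([3, 1, 2]))
    print(f"~{cost * 1e9:.0f} ns per call")
    ```

    The subtraction removes loop and call overhead, but clock resolution and jitter still matter, which is why the paper examines the language's notion of time before defining the measurements.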

  19. Transparent Ada rendezvous in a fault tolerant distributed system

    NASA Technical Reports Server (NTRS)

    Racine, Roger

    1986-01-01

    There are many problems associated with distributing an Ada program over a loosely coupled communication network. Some of these problems involve the various aspects of the distributed rendezvous. The problems addressed involve supporting the delay statement in a selective call and supporting the else clause in a selective call. Most of these difficulties are compounded by the need for an efficient communication system. The difficulties are compounded even more by considering the possibility of hardware faults occurring while the program is running. With a hardware fault tolerant computer system, it is possible to design a distribution scheme and communication software which is efficient and allows Ada semantics to be preserved. An Ada design for the communications software of one such system will be presented, including a description of the services provided in the seven layers of an International Standards Organization (ISO) Open System Interconnect (OSI) model communications system. The system capabilities (hardware and software) that allow this communication system will also be described.

  20. An automated quality assessor for Ada object-oriented designs

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.

    1988-01-01

    A tool for evaluating object-oriented designs (OODs) for Ada software is described. The tool assumes a design expressed as a hierarchy of object diagrams. A design of this type identifies the objects of a system, an interface to each object, and the usage relationships between objects. When such a design is implemented in Ada, objects become packages, interfaces become package specifications, and usage relationships become Ada `with' clauses and package references. An automated quality assessor has been developed that is based on flagging undesirable design constructs. For convenience, distinctions are made among three levels of severity: questionable, undesirable, and hazardous. A questionable construct is one that may well be appropriate. An undesirable construct is one that should be changed because it is potentially harmful to the reliability, maintainability, or reusability of the software. A hazardous construct is one that is undesirable and that introduces a high level of risk.
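    A rule-based assessor of this kind might be sketched as follows; the rules, thresholds, and package representation are invented for illustration and are not the actual criteria of the tool described above:

    ```python
    # Toy design-rule checker: each rule maps a predicate over a package
    # description to one of three severity levels.
    SEVERITIES = ("questionable", "undesirable", "hazardous")

    RULES = [
        ("questionable", "many with clauses",      lambda p: len(p["withs"]) > 8),
        ("undesirable",  "global variable in spec", lambda p: p["spec_globals"] > 0),
        ("hazardous",    "circular dependency",     lambda p: p["name"] in p["withs"]),
    ]

    def assess(package):
        """Return the (severity, rule_name) flags raised for a package."""
        return [(sev, name) for sev, name, pred in RULES if pred(package)]

    pkg = {"name": "Stack", "withs": ["Stack", "Text_IO"], "spec_globals": 1}
    for sev, rule in assess(pkg):
        print(f"{sev}: {rule}")
    ```

    Separating the rule table from the traversal makes it easy to tune which constructs are merely questionable versus hazardous without touching the checker itself.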

  1. TOMO3D: 3-D joint refraction and reflection traveltime tomography parallel code for active-source seismic data—synthetic test

    NASA Astrophysics Data System (ADS)

    Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.

    2015-10-01

    We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with an increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (˜90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time. A data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and an initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothened.
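    The regularized inversion step can be illustrated on a toy straight-ray problem. This sketch solves the damped least-squares system directly rather than with the iterative LSQR solver the code uses, and the ray geometry and damping value are invented:

    ```python
    import numpy as np

    # Toy "tomography": each ray samples some slowness cells; travel time
    # is the path-length-weighted sum of slownesses. G is the kernel matrix.
    G = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0]])
    true_slowness = np.array([0.5, 0.4, 0.6])
    t_obs = G @ true_slowness

    # Damped least squares: minimize |G m - t|^2 + lam^2 |m|^2, solved via
    # an augmented system. LSQR solves the same problem iteratively, which
    # scales to the huge sparse matrices a 3-D survey produces.
    lam = 1e-3
    A = np.vstack([G, lam * np.eye(3)])
    b = np.concatenate([t_obs, np.zeros(3)])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.round(m, 3))  # close to the true slowness model
    ```

    In the real code the rows of G come from the hybrid graph/bending ray tracer, which is why parallelizing the forward problem dominates the speedup.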

  2. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    NASA Astrophysics Data System (ADS)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.

  3. Ada compiler validation summary report: Cray Research, Inc. , Cray Ada Compiler, Version 1. 1 Cray X-MP (Host Target), 890523W1. 10080

    SciTech Connect

    Not Available

    1989-05-23

    This Validation Summary Report describes the extent to which a specific Ada compiler conforms to the Ada Standard, ANSI/MIL-STD-1815A. The report explains all technical terms used within it and thoroughly reports the results of testing this compiler using the Ada Compiler Validation Capability. An Ada compiler must be implemented according to the Ada Standard, and any implementation-dependent features must conform to the requirements of the Ada Standard. The Ada Standard must be implemented in its entirety, and nothing can be implemented that is not in the Standard. Even though all validated Ada compilers conform to the Ada Standard, it must be understood that some differences do exist between implementations. The Ada Standard permits some implementation dependencies - for example, the maximum length of identifiers or the maximum values of integer types. Other differences between compilers result from the characteristics of particular operating systems, hardware, or implementation strategies. All the dependencies observed during the process of testing this compiler are given in this report. The information in this report is derived from the test results produced during validation testing. The validation process includes submitting a suite of standardized tests, the ACVC, as inputs to an Ada compiler and evaluating the results.

  4. Ada compiler validation summary report. Cray Research, Inc. , Cray Ada Compiler, Version 1. 1, Cray-2, (Host Target), 890523W1. 10081

    SciTech Connect

    Not Available

    1989-05-23

    This Validation Summary Report describes the extent to which a specific Ada compiler conforms to the Ada Standard, ANSI-MIL-STD-1815A. The report explains all technical terms used within it and thoroughly reports the results of testing this compiler using the Ada Compiler Validation Capability. An Ada compiler must be implemented according to the Ada Standard, and any implementation-dependent features must conform to the requirements of the Ada Standard. The Ada Standard must be implemented in its entirety, and nothing can be implemented that is not in the Standard. Even though all validated Ada compilers conform to the Ada Standard, it must be understood that some differences do exist between implementations. The Ada Standard permits some implementation dependencies - for example, the maximum length of identifiers or the maximum values of integer types. Other differences between compilers result from the characteristics of particular operating systems, hardware, or implementation strategies. All the dependencies observed during the process of testing this compiler are given in this report. The information in this report is derived from the test results produced during validation testing. The validation process includes submitting a suite of standardized tests, the ACVC, as inputs to an Ada compiler and evaluating the results.

  5. T-cell lines from 2 patients with adenosine deaminase (ADA) deficiency showed the restoration of ADA activity resulted from the reversion of an inherited mutation.

    PubMed

    Ariga, T; Oda, N; Yamaguchi, K; Kawamura, N; Kikuta, H; Taniuchi, S; Kobayashi, Y; Terada, K; Ikeda, H; Hershfield, M S; Kobayashi, K; Sakiyama, Y

    2001-05-01

    Inherited deficiency of adenosine deaminase (ADA) results in one of the autosomal recessive forms of severe combined immunodeficiency. This report discusses 2 patients with ADA deficiency from different families, in whom a possible reverse mutation had occurred. The novel mutations were identified in the ADA gene from the patients, and both their parents were revealed to be carriers. Unexpectedly, established patient T-cell lines, not B-cell lines, showed half-normal levels of ADA enzyme activity. Reevaluation of the mutations in these T-cell lines indicated that one of the inherited ADA gene mutations was reverted in both patients. At least one of the patients seemed to possess the revertant cells in vivo; however, the mutant cells might have overcome the revertant after receiving ADA enzyme replacement therapy. These findings may have significant implications regarding the prospects for stem cell gene therapy for ADA deficiency.

  6. Formal methods in the design of Ada 1995

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1995-01-01

    Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a

  7. Simulation in Ada; Proceedings of the Eastern Simulation Conference, Norfolk, VA, March 3-8, 1985

    SciTech Connect

    Unger, B.; Lomow, G.; Trybul, T.

    1985-01-01

    The development of Ada compilers in 1984 reached a stage that now permits the implementation of Ada-language-based simulators which can handle large programs. The conference covered topics such as automated simulator program development using an Ada-based methodology. Attention was also given to implementing a simulator as a set of Ada tasks and to testing software for Ada-language discrete simulation models. Attempts at developing a moving-target, distributed, real-time simulation using Ada for studying jet engine performance are described. Finally, techniques for simulating parallel computational processes for optimal control problems are contrasted with sequential optimization process performance.

  8. Polar Codes

    DTIC Science & Technology

    2014-12-01

    This report evaluates polar codes and compares their performance with other forward error correction methods: a turbo code, a low density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes. Many civilian systems use LDPC FEC codes, and the Navy is planning to use LDPC for some future systems.

  9. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency, and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal, including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.
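    The non-linear mixing the method relies on can be illustrated numerically: a quadratic non-linearity acting on two tones at f1 and f2 creates energy at the difference and sum frequencies, which a linear medium would not. The frequencies and non-linearity strength below are arbitrary:

    ```python
    import numpy as np

    fs = 10_000            # sample rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    f1, f2 = 400.0, 640.0  # two source tones (Hz)
    linear = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # A quadratic medium response produces mixing products at f2-f1 and f2+f1.
    mixed = linear + 0.1 * linear**2

    spec = np.abs(np.fft.rfft(mixed))
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    peak = lambda f: spec[np.argmin(np.abs(freqs - f))]
    print(peak(f2 - f1) > 10 * np.median(spec))  # difference tone present
    ```

    Because the mixing products only appear where the two beams overlap non-linearly, isolating them localizes the interrogated rock volume.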

  10. An open-source Matlab code package for improved rank-reduction 3D seismic data denoising and reconstruction

    NASA Astrophysics Data System (ADS)

    Chen, Yangkang; Huang, Weilin; Zhang, Dong; Chen, Wei

    2016-10-01

    Simultaneous seismic data denoising and reconstruction is a currently popular research subject in modern reflection seismology. Traditional rank-reduction based 3D seismic data denoising and reconstruction algorithms leave strong residual noise in the reconstructed data and thus affect subsequent processing and interpretation tasks. In this paper, we propose an improved rank-reduction method by modifying the truncated singular value decomposition (TSVD) formula used in the traditional method. The proposed approach can achieve nearly perfect reconstruction performance even in the case of low signal-to-noise ratio (SNR). The proposed algorithm is tested on one synthetic example and one field data example. Considering that seismic data interpolation and denoising source packages are seldom in the public domain, we also provide a program template for the rank-reduction based simultaneous denoising and reconstruction algorithm in the form of an open-source Matlab package.
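    The classic rank-reduction step (truncated SVD of a low-rank data matrix plus noise) is easy to sketch with NumPy; the paper's modified truncation formula is not reproduced here:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Build a rank-2 "clean" data matrix and add noise.
    U = rng.standard_normal((60, 2))
    V = rng.standard_normal((2, 40))
    clean = U @ V
    noisy = clean + 0.3 * rng.standard_normal(clean.shape)

    def tsvd_denoise(D, rank):
        """Classic rank reduction: keep only the leading singular components."""
        u, s, vt = np.linalg.svd(D, full_matrices=False)
        return (u[:, :rank] * s[:rank]) @ vt[:rank]

    denoised = tsvd_denoise(noisy, rank=2)
    err_before = np.linalg.norm(noisy - clean)
    err_after = np.linalg.norm(denoised - clean)
    print(err_after < err_before)  # True: residual noise is reduced
    ```

    The residual noise the paper targets comes from the noise energy that leaks into the retained singular components, which plain truncation like this cannot remove.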

  11. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation are also time-consuming without a full understanding of LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model, and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80 - 90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
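    The template-expansion idea (the actual toolkit uses Excel/VBA to emit FORTRAN 90) can be sketched in Python; the model specification, variable names, and `lis_get` call below are invented for illustration:

    ```python
    # Sketch of template-driven wrapper generation: a model "spec" lists the
    # variables to pass between framework and model, and a code template is
    # expanded once per variable. All names are hypothetical placeholders.
    SPEC = {
        "model": "toymodel",
        "forcings": ["tair", "precip"],
        "states": ["soil_moisture"],
    }

    TEMPLATE = "  call lis_get('{var}', {var})  ! fetch {kind} for {model}"

    def generate_wrapper(spec):
        lines = [f"subroutine {spec['model']}_wrapper()"]
        for var in spec["forcings"]:
            lines.append(TEMPLATE.format(var=var, kind="forcing", model=spec["model"]))
        for var in spec["states"]:
            lines.append(TEMPLATE.format(var=var, kind="state", model=spec["model"]))
        lines.append("end subroutine")
        return "\n".join(lines)

    print(generate_wrapper(SPEC))
    ```

    Because every model goes through the same interface, only the spec changes between models; the generator and templates are reused unchanged.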

  12. The development of an Ada programming support environment database: SEAD (Software Engineering and Ada Database), user's manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.

  13. The IPEM code of practice for determination of the reference air kerma rate for HDR (192)Ir brachytherapy sources based on the NPL air kerma standard.

    PubMed

    Bidmead, A M; Sander, T; Locks, S M; Lee, C D; Aird, E G A; Nutbrown, R F; Flynn, A

    2010-06-07

    This paper contains the recommendations of the high dose rate (HDR) brachytherapy working party of the UK Institute of Physics and Engineering in Medicine (IPEM). The recommendations consist of a Code of Practice (COP) for the UK for measuring the reference air kerma rate (RAKR) of HDR (192)Ir brachytherapy sources. In 2004, the National Physical Laboratory (NPL) commissioned a primary standard for the realization of RAKR of HDR (192)Ir brachytherapy sources. This has meant that it is now possible to calibrate ionization chambers directly traceable to an air kerma standard using an (192)Ir source (Sander and Nutbrown 2006 NPL Report DQL-RD 004 (Teddington: NPL) http://publications.npl.co.uk). In order to use the source specification in terms of either RAKR, Κ(R) (ICRU 1985 ICRU Report No 38 (Washington, DC: ICRU); ICRU 1997 ICRU Report No 58 (Bethesda, MD: ICRU)), or air kerma strength, S(K) (Nath et al 1995 Med. Phys. 22 209-34), it has been necessary to develop algorithms that can calculate the dose at any point around brachytherapy sources within the patient tissues. The AAPM TG-43 protocol (Nath et al 1995 Med. Phys. 22 209-34) and the 2004 update TG-43U1 (Rivard et al 2004 Med. Phys. 31 633-74) have been developed more fully than any other protocol and are widely used in commercial treatment planning systems. Since the TG-43 formalism uses the quantity air kerma strength, whereas this COP uses RAKR, a unit conversion from RAKR to air kerma strength was included in the appendix to this COP. It is recommended that the measured RAKR determined with a calibrated well chamber traceable to the NPL (192)Ir primary standard is used in the treatment planning system. The measurement uncertainty in the source calibration based on the system described in this COP has been reduced considerably compared to other methods based on interpolation techniques.
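    The RAKR-to-air-kerma-strength conversion mentioned in the appendix is numerically simple: S(K) = RAKR × d², with d the 1 m reference distance, so an RAKR in μGy·h⁻¹ at 1 m is numerically equal to S(K) in U (1 U = 1 μGy·m²·h⁻¹). A minimal sketch, with an illustrative (not measured) source value:

    ```python
    def rakr_to_air_kerma_strength(rakr_uGy_per_h, ref_distance_m=1.0):
        """Convert reference air kerma rate (μGy/h at ref_distance_m) to
        air kerma strength S_K in U, where 1 U = 1 μGy·m²/h."""
        return rakr_uGy_per_h * ref_distance_m**2

    # At the standard 1 m reference distance the two are numerically equal:
    print(rakr_to_air_kerma_strength(40_000.0))  # 40000.0
    ```

    Keeping the reference distance explicit guards against silently mixing calibrations quoted at different distances.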

  14. User Instructions for the EPIC-3 Code.

    DTIC Science & Technology

    1987-05-01

    User Instructions for the EPIC-3 Code, G.R. Johnson and R.A. Stryk, Honeywell Incorporated, Defense Systems Division, Brooklyn Park, MN; report AFATL-TR-87-10 (AD-A182 728), May 1987. Cites Johnson, G.R., D.D. Colby, and D.J. Vavrick, "Three-Dimensional Computer Code for Dynamic Response of Solids to Intense Impulsive Loads," International...

  15. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    SciTech Connect

    Zehtabian, M; Zaker, N; Sina, S; Meigooni, A Soleimani

    2015-06-15

    Purpose: Different versions of MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137 were calculated in water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of MCNP4C code was changed to ENDF/B-VI release 8 which is used in MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code, were compared with other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6cm. Conclusion: The results indicate that using MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
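    For context, the TG-43 parameters being compared enter the 1-D (point-source) dose-rate formalism as D(r) = S(K) · Λ · (r0/r)² · g(r) · φan(r). The sketch below uses invented tabulated g(r) and anisotropy values, not published consensus data for any real source:

    ```python
    import numpy as np

    # TG-43 1-D formalism: dose rate = S_K * Lambda * (r0/r)^2 * g(r) * phi_an(r).
    # Tabulated values here are placeholders for illustration only.
    r_tab = np.array([0.5, 1.0, 2.0, 4.0, 6.0])         # transverse distance, cm
    g_tab = np.array([1.02, 1.00, 0.95, 0.80, 0.65])    # radial dose function
    phi_tab = np.array([0.94, 0.95, 0.96, 0.97, 0.97])  # anisotropy factor

    def dose_rate(r_cm, S_K=40_000.0, Lambda=1.1, r0=1.0):
        """Dose rate (cGy/h) at transverse distance r_cm, point-source TG-43."""
        g = np.interp(r_cm, r_tab, g_tab)
        phi = np.interp(r_cm, r_tab, phi_tab)
        return S_K * Lambda * (r0 / r_cm) ** 2 * g * phi

    print(round(dose_rate(1.0), 1))  # 41800.0
    ```

    The study's point is visible in this structure: a 17-28% discrepancy in g(r) at 6 cm between code versions propagates directly into the computed dose rate at that distance.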

  16. Electron slowing-down spectra in water for electron and photon sources calculated with the Geant4-DNA code.

    PubMed

    Vassiliev, Oleg N

    2012-02-21

    Recently, a very low energy extension was added to the Monte Carlo simulation toolkit Geant4. It is intended for radiobiological modeling and is referred to as Geant4-DNA. Its performance, however, has not been systematically benchmarked in terms of transport characteristics. This study reports on the electron slowing-down spectra and mean energy per ion pair, the W-value, in water for monoenergetic electron and photon sources calculated with Geant4-DNA. These quantities depend on electron energy, but not on spatial or angular variables which makes them a good choice for testing the model of energy transfer processes. The spectra also have a scientific value for radiobiological modeling as they describe the energy distribution of electrons entering small volumes, such as the cell nucleus. Comparisons of Geant4-DNA results with previous studies showed overall good agreement. Some differences in slowing-down spectra between Geant4-DNA and previous studies were found at 100 eV and at approximately 500 eV that were attributed to approximations in models of vibrational excitations and atomic de-excitation after ionization by electron impact. We also found that the high-energy part of the Geant4-DNA spectrum for a 1 keV electron source was higher, and the asymptotic high-energy W-value was lower than previous studies reported.

  17. Ada as an implementation language for knowledge based systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    Debates about the selection of programming languages often produce cultural collisions that are not easily resolved. This is especially true in the case of Ada and knowledge based programming. The construction of programming tools provides a desirable alternative for resolving the conflict.

  18. Section 504/ADA: Guidelines for Educators in Kansas. Revised.

    ERIC Educational Resources Information Center

    Miller, Joan; Bieker, Rod; Copenhaver, John

    This document presents the Kansas State Department of Education's guidelines to Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA). The guidelines specifically address Subparts A, B, C, and D of the regulations for Section 504 which deal with general provisions, employment practices, accessibility and education. An…

  19. The Impact of Business Size on Employer ADA Response

    ERIC Educational Resources Information Center

    Bruyere, Susanne M.; Erickson, William A.; VanLooy, Sara A.

    2006-01-01

    More than 10 years have passed since the employment provisions of the Americans with Disabilities Act of 1990 (ADA) came into effect for employers of 15 or more employees. Americans with disabilities continue to be more unemployed and underemployed than their nondisabled peers. Small businesses, with fewer than 500 employees, continue to be the…

  20. Software Engineering Laboratory Ada performance study: Results and implications

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1992-01-01

    The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.

  1. Rationale for the Design of the ADA (Tradename) Programming Language,

    DTIC Science & Technology

    1986-01-01

    Note that if the exception were propagated to the parent task, it would mean that child tasks could interfere asynchronously with their parent, and it...plethora of differently named procedures to cope with the demand for flexibility in areas such as formatting. The approach taken in Ada lies between the two

  2. Analysis and Guidelines for Reusable Ada Software

    DTIC Science & Technology

    1992-08-01

    ...not attempt to create a common source for the different components. These separate components would be relatively simple to manage, and comments could...

  3. Ada Compiler Validation Summary Report: Certificate Number: 890420W1. 10075 International Business Machines Corporation. IBM Development System, for the Ada Language CMS/MVS Ada Cross Compiler, Version 2.1.1 IBM 3083 Host and IBM 4381 Target

    DTIC Science & Technology

    1989-04-20

    VALIDATION SUMMARY REPORT: Certificate Number 890420W1.10075, International Business Machines Corporation, IBM Development System for the Ada Language, CMS/MVS Ada Cross Compiler, Version 2.1.1, Wright-Patterson AFB. The compiler was tested using command scripts provided by International Business Machines Corporation and reviewed by the validation team. The compiler was tested using all default...

  4. Ada (Trade Name) Compiler Validation Summary Report: DDC Ada Compiler System Version 3.1 for DEC VAX-11/785.

    DTIC Science & Technology

    1985-12-27

    [The available excerpt consists of garbled ACVC test-result tables listing individual test files (e.g. B36102A.ADA, C34001D-B.DEP) with pass/fail or N/A status; no further abstract text is recoverable.]

  5. Flow cytometry analysis of adenosine deaminase (ADA) expression: a simple and reliable tool for the assessment of ADA-deficient patients before and after gene therapy.

    PubMed

    Otsu, Makoto; Hershfield, Michael S; Tuschong, Laura M; Muul, Linda M; Onodera, Masafumi; Ariga, Tadashi; Sakiyama, Yukio; Candotti, Fabio

    2002-02-10

    Clinical gene therapy trials for adenosine deaminase (ADA) deficiency have shown limited success of corrective gene transfer into autologous T lymphocytes and CD34(+) cells. In these trials, the levels of gene transduction and expression in hematopoietic cells have been assessed by DNA- or RNA-based assays and measurement of ADA enzyme activity. Although informative, these methods are rarely applied to clonal analysis. The results of these assays therefore provide best estimates of transduction efficiency and gene expression in bulk populations based on the assumption that gene transfer and expression are uniformly distributed among transduced cells. As a useful additional tool for evaluation of ADA gene expression, we have developed a flow cytometry (fluorescence-activated cell sorting, FACS) assay capable of estimating the levels of intracellular ADA on a single-cell basis. We validated this technique with T cell lines and peripheral blood mononuclear cells (PBMCs) from ADA-deficient patients that showed severely reduced levels of ADA expression (ADA-dull) by FACS and Western blot analyses. After retrovirus-mediated ADA gene transfer, these cells showed clearly distinguishable populations exhibiting ADA expression (ADA-bright), thus allowing estimation of transduction efficiency. By mixing ADA-deficient and normal cells and using enzymatic amplification, we determined that our staining procedure could detect as little as 5% ADA-bright cells. This technique, therefore, will be useful to quickly assess the expression of ADA in hematopoietic cells of severe combined immunodeficient patients and represents an important tool for the follow-up of patients treated in clinical gene transfer protocols.

  6. Development and Demonstration of an Ada Test Generation System

    NASA Technical Reports Server (NTRS)

    1996-01-01

    In this project we have built a prototype system that performs Feasible Path Analysis on Ada programs: given a description of a set of control flow paths through a procedure, and a predicate at a program point, feasible path analysis determines if there is input data which causes execution to flow down some path in the collection, reaching the point so that the predicate is true. Feasible path analysis can be applied to program testing, program slicing, array bounds checking, and other forms of anomaly checking. FPA is central to most applications of program analysis. But, because this problem is formally unsolvable, syntactic-based approximations are used in its place. For example, in dead-code analysis the problem is to determine if there are any input values which cause execution to reach a specified program point. Instead, an approximation to this problem is computed: determine whether there is a control flow path from the start of the program to the point. This syntactic approximation is efficiently computable and conservative: if there is no such path the program point is clearly unreachable, but if there is such a path, the analysis is inconclusive, and the code is assumed to be live. Such conservative analysis too often yields unsatisfactory results because the approximation is too weak. As another example, consider data flow analysis. A du-pair is a pair of program points such that the first point is a definition of a variable and the second point a use, and for which there exists a definition-free path from the definition to the use. The sharper, semantic definition of a du-pair requires that there be a feasible definition-free path from the definition to the use. A compiler using du-pairs for detecting dead variables may miss optimizations by not considering feasibility. Similarly, a program analyzer computing program slices to merge parallel versions may report conflicts where none exist. In the context of software testing, feasibility analysis plays an
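    The syntactic approximation described above (plain reachability over the control-flow graph, ignoring whether any input actually makes a path feasible) can be sketched as a graph search. The CFG encoding and node names below are illustrative, not taken from the report:

    ```python
    from collections import deque

    def syntactically_reachable(cfg, entry, point):
        """Conservative dead-code check: treat a program point as live if
        ANY control-flow path reaches it, ignoring whether input data can
        actually drive execution down that path (feasibility)."""
        seen, work = {entry}, deque([entry])
        while work:
            node = work.popleft()
            if node == point:
                return True
            for succ in cfg.get(node, ()):
                if succ not in seen:
                    seen.add(succ)
                    work.append(succ)
        return False

    # Both branch targets are syntactically reachable even if one of them
    # is infeasible for every input; the approximation cannot tell.
    cfg = {"start": ["then", "else"], "then": ["end"], "else": ["end"]}
    ```

    Feasible path analysis sharpens this by asking whether input data exists that drives execution down the path, which is exactly what the syntactic search cannot decide.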

  7. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  8. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ - supplementary report

    SciTech Connect

    Dunning, Jr, D E; Pleasant, J C; Killough, G G

    1980-05-01

    The purpose of this report is to describe revisions in the SFACTOR computer code and to provide useful documentation for that program. The SFACTOR computer code has been developed to implement current methodologies for computing the average dose equivalent rate S(X ← Y) to specified target organs in man due to 1 µCi of a given radionuclide uniformly distributed in designated source organs. The SFACTOR methodology is largely based upon that of Snyder; however, it has been expanded to include components of S from alpha and spontaneous fission decay, in addition to electron and photon radiations. With this methodology, S-factors can be computed for any radionuclide for which decay data are available. The tabulations in Appendix II provide a reference compilation of S-factors for several dosimetrically important radionuclides which are not available elsewhere in the literature. These S-factors are calculated for an adult with characteristics similar to those of the International Commission on Radiological Protection's Reference Man. Corrections to tabulations from Dunning are presented in Appendix III, based upon the methods described in Section 2.3. 10 refs.
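    The computation the abstract describes follows the MIRD-style pattern: S is a sum over radiation components of mean emitted energy per decay times the absorbed fraction, divided by the target organ mass. A minimal sketch with illustrative numbers and a simplified unit treatment (not SFACTOR's actual implementation):

    ```python
    def s_factor(radiations, target_mass_g):
        """Average dose-equivalent rate S(X <- Y) per unit cumulated
        activity, as a MIRD-style sum over radiation components:

            S = sum_i(Delta_i * phi_i) / m_X

        radiations:    (Delta_i, phi_i) pairs, where Delta_i is the mean
                       energy emitted per decay and phi_i is the absorbed
                       fraction in target X for radiation i in source Y.
        target_mass_g: mass of the target organ X in grams.
        """
        return sum(delta * phi for delta, phi in radiations) / target_mass_g

    # Illustrative numbers only: a non-penetrating component fully
    # absorbed (phi = 1) plus a photon component with phi = 0.04,
    # in a 100 g target organ.
    example_s = s_factor([(0.5, 1.0), (1.0, 0.04)], 100.0)
    ```

    Alpha and spontaneous-fission components, SFACTOR's extension over Snyder's method, would simply contribute additional (Delta_i, phi_i) terms to the same sum.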

  9. Ada (Trademark) Compiler Validation Summary Report: Symbolics Incorporated. Symbolics Ada Compiler, Version 2.0, Symbolics 3670.

    DTIC Science & Technology

    1987-05-20

    invalid characters exist. $FILENAME_WITH_WILD_CARD_CHAR "eno:>testing>ada>c-tests>c*.*.*" An external file name that either contains a wild card character...or is too long if no wild card character exists. $GREATER_THAN_DURATION 86_401.0 A universal real value that lies between DURATION'BASE'LAST and 10

  10. Evolution of Ada technology in the flight dynamics area: Implementation/testing phase analysis

    NASA Technical Reports Server (NTRS)

    Quimby, Kelvin L.; Esker, Linda; Miller, John; Smith, Laurie; Stark, Mike; Mcgarry, Frank

    1989-01-01

    An analysis is presented of the software engineering issues related to the use of Ada for the implementation and system testing phases of four Ada projects developed in the flight dynamics area. These projects reflect an evolving understanding of more effective use of Ada features. In addition, the testing methodology used on these projects has changed substantially from that used on previous FORTRAN projects.

  11. Evolution of Ada technology in the flight dynamics area: Design phase analysis

    NASA Technical Reports Server (NTRS)

    Quimby, Kelvin L.; Esker, Linda

    1988-01-01

    The software engineering issues related to the use of the Ada programming language during the design phase of an Ada project are analyzed. Discussion shows how an evolving understanding of these issues is reflected in the design processes of three generations of Ada projects.

  12. 76 FR 38129 - Applications for New Awards; Americans With Disabilities Act (ADA) National Network Knowledge...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    ... Applications for New Awards; Americans With Disabilities Act (ADA) National Network Knowledge Translation... Rehabilitation Research Projects (DRRP)--The ADA National Network Knowledge Translation Center Notice inviting... April 28, 2006 (71 FR 25472). The ADA National Network Knowledge Translation Center priority is from...

  13. 78 FR 34095 - Adequacy Status of the Idaho, Northern Ada County PM10

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... AGENCY Adequacy Status of the Idaho, Northern Ada County PM 10 State Implementation Plan for... 2023 in the Northern Ada County PM 10 State Implementation Plan, Maintenance Plan: Ten-Year Update... in Northern Ada County. The EPA's finding was made pursuant to the adequacy review process...

  14. 78 FR 10263 - Proposed Collection; Comment Request for ADA Accommodations Request Packet

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ... Internal Revenue Service Proposed Collection; Comment Request for ADA Accommodations Request Packet AGENCY... U.S.C. 3506(c)(2)(A)). Currently, the IRS is soliciting comments concerning the ADA Accommodations... through the Internet, at Martha.R.Brinson@irs.gov. SUPPLEMENTARY INFORMATION: Title: ADA...

  15. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  16. ELAPSE - NASA AMES LISP AND ADA BENCHMARK SUITE: EFFICIENCY OF LISP AND ADA PROCESSING - A SYSTEM EVALUATION

    NASA Technical Reports Server (NTRS)

    Davis, G. J.

    1994-01-01

    One area of research of the Information Sciences Division at NASA Ames Research Center is devoted to the analysis and enhancement of processors and advanced computer architectures, specifically in support of automation and robotic systems. To compare systems' abilities to efficiently process Lisp and Ada, scientists at Ames Research Center have developed a suite of non-parallel benchmarks called ELAPSE. The benchmark suite was designed to test a single computer's efficiency as well as alternate machine comparisons on Lisp, and/or Ada languages. ELAPSE tests the efficiency with which a machine can execute the various routines in each environment. The sample routines are based on numeric and symbolic manipulations and include two-dimensional fast Fourier transformations, Cholesky decomposition and substitution, Gaussian elimination, high-level data processing, and symbol-list references. Also included is a routine based on a Bayesian classification program sorting data into optimized groups. The ELAPSE benchmarks are available for any computer with a validated Ada compiler and/or Common Lisp system. Of the 18 routines that comprise ELAPSE, provided within this package are 14 developed or translated at Ames. The others are readily available through literature. The benchmark that requires the most memory is CHOLESKY.ADA. Under VAX/VMS, CHOLESKY.ADA requires 760K of main memory. ELAPSE is available on either two 5.25 inch 360K MS-DOS format diskettes (standard distribution) or a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The ELAPSE benchmarks were written in 1990. VAX and VMS are trademarks of Digital Equipment Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
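    Among the numeric kernels the suite exercises is Cholesky decomposition and substitution. A generic textbook sketch of that routine (not the ELAPSE benchmark source, which ships with the package):

    ```python
    import math

    def cholesky(a):
        """Lower-triangular L with L * L^T = A, for a symmetric
        positive-definite matrix A given as a list of rows."""
        n = len(a)
        L = [[0.0] * n for _ in range(n)]
        for i in range(n):
            for j in range(i + 1):
                s = sum(L[i][k] * L[j][k] for k in range(j))
                if i == j:
                    L[i][j] = math.sqrt(a[i][i] - s)
                else:
                    L[i][j] = (a[i][j] - s) / L[j][j]
        return L

    def solve(L, b):
        """Solve A x = b given A = L L^T: forward substitution for
        L y = b, then back substitution for L^T x = y."""
        n = len(L)
        y = [0.0] * n
        for i in range(n):
            y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
        x = [0.0] * n
        for i in reversed(range(n)):
            x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
        return x
    ```

    A benchmark version would time these loops over large matrices; the O(n³/3) decomposition dominates, which is why CHOLESKY.ADA is the suite's most memory-hungry routine.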

  17. Space and Terrestrial Power System Integration Optimization Code BRMAPS for Gas Turbine Space Power Plants With Nuclear Reactor Heat Sources

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2007-01-01

    In view of the difficult times the US and global economies are experiencing today, funds for the development of advanced fission reactor nuclear power systems for space propulsion and planetary surface applications are currently not available. However, according to the Energy Policy Act of 2005, the U.S. needs to invest in developing fission reactor technology for ground-based terrestrial power plants. Such plants would make a significant contribution toward drastic reduction of worldwide greenhouse gas emissions and associated global warming. To accomplish this goal the Next Generation Nuclear Plant Project (NGNP) has been established by DOE under the Generation IV Nuclear Systems Initiative. Idaho National Laboratory (INL) was designated as the lead in the development of VHTR (Very High Temperature Reactor) and HTGR (High Temperature Gas Reactor) technology to be integrated with MMW (multi-megawatt) helium gas turbine driven electric power AC generators. However, the advantages of transmitting power in high-voltage DC form over large distances are also explored in the seminar lecture series. As an attractive alternate heat source, the Liquid Fluoride Reactor (LFR), pioneered at ORNL (Oak Ridge National Laboratory) in the mid-1960s, would offer much higher energy yields than current nuclear plants by using an inherently safe energy conversion scheme based on the Thorium --> U233 fuel cycle and a fission process with a negative temperature coefficient of reactivity. The power plants are to be sized to meet electric power demand during peak periods and also for providing thermal energy for hydrogen (H2) production during "off peak" periods. This approach will both supply electric power by using environmentally clean nuclear heat which does not generate greenhouse gases, and also provide a clean fuel, H2, for the future, when, due to increased global demand and the decline in discovering new deposits, our supply of liquid fossil fuels will have been used up.

  18. All Source Analysis System (ASAS): Migration from VAX to Alpha AXP computer systems

    NASA Technical Reports Server (NTRS)

    Sjoholm-Sierchio, Michael J.; Friedman, Steven Z. (Editor)

    1994-01-01

    The Jet Propulsion Laboratory's (JPL's) experience migrating existing VAX applications to Digital Equipment Corporation's new Alpha AXP processor is covered. The rapid development approach used during the 10-month period required to migrate the All Source Analysis System (ASAS), 1.5 million lines of FORTRAN, C, and Ada code, is also covered. ASAS, an automated tactical intelligence system, was developed by the Jet Propulsion Laboratory for the U. S. Army. Other benefits achieved as a result of the significant performance improvements provided by Alpha AXP platform are also described.

  19. Ada (Trade Name) Compiler Validation Summary Report: International Business Machines Corporation. IBM Development System for the Ada Language System, Version 1.1.0, IBM 4381 under MVS.

    DTIC Science & Technology

    1988-05-22

    TITLE (and Subtitle) 5. TYPE OF REPORT & PERIOD COVERED Ada Compiler Validation Summary Report: 22 May 1987 to 22 May 1988 International Business Machines...IBM Development System for the Ada Language System, Version 1.1.0, International Business Machines Corporation, Wright-Patterson AFB. IBM 4381 under...SUMMARY REPORT: International Business Machines Corporation IBM Development System for the Ada Language System, Version 1.1.0 IBM 4381 under MVS

  20. Ada Compiler Validation Summary Report: Certificate Number: 890621W1. 10105, Tandem Computers, Incorporated, Tandem Ada, Version T9270C30, Tandem NonStop VLX

    DTIC Science & Technology

    1989-06-21

    Validation Capability, ACVC, Validation Testing, Ada Validation Office, AVO, Ada Validation Facility, AVF, ANSI/MIL-STD-1815A, Ada Joint Program Office...by the Ada Validation Organization (AVO). On-site testing was completed 21 June 1989 at Cupertino, CA. 1.2 USE OF THIS VALIDATION SUMMARY REPORT...Consistent with the national laws of the originating country, the AVO may make full and free public disclosure of this report. In the United States, this

  1. [Gene therapy for adenosine deaminase (ADA) deficiency: review of the past, the present and the future].

    PubMed

    Ariga, T

    2001-01-01

    ADA deficiency was the first disease to be treated by gene therapy. Since the first trial of gene therapy, performed ten years ago, more than 10 patients with ADA deficiency, including our case, have been treated by gene therapy under different clinical protocols. In contrast to the recent successful report for X-SCID patients, however, no curative effect of gene therapy for ADA deficiency has been achieved to date. In this chapter, I review the past, the present and the future of gene therapy for ADA deficiency, and discuss issues, especially PEG-ADA therapy, regarding the prospects for stem cell gene therapy for the disease.

  2. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    The use and implementation of Ada in distributed environments in which reliability is the primary concern were investigated. A distributed system, programmed entirely in Ada, was studied to assess the use of individual tasks without concern for the processor used. Continued development and testing of the fault-tolerant Ada testbed; development of suggested changes to Ada to cope with the failures of interest; design of approaches to fault-tolerant software in real-time systems, and the integration of these ideas into Ada; and the preparation of various papers and presentations were discussed.

  3. Plaintiff cannot sue under ADA if he seeks disability benefits.

    PubMed

    1996-08-23

    [Name removed] sued a New Jersey Disney store for AIDS discrimination under the Americans with Disabilities Act (ADA). At the same time, [name removed] applied for disability benefits. A U.S. Circuit Court of Appeals ruled that a plaintiff who applies for disability benefits is no longer a qualified individual with a disability and therefore cannot sue for employment discrimination under the Americans with Disabilities Act (ADA). Disney fired [name removed] on November 18, 1993, after he took $2 from the cash register to buy cigarettes. Confronted with the accusation, [name removed] told his supervisor he had AIDS. He was fired immediately. Although [name removed] alleged Disney used the $2 cash transaction as a pretext to fire him, a Federal judge said [name removed] was barred from suing because of the inconsistency between his statements on the disability applications and on the lawsuit.

  4. Administration and scoring variance on the ADAS-Cog.

    PubMed

    Connor, Donald J; Sabbagh, Marwan N

    2008-11-01

    The Alzheimer's Disease Assessment Scale - Cognitive (ADAS-Cog) is the most commonly used primary outcome instrument in clinical trials for treatments of dementia. Variations in forms, administration procedures and scoring rules, along with rater turnover and intra-rater drift may decrease the reliability of the instrument. A survey of possible variations in the ADAS-Cog was administered to 26 volunteer raters at a clinical trials meeting. Results indicate notable protocol variations in the forms used, administration procedures, and scoring rules. Since change over time is used to determine treatment effect in clinical trials, standardizing the instrument's ambiguities and addressing common problems will greatly increase the instrument's reliability and thereby enhance its sensitivity to treatment effects.

  5. Evaluation of the scale dependent dynamic SGS model in the open source code caffa3d.MBRi in wall-bounded flows

    NASA Astrophysics Data System (ADS)

    Draper, Martin; Usera, Gabriel

    2015-04-01

    The Scale Dependent Dynamic Model (SDDM) has been widely validated in large-eddy simulations using pseudo-spectral codes [1][2][3]. The scale dependency, particularly the power law, has also been proved in a priori studies [4][5]. To the authors' knowledge there have been only a few attempts to use the SDDM in finite difference (FD) and finite volume (FV) codes [6][7], finding some improvements with the dynamic procedures (scale independent or scale dependent approach), but not showing the behavior of the scale-dependence parameter when using the SDDM. The aim of the present paper is to evaluate the SDDM in the open source code caffa3d.MBRi, an updated version of the code presented in [8]. caffa3d.MBRi is a FV code, second-order accurate, parallelized with MPI, in which the domain is divided into unstructured blocks of structured grids. To accomplish this, 2 cases are considered: flow between flat plates and flow over a rough surface with the presence of a model wind turbine, taking for this case the experimental data presented in [9]. In both cases the standard Smagorinsky Model (SM), the Scale Independent Dynamic Model (SIDM) and the SDDM are tested. As presented in [6][7], slight improvements are obtained with the SDDM. Nevertheless, the behavior of the scale-dependence parameter supports the generalization of the dynamic procedure proposed in the SDDM, particularly taking into account that no explicit filter is used (the implicit filter is unknown). [1] F. Porté-Agel, C. Meneveau, M.B. Parlange. "A scale-dependent dynamic model for large-eddy simulation: application to a neutral atmospheric boundary layer". Journal of Fluid Mechanics, 2000, 415, 261-284. [2] E. Bou-Zeid, C. Meneveau, M. Parlange. "A scale-dependent Lagrangian dynamic model for large eddy simulation of complex turbulent flows". Physics of Fluids, 2005, 17, 025105 (18p). [3] R. Stoll, F. Porté-Agel. "Dynamic subgrid-scale models for momentum and scalar fluxes in large-eddy simulations of
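    For context, the closures being compared can be stated compactly; this is the standard formulation from the LES literature, not an excerpt from the paper:

    ```latex
    % Smagorinsky closure for the deviatoric subgrid-scale stress:
    \tau_{ij} - \tfrac{1}{3}\,\delta_{ij}\,\tau_{kk}
        = -2\,(C_s \Delta)^2\,|\bar{S}|\,\bar{S}_{ij},
    \qquad
    |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} .

    % The scale-dependent dynamic procedure lets the coefficient vary
    % between the grid filter and the test filter; the scale-dependence
    % parameter is
    \beta = \frac{C_{s,2\Delta}^{2}}{C_{s,\Delta}^{2}},
    % with beta = 1 recovering the scale-invariant dynamic model (SIDM).
    ```

    The behavior of β away from 1 is exactly what the paper examines in the FV setting, where the implicit filter is unknown.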

  6. Test Case Study: Estimating the Cost of Ada Software Development

    DTIC Science & Technology

    1989-04-01

    selected Navy data were used to develop the SASET model [HEA89]. SASET is meant to be used to estimate development in Assembly, Ada, or any HOL. However...3.4 Criteria For Ratings Selection ... 3.4.1 Normalization ... 3.4.2 Comparison of Model Results...costing issues. The guiding principle for model selection for inclusion in the study was the availability of models to the AF, USACEAC, and IITRI

  7. Insurance benefits under the ADA: Discrimination or business as usual?

    SciTech Connect

    McFadden, M.E.

    1993-12-31

    In December 1987, John McGann discovered he had AIDS. In July 1988, his employer altered his health insurance policy by reducing lifetime coverage for AIDS to $5,000, while maintaining the million-dollar limit for all other health conditions. The United States Court of Appeals for the Fifth Circuit upheld the employer's right to make that change. The Supreme Court denied certiorari. Public outcry was immediate and voluminous. The Solicitor General argued that the new Americans with Disabilities Act would save future John McGanns from the same treatment, but the validity of this optimistic prediction is yet to be determined. The Americans with Disabilities Act of 1990 (ADA) is landmark legislation that bars discrimination against the disabled in all aspects of employment, public services, and accommodations. The Act broadly defines disability to include illnesses such as AIDS and cancer, as well as limitations on mobility, vision, and hearing. The ADA indisputably creates a private cause of action for discrimination on the basis of disability. However, depending on the standard of review chosen by the federal courts, this cause of action may or may not provide much protection to those claiming discrimination on the basis of disability in employee benefits and insurance. This article discusses the ADA's coverage of insurance and benefits in light of the possible standards courts might use to evaluate actions of parties in suits alleging discrimination in these areas and applies those standards of review to the facts of the McGann case. 146 refs.

  8. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames.) Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GEDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
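    The quaternion support described above centers on the Hamilton product, which the package exposes as an infix operator rather than a function call. A language-neutral sketch of that product (illustrative Python, not the package's Ada source):

    ```python
    def quat_mul(p, q):
        """Hamilton product of two quaternions in (w, x, y, z) order,
        the operation used to compose rotations between coordinate
        frames."""
        pw, px, py, pz = p
        qw, qx, qy, qz = q
        return (pw * qw - px * qx - py * qy - pz * qz,
                pw * qx + px * qw + py * qz - pz * qy,
                pw * qy - px * qz + py * qw + pz * qx,
                pw * qz + px * qy - py * qx + pz * qw)
    ```

    In the Ada package, defining this as an infix `"*"` operator on a quaternion type lets flight-software expressions read the same way the HAL/S originals did.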

  9. The Definition of Production Quality Ada(trade name) Compiler.

    DTIC Science & Technology

    1987-03-20

    UNIT ELEMENT NO. NO. NO. ACCESSION NO. 11. TITLE (Include Security Classification) The Definition of a Production Quality Ada Compiler 12. PERSONAL...DIVISION, AIR FORCE SYSTEMS COMMAND, Los Angeles Air Force Station, P.O. Box 92960, Worldway Postal Center, Los Angeles, CA 90009-2960. APPROVED FOR PUBLIC...Department, Engineering Group. Mr. Giovanni Bargero, SD/ALR, approved the report for the Air Force. This report has been reviewed by the Public Affairs

  10. LIGHT CURVES OF CORE-COLLAPSE SUPERNOVAE WITH SUBSTANTIAL MASS LOSS USING THE NEW OPEN-SOURCE SUPERNOVA EXPLOSION CODE (SNEC)

    SciTech Connect

    Morozova, Viktoriya; Renzo, Mathieu; Ott, Christian D.; Clausen, Drew; Couch, Sean M.; Ellis, Justin; Roberts, Luke F.; Piro, Anthony L.

    2015-11-20

    We present the SuperNova Explosion Code (SNEC), an open-source Lagrangian code for the hydrodynamics and equilibrium-diffusion radiation transport in the expanding envelopes of supernovae. Given a model of a progenitor star, an explosion energy, and an amount and distribution of radioactive nickel, SNEC generates the bolometric light curve, as well as the light curves in different broad bands assuming blackbody emission. As a first application of SNEC, we consider the explosions of a grid of 15 M_⊙ (at zero-age main sequence, ZAMS) stars whose hydrogen envelopes are stripped to different extents and at different points in their evolution. The resulting light curves exhibit plateaus with durations of ∼20–100 days if ≳1.5–2 M_⊙ of hydrogen-rich material is left and no plateau if less hydrogen-rich material is left. If these shorter plateau lengths are not seen for SNe IIP in nature, it suggests that, at least for ZAMS masses ≲20 M_⊙, hydrogen mass loss occurs as an all-or-nothing process. This perhaps points to the important role binary interactions play in generating the observed mass-stripped supernovae (i.e., Type Ib/c events). These light curves are also unlike what is typically seen for SNe IIL, arguing that simply varying the amount of mass loss cannot explain these events. The most stripped models begin to show double-peaked light curves similar to what is often seen for SNe IIb, confirming previous work that these supernovae can come from progenitors that have a small amount of hydrogen and a radius of ∼500 R_⊙.

  11. Development and implementation in the Monte Carlo code PENELOPE of a new virtual source model for radiotherapy photon beams and portal image calculation

    NASA Astrophysics Data System (ADS)

    Chabert, I.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; Croc de Suray, A.; Garcia-Hernandez, J. C.; Gempp, S.; Benkreira, M.; de Carlan, L.; Lazaro, D.

    2016-07-01

    This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r_s) in the PS plane, its energy (E), and its polar and azimuthal angles (φ_d and θ_d), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated histograms were built by storing E, r_s, φ_d and θ_d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure efficient histogram handling as well as an accurate reproduction of the E, r_s, φ_d and θ_d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r_s, φ_d and θ_d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.

  12. Development and implementation in the Monte Carlo code PENELOPE of a new virtual source model for radiotherapy photon beams and portal image calculation.

    PubMed

    Chabert, I; Barat, E; Dautremer, T; Montagu, T; Agelou, M; Croc de Suray, A; Garcia-Hernandez, J C; Gempp, S; Benkreira, M; de Carlan, L; Lazaro, D

    2016-07-21

    This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r_s) in the PS plane, its energy (E), and its polar and azimuthal angles (φ_d and θ_d), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated histograms were built by storing E, r_s, φ_d and θ_d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure efficient histogram handling as well as an accurate reproduction of the E, r_s, φ_d and θ_d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r_s, φ_d and θ_d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.

  13. Chagas Parasite Detection in Blood Images Using AdaBoost

    PubMed Central

    Uc-Cetina, Víctor; Brito-Loeza, Carlos; Ruiz-Piña, Hugo

    2015-01-01

    The Chagas disease is a potentially life-threatening illness caused by the protozoan parasite, Trypanosoma cruzi. Visual detection of such parasite through microscopic inspection is a tedious and time-consuming task. In this paper, we provide an AdaBoost learning solution to the task of Chagas parasite detection in blood images. We give details of the algorithm and our experimental setup. With this method, we get 100% and 93.25% of sensitivity and specificity, respectively. A ROC comparison with the method most commonly used for the detection of malaria parasites based on support vector machines (SVM) is also provided. Our experimental work shows mainly two things: (1) Chagas parasites can be detected automatically using machine learning methods with high accuracy and (2) AdaBoost + SVM provides better overall detection performance than AdaBoost or SVMs alone. Such results are the best ones known so far for the problem of automatic detection of Chagas parasites through the use of machine learning, computer vision, and image processing methods. PMID:25861375

  14. Patient can't get ADA relief unless discrimination persists.

    PubMed

    1997-09-05

    Several United States courts have ruled that under Title III of the Americans with Disabilities Act (ADA), HIV-positive patients have no recourse against a physician who has discriminated against them, provided that the patients do not plan to visit the doctor in the future. If the patient states that he or she does not intend to use the physician's or facility's services again, there is no basis for injunctive relief under the ADA. This issue is illustrated by the case of [name removed], an HIV-positive patient residing in Atlanta, GA, who attempted to sue [name removed], a plastic surgeon who refused to perform a cosmetic implant procedure. [Name removed] claimed that the procedure was a direct threat to [name removed]'s health and refused to treat him. When [name removed] vowed that he would never again seek medical care from [name removed] he lost his opportunity to file suit against [name removed] under the ADA. Additionally, [name removed] did not prove that the cosmetic procedure would not pose a direct threat to his own health in the form of a postoperative infection. This position has been accepted by a number of courts, including [name removed] v. St. Helena Hospital (CA).

  15. Epitope characterization of the ADA response directed against a targeted immunocytokine.

    PubMed

    Stubenrauch, Kay; Künzel, Christian; Vogel, Rudolf; Tuerck, Dietrich; Schick, Eginhard; Heinrich, Julia

    2015-10-10

Targeted immunocytokines (TICs) display potent activity in selective tumor suppression. This class of multi-domain biotherapeutics (MDBs) is composed of three major domains, Fab, Fc, and a cytokine, and may induce a complex polyclonal anti-drug antibody (ADA) response. However, classical ADA assays usually are not suitable to specify ADAs and to identify the immunogenic domains of a TIC. The purpose of the present study was to establish epitope characterization of ADA responses in order to specify immunogenic responses against a TIC and their direct impact on the pharmacokinetic profile, safety, and efficacy. Domain detection assays (DDAs) and domain competition assays (DCAs), based on standard ADA screening and confirmation assays, respectively, were established and compared using 12 ADA-positive samples obtained from a cynomolgus monkey study in early development. Both domain-specific assays were sensitive enough to preserve the positive screening assay result and showed overall accordance in the evaluation of domain-specific ADA responses. About half of the samples displayed one ADA specificity, either for the Fab or for the cytokine (Cy) domain, and the remaining samples showed a combination of Fab-specific and Cy-specific ADA fractions. Fc-specific ADAs occurred in only one sample. In-depth comparison of DCAs and DDAs showed that both assays are appropriate for assessing multi-specific ADA responses as well as minor ADA fractions. An advantage of DCAs is typically fast and easy assay establishment, whereas DDAs in some cases may be superior for assessing low-abundance ADAs in multi-specific responses. Our results reveal that both approaches benefit from thorough reagent development as an essential precondition for reliable epitope characterization of ADA responses.

  16. Alteration/deficiency in activation-3 (Ada3) plays a critical role in maintaining genomic stability

    PubMed Central

    Mirza, Sameer; Katafiasz, Bryan J.; Kumar, Rakesh; Wang, Jun; Mohibi, Shakur; Jain, Smrati; Gurumurthy, Channabasavaiah Basavaraju; Pandita, Tej K.; Dave, Bhavana J.; Band, Hamid; Band, Vimla

    2012-01-01

    Cell cycle regulation and DNA repair following damage are essential for maintaining genome integrity. DNA damage activates checkpoints in order to repair damaged DNA prior to exit to the next phase of cell cycle. Recently, we have shown the role of Ada3, a component of various histone acetyltransferase complexes, in cell cycle regulation, and loss of Ada3 results in mouse embryonic lethality. Here, we used adenovirus-Cre-mediated Ada3 deletion in Ada3fl/fl mouse embryonic fibroblasts (MEFs) to assess the role of Ada3 in DNA damage response following exposure to ionizing radiation (IR). We report that Ada3 depletion was associated with increased levels of phospho-ATM (pATM), γH2AX, phospho-53BP1 (p53BP1) and phospho-RAD51 (pRAD51) in untreated cells; however, radiation response was intact in Ada3−/− cells. Notably, Ada3−/− cells exhibited a significant delay in disappearance of DNA damage foci for several critical proteins involved in the DNA repair process. Significantly, loss of Ada3 led to enhanced chromosomal aberrations, such as chromosome breaks, fragments, deletions and translocations, which further increased upon DNA damage. Notably, the total numbers of aberrations were more clearly observed in S-phase, as compared with G₁ or G₂ phases of cell cycle with IR. Lastly, comparison of DNA damage in Ada3fl/fl and Ada3−/− cells confirmed higher residual DNA damage in Ada3−/− cells, underscoring a critical role of Ada3 in the DNA repair process. Taken together, these findings provide evidence for a novel role for Ada3 in maintenance of the DNA repair process and genomic stability. PMID:23095635

  17. The ADA Complex Is a Distinct Histone Acetyltransferase Complex in Saccharomyces cerevisiae

    PubMed Central

    Eberharter, Anton; Sterner, David E.; Schieltz, David; Hassan, Ahmed; Yates, John R.; Berger, Shelley L.; Workman, Jerry L.

    1999-01-01

    We have identified two Gcn5-dependent histone acetyltransferase (HAT) complexes from Saccharomyces cerevisiae, the 0.8-MDa ADA complex and the 1.8-MDa SAGA complex. The SAGA (Spt-Ada-Gcn5-acetyltransferase) complex contains several subunits which also function as part of other protein complexes, including a subset of TATA box binding protein-associated factors (TAFIIs) and Tra1. These observations raise the question of whether the 0.8-MDa ADA complex is a subcomplex of SAGA or whether it is a distinct HAT complex that also shares subunits with SAGA. To address this issue, we sought to determine if the ADA complex contained subunits that are not present in the SAGA complex. In this study, we report the purification of the ADA complex over 10 chromatographic steps. By a combination of mass spectrometry analysis and immunoblotting, we demonstrate that the adapter proteins Ada2, Ada3, and Gcn5 are indeed integral components of ADA. Furthermore, we identify the product of the S. cerevisiae gene YOR023C as a novel subunit of the ADA complex and name it Ahc1 for ADA HAT complex component 1. Biochemical functions of YOR023C have not been reported. However, AHC1 in high copy numbers suppresses the cold sensitivity caused by particular mutations in HTA1 (I. Pinto and F. Winston, personal communication), which encodes histone H2A (J. N. Hirschhorn et al., Mol. Cell. Biol. 15:1999–2009, 1995). Deletion of AHC1 disrupted the integrity of the ADA complex but did not affect SAGA or give rise to classic Ada− phenotypes. These results indicate that Gcn5, Ada2, and Ada3 function as part of a unique HAT complex (ADA) and represent shared subunits between this complex and SAGA. PMID:10490601

  18. Keno-Nr a Monte Carlo Code Simulating the Californium -252-SOURCE-DRIVEN Noise Analysis Experimental Method for Determining Subcriticality

    NASA Astrophysics Data System (ADS)

    Ficaro, Edward Patrick

The ²⁵²Cf-source-driven noise analysis (CSDNA) method requires the measurement of the cross power spectral density (CPSD) G23(ω) between a pair of neutron detectors (subscripts 2 and 3) located in or near the fissile assembly, and the CPSDs G12(ω) and G13(ω) between the neutron detectors and an ionization chamber 1 containing ²⁵²Cf, also located in or near the fissile assembly. The key advantage of this method is that the subcriticality of the assembly can be obtained from the ratio of spectral densities, G12*(ω)G13(ω)/[G11(ω)G23(ω)], using a point kinetic model formulation that is independent of the detectors' properties and a reference measurement. The multigroup Monte Carlo code KENO-NR was developed to eliminate the dependence of the measurement on the point kinetic formulation. This code utilizes time-dependent, analog neutron tracking to simulate the experimental method, in addition to the underlying nuclear physics, as closely as possible. From a direct comparison of simulated and measured data, the calculational model and cross sections are validated for the calculation, and KENO-NR can then be rerun to provide a distributed-source k_eff calculation. Depending on the fissile assembly, a few hours to a couple of days of computation time are needed for a typical simulation executed on a desktop workstation. In this work, KENO-NR demonstrated the ability to accurately estimate the measured ratio of spectral densities from experiments using capture detectors performed on uranium metal cylinders, a cylindrical tank filled with aqueous uranyl nitrate, and arrays of safe storage bottles filled with uranyl nitrate. Good agreement was also seen between simulated and measured values of the prompt neutron decay constant from the fitted CPSDs. Poor agreement was seen between simulated and measured results using composite ⁶Li-glass-plastic scintillators at large subcriticalities for the tank of
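
The ratio of spectral densities at the heart of the CSDNA method can be illustrated numerically. The synthetic signals below are a hedged stand-in (a white-noise "source" channel and two correlated detector channels), not a neutron-transport simulation, and Welch's method via `scipy.signal.csd` is an implementation choice, not the code described in the abstract:

```python
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(1)
fs, n = 1000.0, 200_000
src  = rng.normal(size=n)                # "252Cf chamber" signal (ch 1)
det2 = 0.6 * src + rng.normal(size=n)    # detector 2: source + noise
det3 = 0.4 * src + rng.normal(size=n)    # detector 3: source + noise

f, g11 = csd(src,  src,  fs=fs, nperseg=1024)   # auto-spectrum of ch 1
_, g12 = csd(src,  det2, fs=fs, nperseg=1024)
_, g13 = csd(src,  det3, fs=fs, nperseg=1024)
_, g23 = csd(det2, det3, fs=fs, nperseg=1024)

# The CSDNA spectral ratio: G12*(w) G13(w) / [G11(w) G23(w)]
ratio = np.conj(g12) * g13 / (g11 * g23)
print(np.abs(ratio[:5]).round(2))
```

In this toy construction, where the detector channels correlate only through the common source, the ratio hovers near one; in the actual measurement its departure from that baseline is what encodes the assembly's subcriticality.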

  19. Ada (Trademark) Compiler Validation Summary Report: Telesoft Ada Compiler Version 2.0a6 for VAX-11/780, Using UNIX (Trademark) 4.2 BSD (Mt. Xinu).

    DTIC Science & Technology

    1985-02-05

file name ending with .TST means the test depends on one or more of the implementation-dependent parameters listed in section 4.1. A file name ending...

  20. Source Code Analysis Laboratory (SCALe)

    DTIC Science & Technology

    2012-04-01

revenue. Among respondents to the IAAR survey, 86% of companies certified in quality management realized a positive return on investment (ROI). An...SCALe undertakes. Testing and calibration laboratories that comply with ISO/IEC 17025 also operate in accordance with ISO 9001. • NIST National...17025:2005 accredited and ISO 9001:2008 registered. 4.3 SAIC Accreditation and Certification Services SAIC (Science Applications International

  1. Distribution of Isotopic and Environmental Tracers in Groundwater, Northern Ada County, Southwestern Idaho

    USGS Publications Warehouse

    Adkins, Candice B.; Bartolino, James R.

    2010-01-01

Residents of northern Ada County, Idaho, depend on groundwater for domestic and agricultural uses. The population of this area is growing rapidly and groundwater resources must be understood for future water-resource management. The U.S. Geological Survey, in cooperation with the Idaho Department of Water Resources, used a suite of isotopic and environmental tracers to gain a better understanding of groundwater ages, recharge sources, and flowpaths in northern Ada County. Thirteen wells were sampled between September and October 2009 for field parameters, major anions and cations, nutrients, oxygen and hydrogen isotopes, tritium, radiocarbon, chlorofluorocarbons, and dissolved gases. Well depths ranged from 30 to 580 feet below land surface. Wells were grouped together based on their depth and geographic location into the following four categories: shallow aquifer, intermediate/deep aquifer, Willow Creek aquifer, and Dry Creek aquifer. Major cations and anions indicated calcium-bicarbonate and sodium-bicarbonate water types in the study area. Oxygen and hydrogen isotopes carried an oxygen-18 excess signature, possibly indicating recharge from evaporated sources or water-rock interactions in the subsurface. Chlorofluorocarbons detected modern (post-1940s) recharge in every well sampled; tritium data indicated modern water (post-1951) in seven, predominantly shallow, wells. Nutrient concentrations tended to be greater in wells signaling recent recharge based on groundwater age dating, thus confirming the presence of recent recharge in these wells. Corrected radiocarbon results generated estimated residence times from modern to 5,100 years before present. Residence time tended to increase with depth, as confirmed by all three age tracers. The disagreement among residence times indicates that samples were well mixed and that the sampled aquifers contain a mixture of young and old recharge. Due to a lack of data, no conclusions about sources of recharge could be drawn.

  2. NASA-evolving to Ada: Five-year plan. A plan for implementing recommendations made by the Ada and software management assessment working group

    NASA Technical Reports Server (NTRS)

    1989-01-01

    At their March 1988 meeting, members of the National Aeronautics and Space Administration (NASA) Information Resources Management (IRM) Council expressed concern that NASA may not have the infrastructure necessary to support the use of Ada for major NASA software projects. Members also observed that the agency has no coordinated strategy for applying its experiences with Ada to subsequent projects (Hinners, 27 June 1988). To deal with these problems, the IRM Council chair appointed an intercenter Ada and Software Management Assessment Working Group (ASMAWG). They prepared a report (McGarry et al., March 1989) entitled, 'Ada and Software Management in NASA: Findings and Recommendations'. That report presented a series of recommendations intended to enable NASA to develop better software at lower cost through the use of Ada and other state-of-the-art software engineering technologies. The purpose here is to describe the steps (called objectives) by which this goal may be achieved, to identify the NASA officials or organizations responsible for carrying out the steps, and to define a schedule for doing so. This document sets forth four goals: adopt agency-wide software standards and policies; use Ada as the programming language for all mission software; establish an infrastructure to support software engineering, including the use of Ada, and to leverage the agency's software experience; and build the agency's knowledge base in Ada and software engineering. A schedule for achieving the objectives and goals is given.

  3. SPECT Imaging of 2-D and 3-D Distributed Sources with Near-Field Coded Aperture Collimation: Computer Simulation and Real Data Validation.

    PubMed

    Mu, Zhiping; Dobrucki, Lawrence W; Liu, Yi-Hwa

The imaging of distributed sources with near-field coded aperture (CA) remains extremely challenging and is broadly considered unsuitable for single-photon emission computerized tomography (SPECT). This study proposes a novel CA SPECT reconstruction approach and evaluates the feasibility of imaging and reconstructing distributed hot sources and cold lesions using near-field CA collimation and iterative image reconstruction. Computer simulations were designed to compare CA and pinhole collimations in two-dimensional radionuclide imaging. Digital phantoms were created and CA images of the phantoms were reconstructed using maximum likelihood expectation maximization (MLEM). Errors and the contrast-to-noise ratio (CNR) were calculated and image resolution was evaluated. An ex vivo rat heart with myocardial infarction was imaged using a micro-SPECT system equipped with a custom-made CA module and a commercial 5-pinhole collimator. Rat CA images were reconstructed via the three-dimensional (3-D) MLEM algorithm developed for CA SPECT, with and without correction for a large projection angle, and 5-pinhole images were reconstructed using the commercial software provided by the SPECT system. Phantom images of CA were markedly improved in terms of image quality, quantitative root-mean-squared error, and CNR, as compared to pinhole images. CA and pinhole images yielded similar image resolution, while CA collimation resulted in fewer noise artifacts. CA and pinhole images of the rat heart were well reconstructed and the myocardial perfusion defects could be clearly discerned from 3-D CA and 5-pinhole SPECT images, whereas 5-pinhole SPECT images suffered from severe noise artifacts. Image contrast of CA SPECT was further improved after correction for the large projection angle used in the rat heart imaging. The computer simulations and small-animal imaging study presented herein indicate that the proposed 3-D CA SPECT imaging and reconstruction approaches worked reasonably
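
The MLEM algorithm named above has a compact multiplicative update, x ← x · Aᵀ(y / Ax) / Aᵀ1. The following is a minimal 1-D sketch under an assumed random system matrix, not the authors' 3-D coded-aperture SPECT implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.0, 1.0, size=(40, 20))   # toy system (projection) matrix
x_true = rng.uniform(0.5, 2.0, size=20)    # toy source distribution
y = A @ x_true                             # noiseless projections

x = np.ones(20)                            # uniform initial estimate
sens = A.T @ np.ones(40)                   # sensitivity term A^T 1
for _ in range(200):
    # Forward-project, compare with measured counts, back-project ratio
    x *= (A.T @ (y / (A @ x))) / sens      # multiplicative MLEM update

print(float(np.max(np.abs(A @ x - y))))    # forward-projection mismatch
```

Because the update is multiplicative, a non-negative starting image stays non-negative throughout, which is one reason MLEM is the standard choice for emission tomography counts.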

  4. Ada Compiler Validation Summary Report: R.R. Software, Inc., Janus/ADA, Version 2.1.3, IBM PS/2, Model 70-80386 (Host & Target) 890919W1.10158

    DTIC Science & Technology

    1989-09-19

Capability, ACVC, Validation Testing, Ada Validation Office, AVO, Ada Validation Facility, AVF, ANSI/MIL-STD-1815A, Ada Joint Program Office, AJPO 20...the direction of the AVF according to procedures established by the Ada Joint Program Office and administered by the Ada Validation Organization (AVO)...originating country, the AVO may make full and free public disclosure of this report. In the United States, this is provided in accordance with the

  5. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second more flexible and possibly even easier to use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  6. Evaluation of the area factor used in the RESRAD code for the estimation of airborne contaminant concentrations of finite area sources

    SciTech Connect

    Chang, Y.S.; Yu, C.; Wang, S.K.

    1998-07-01

    The area factor is used in the RESRAD code to estimate the airborne contaminant concentrations for a finite area of contaminated soils. The area factor model used in RESRAD version 5.70 and earlier (referred to as the old area factor) was a simple, but conservative, mixing model that tended to overestimate the airborne concentrations of radionuclide contaminants. An improved and more realistic model for the area factor (referred to here as the new area factor) is described in this report. The new area factor model is designed to reflect site-specific soil characteristics and meteorological conditions. The site-specific parameters considered include the size of the source area, average particle diameter, and average wind speed. Other site-specific parameters (particle density, atmospheric stability, raindrop diameter, and annual precipitation rate) were assumed to be constant. The model uses the Gaussian plume model combined with contaminant removal processes, such as dry and wet deposition of particulates. Area factors estimated with the new model are compared with old area factors that were based on the simple mixing model. In addition, sensitivity analyses are conducted for parameters assumed to be constant. The new area factor model has been incorporated into RESRAD version 5.75 and later.
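
The Gaussian plume building block mentioned above can be sketched for a ground-level source. The linear growth of σ_y and σ_z with downwind distance and the coefficients below are placeholder assumptions; RESRAD's actual deposition-corrected area-factor parameterization is not reproduced here:

```python
import numpy as np

def plume_conc(q, u, x, y, a=0.08, b=0.06):
    """Ground-level Gaussian plume concentration chi(x, y, z=0) for a
    ground-level point source (ground reflection included).
    q: release rate, u: mean wind speed, x: downwind distance,
    y: crosswind offset; sigma_y = a*x and sigma_z = b*x are
    simplified stability-class assumptions."""
    sig_y, sig_z = a * x, b * x
    return (q / (np.pi * u * sig_y * sig_z)
            * np.exp(-y**2 / (2.0 * sig_y**2)))

# Concentration 100 m downwind: plume centerline vs 50 m off-axis
print(plume_conc(1.0, 3.0, 100.0, 0.0))
print(plume_conc(1.0, 3.0, 100.0, 50.0))
```

The new area factor additionally attenuates the plume for dry and wet particulate deposition and integrates over the finite source area, both of which this sketch omits.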

  7. Ada (Trade name) Validation Summary Report: TeleSoft, Inc., TeleSoft Ada Compiler, Version 2.3C3 for the Gould CONCEPT/32 (Trade name) Model 6750 under MPX, Version 3.2. Completion of On-Site Validation.

    DTIC Science & Technology

    1985-12-06


  8. Ada (Trade Name) Compiler Validation Summary Report. ALSYS AlsyCOMP 001, Version 1.3, VAX-11/750 Host, Altos ACS 68000 14 Target.

    DTIC Science & Technology

    1985-11-08


  9. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principle Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  10. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  11. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  12. Reference Manual for the Ada Programming Language. Proposed Standard Document

    DTIC Science & Technology

    1980-07-01

be given in their presence. Example of conflicting names in two packages: procedure R is use TRAFFIC, WATERCOLORS; -- subtypes used to resolve the conflicting type name COLOR subtype T_COLOR is TRAFFIC.COLOR; subtype W_COLOR is WATERCOLORS.COLOR; SIGNAL : T_COLOR; PAINT : W_COLOR; begin SIGNAL := GREEN; -- that of TRAFFIC PAINT := GREEN; -- that of WATERCOLORS end R; Use Clauses 8.4 Ada Reference Manual Example of name identification with a use

  13. An Overview of Advanced Data Acquisition System (ADAS)

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Steinrock, T. (Technical Monitor)

    2001-01-01

The paper discusses the following: 1. Historical background. 2. What is ADAS? 3. R and D status. 4. Reliability/cost examples (1, 2, and 3). 5. What's new? 6. Technical advantages. 7. NASA relevance. 8. NASA plans/options. 9. Remaining R and D. 10. Applications. 11. Product benefits. 12. Commercial advantages. 13. Intellectual property. The aerospace industry requires highly reliable data acquisition systems. Traditional acquisition systems employ end-to-end hardware and software redundancy. Typically, redundancy adds weight, cost, power consumption, and complexity.

  14. An approach to distributed execution of Ada programs

    NASA Technical Reports Server (NTRS)

    Volz, R. A.; Krishnan, P.; Theriault, R.

    1987-01-01

    Intelligent control of the Space Station will require the coordinated execution of computer programs across a substantial number of computing elements. It will be important to develop large subsets of these programs in the form of a single program which executes in a distributed fashion across a number of processors. A translation strategy for distributed execution of Ada programs in which library packages and subprograms may be distributed is described. A preliminary version of the translator is operational. Simple data objects (no records or arrays as yet), subprograms, and static tasks may be referenced remotely.

  15. Ada as a Hardware Description Language: An Initial Report

    DTIC Science & Technology

    2014-02-25

language. We are not...Ada is a trademark of the US Government Ada Joint Program Office...by Martin Marietta Corporation and in part by the Defense Advanced Research Projects Agency (DOD), DARPA Order No. 3597, monitored by the Air Force Avionics Laboratory under Contract F33615-81-K-1539, and DARPA Order..., monitored by the Office of Naval Research, under Contract MDA 903-81-C-0411

  16. Quantum AdaBoost algorithm via cluster state

    NASA Astrophysics Data System (ADS)

    Li, Yuan

    2017-03-01

The principle and theory of quantum computation have been investigated by researchers for many years, and further applied to improve the efficiency of classical machine learning algorithms. Based on this physical mechanism, a quantum version of the AdaBoost (Adaptive Boosting) training algorithm is proposed in this paper, the purpose of which is to construct a strong classifier. In the proposed scheme, a cluster state is used to realize the weak learning algorithm and then update the corresponding weights of the examples. As a result, a final classifier can be obtained by efficiently combining weak hypotheses, with measurements on the cluster state used to reweight the distribution of examples.
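
The reweighting step that the quantum scheme mirrors is the classical AdaBoost update, w_i ← w_i·exp(−α·y_i·h(x_i)), followed by normalization. A from-scratch sketch on a toy 1-D problem (the threshold-stump learner and the data are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=200)
y = np.where(X > 0.1, 1, -1)         # toy labels in {-1, +1}

w = np.full(200, 1 / 200)            # uniform initial example weights
learners = []                        # (threshold, sign, alpha)
for _ in range(10):
    # Weak learner: exhaustive search for the best decision stump
    # under the current weight distribution
    t, s = min(((t, s) for t in X for s in (1, -1)),
               key=lambda ts: float(np.sum(
                   w * (np.where(X > ts[0], ts[1], -ts[1]) != y))))
    pred = np.where(X > t, s, -s)
    eps = max(float(np.sum(w[pred != y])), 1e-12)  # weighted error
    alpha = 0.5 * np.log((1 - eps) / eps)          # learner vote weight
    w *= np.exp(-alpha * y * pred)                 # up-weight mistakes
    w /= w.sum()                                   # renormalize
    learners.append((t, s, alpha))

# Strong classifier: sign of the alpha-weighted stump votes
F = sum(a * np.where(X > t, s, -s) for t, s, a in learners)
print(float(np.mean(np.sign(F) == y)))             # training accuracy
```

Each round concentrates weight on the examples the current stump gets wrong, which is exactly the distribution-update the quantum variant seeks to realize via cluster-state measurement.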

  17. Antiepileptogenic effects of glutathione against increased brain ADA in PTZ-induced epilepsy.

    PubMed

    Pence, Sadrettin; Erkutlu, Ibrahim; Kurtul, Naciye; Bosnak, Mehmet; Alptekin, Mehmet; Tan, Uner

    2009-01-01

Adenosine has been shown to play a significant role as a modulator of neuronal activity in convulsive disorders, acting as an endogenous anticonvulsant agent. Any change in adenosine deaminase (ADA) levels will be reflected in adenosine levels. In the present study, we investigated the effect of glutathione on brain tissue ADA levels due to seizures induced by convulsive and subconvulsive doses of pentylenetetrazol (PTZ) in mice. ADA levels were measured using the Giusti method. ADA levels were higher in the experimental epilepsy groups than in the control and sham groups. ADA levels significantly decreased in the glutathione groups, suggesting that glutathione may have antiseizure effects. Decreased levels of ADA would lead to increased adenosine levels, protecting against oxidative stress.

  18. Ada Adoption Handbook: A Program Manager’s Guide, Version 2.0

    DTIC Science & Technology

    1992-10-01

critical, and embedded applications. This handbook presents program managers with information to make effective use of Ada. This handbook provides...(for example, ground-based command and control systems or embedded avionics) or specific programs. This handbook is one of two volumes of the Ada...using a single language, and software maintainability and reliability [193, 29, 107, 191]. An area where Ada's impact has been positive is in

  19. ADA3: a gene, identified by resistance to GAL4-VP16, with properties similar to and different from those of ADA2.

    PubMed Central

    Piña, B; Berger, S; Marcus, G A; Silverman, N; Agapite, J; Guarente, L

    1993-01-01

    We describe the isolation of a yeast gene, ADA3, mutations in which prevent the toxicity of GAL4-VP16 in vivo. Toxicity was previously proposed to be due to the trapping of general transcription factors required at RNA polymerase II promoters (S. L. Berger, B. Piña, N. Silverman, G. A. Marcus, J. Agapite, J. L. Regier, S. J. Triezenberg, and L. Guarente, Cell 70:251-265, 1992). trans activation by VP16 as well as the acidic activation domain of GCN4 is reduced in the mutant. Other activation domains, such as those of GAL4 and HAP4, are only slightly affected in the mutant. This spectrum is similar to that observed for mutants with lesions in ADA2, a gene proposed to encode a transcriptional adaptor. The ADA3 gene is not absolutely essential for cell growth, but gene disruption mutants grow slowly and are temperature sensitive. Strains doubly disrupted for ada2 and ada3 grow no more slowly than single mutants, providing further evidence that these genes function in the same pathway. Selection of initiation sites by the general transcriptional machinery in vitro is altered in the ada3 mutant, providing a clue that ADA3 could be a novel general transcription factor involved in the response to acidic activators. Images PMID:8413201

  20. Successful reconstitution of immunity in ADA-SCID by stem cell gene therapy following cessation of PEG-ADA and use of mild preconditioning.

    PubMed

    Gaspar, H Bobby; Bjorkegren, Emma; Parsley, Kate; Gilmour, Kimberly C; King, Doug; Sinclair, Joanna; Zhang, Fang; Giannakopoulos, Aris; Adams, Stuart; Fairbanks, Lynette D; Gaspar, Jane; Henderson, Lesley; Xu-Bayford, Jin Hua; Davies, E Graham; Veys, Paul A; Kinnon, Christine; Thrasher, Adrian J

    2006-10-01

    Gene therapy is a promising treatment option for monogenic diseases, but success has been seen in only a handful of studies thus far. We now document successful reconstitution of immune function in a child with the adenosine deaminase (ADA)-deficient form of severe combined immunodeficiency (SCID) following hematopoietic stem cell (HSC) gene therapy. An ADA-SCID child who showed a poor response to PEG-ADA enzyme replacement was enrolled into the clinical study. Following cessation of enzyme replacement therapy, autologous CD34(+) HSCs were transduced with an ADA-expressing gammaretroviral vector. Gene-modified cells were reinfused following one dose of preconditioning chemotherapy. Two years after the procedure, immunological and biochemical correction has been maintained with progressive increase in lymphocyte numbers, reinitiation of thymopoiesis, and systemic detoxification of ADA metabolites. Sustained vector marking with detection of polyclonal vector integration sites in multiple cell lineages and detection of ADA activity in red blood cells suggests transduction of early hematopoietic progenitors. No serious side effects were seen either as a result of the conditioning procedure or due to retroviral insertion. Gene therapy is an effective treatment option for the treatment of ADA-SCID.