Science.gov

Sample records for ada source code

  1. A graphically oriented specification language for automatic code generation. GRASP/Ada: A Graphical Representation of Algorithms, Structure, and Processes for Ada, phase 1

    NASA Technical Reports Server (NTRS)

    Cross, James H., II; Morrison, Kelly I.; May, Charles H., Jr.; Waddel, Kathryn C.

    1989-01-01

    The first phase of a three-phase effort to develop a new graphically oriented specification language which will facilitate the reverse engineering of Ada source code into graphical representations (GRs), as well as the automatic generation of Ada source code, is described. A simplified view of the three phases of Graphical Representations for Algorithms, Structure, and Processes for Ada (GRASP/Ada) with respect to three basic classes of GRs is presented. Phase 1 concentrated on the derivation of an algorithmic diagram, the control structure diagram (CSD) (CRO88a), from Ada source code or Ada PDL. Phase 2 includes the generation of architectural and system level diagrams such as structure charts and data flow diagrams and should result in a requirements specification for a graphically oriented language able to support automatic code generation. Phase 3 will concentrate on the development of a prototype to demonstrate the feasibility of this new specification language.

  2. STGT program: Ada coding and architecture lessons learned

    NASA Technical Reports Server (NTRS)

    Usavage, Paul; Nagurney, Don

    1992-01-01

    STGT (Second TDRSS Ground Terminal) is currently halfway through the System Integration Test phase (Level 4 Testing). To date, many software architecture and Ada language issues have been encountered and solved. This paper, which is the transcript of a presentation at the 3 Dec. meeting, attempts to define these lessons plus others learned regarding software project management and risk management issues, training, performance, reuse, and reliability. Observations are included regarding the use of particular Ada coding constructs, software architecture trade-offs during the prototyping, development and testing stages of the project, and dangers inherent in parallel or concurrent systems, software, hardware, and operations engineering.

  3. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif compliant graphical user interface has been developed for the GRASP/Ada prototype.

  4. Experiments with Ada

    NASA Technical Reports Server (NTRS)

    Roy, D.; Mcclimens, M.; Agresti, W.

    1985-01-01

    A 1200-line Ada source code project simulating the most basic functions of an operations control center was developed. We selected George Cherry's Process Abstraction Methodology for Embedded Large Applications (PAMELA) and DEC's Ada Compilation System (ACS) under VAX/VMS to build the software from requirements to acceptance test. The system runs faster than its FORTRAN implementation and was produced on schedule and under budget with an overall productivity in excess of 30 lines of Ada source code per day.

  5. ART-Ada design project, phase 2

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    Interest in deploying expert systems in Ada has increased. An Ada based expert system tool is described called ART-Ada, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  6. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance for the system.
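
    As a minimal sketch of the rule-to-code idea, the following Python fragment emits an Ada procedure from one if-then rule. The rule format, names, and emitted Ada are invented for illustration; the actual tools described in this record work from Evidence Flow Graphs.

    # Hypothetical sketch: translate one if-then rule into an Ada code module.
    RULE = {
        "name": "Overheat_Alarm",
        "condition": "Temperature > 100.0",    # assumed valid Ada expression
        "action": 'Raise_Alarm ("overheat")',  # assumed existing Ada procedure
    }

    def rule_to_ada(rule):
        """Emit the text of an Ada procedure implementing one rule."""
        return "\n".join([
            "procedure {name} is",
            "begin",
            "   if {condition} then",
            "      {action};",
            "   end if;",
            "end {name};",
        ]).format(**rule)

    print(rule_to_ada(RULE))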

  7. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphical user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and the communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future projects; ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  8. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1992-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic level graphical representation of Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, the prototype was evaluated by software engineering students at Auburn University and then updated with significant enhancements to the user interface including editing capabilities. Version 3.2 of the prototype was prepared for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  9. The development of a program analysis environment for Ada: Reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1991-01-01

    The Graphical Representations of Algorithms, Structures, and Processes for Ada (GRASP/Ada) project has successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and thus improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype (Version 1) was designed and implemented using FLEX and BISON running under the Virtual Memory System (VMS) on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. In Phase 3 of the project, the prototype was prepared for limited distribution (GRASP/Ada Version 3.0) to facilitate evaluation. The user interface was extensively reworked. The current prototype provides the capability for the user to generate CSDs from Ada source code in a reverse engineering mode with a level of flexibility suitable for practical application.

  10. Software issues involved in code translation of C to Ada programs

    NASA Technical Reports Server (NTRS)

    Hooi, Robert; Giarratano, Joseph

    1986-01-01

    It is often thought that translation of one programming language to another is a simple solution that can be used to extend the software life span or to rehost software in another environment. The possible problems are examined, as are the advantages and disadvantages of direct machine or human code translation versus redesign and rewrite of the software. The translation of the expert system language called C Language Integrated Production System (CLIPS), which is written in C, to Ada is used as a case study of the problems that are encountered.

  11. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  12. Generic Ada code in the NASA space station command, control and communications environment

    NASA Technical Reports Server (NTRS)

    Mcdougall, D. P.; Vollman, T. E.

    1986-01-01

    The results of efforts to apply powerful Ada constructs to the formatted message handling process are described. The goal of these efforts was to extend the state of the technology in message handling while at the same time producing production-quality, reusable code. The first effort was initiated in September 1984 and delivered in April 1985. That product, the Generic Message Handling Facility, met initial goals, was reused, and is available in the Ada Repository on ARPANET. However, it became apparent during its development that the initial approach to building a message handler template was not optimal. As a result of this initial effort, several alternate approaches were identified, and research is now ongoing to identify an improved product. The ultimate goal is to be able to instantly build a message handling system for any message format given a specification of that message format. The problem lies in how to specify the message format and, once that is done, how to use that information to build the message handler. Message handling systems and message types are described. The initial effort, its results, and its shortcomings are detailed. The approach now being taken to build a system which will be significantly easier to implement, and once implemented, easier to use, is described. Finally, conclusions are offered.
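
    The "build a handler from a format specification" goal can be illustrated with a short sketch; the field names and formats below are invented, and the Generic Message Handling Facility itself is Ada, not Python.

    # Sketch of a specification-driven message handler: pack/unpack functions
    # are generated from a declarative format description, not hand-written.
    import struct

    # Message format specification: (field name, struct format code).
    TELEMETRY_SPEC = [("msg_id", "H"), ("timestamp", "I"), ("voltage", "f")]

    def make_handler(spec):
        """Build pack/unpack functions for any message format specification."""
        fmt = ">" + "".join(code for _, code in spec)   # big-endian layout
        names = [name for name, _ in spec]
        def pack(values):
            return struct.pack(fmt, *(values[n] for n in names))
        def unpack(raw):
            return dict(zip(names, struct.unpack(fmt, raw)))
        return pack, unpack

    pack, unpack = make_handler(TELEMETRY_SPEC)
    raw = pack({"msg_id": 7, "timestamp": 123456, "voltage": 27.5})
    print(unpack(raw))   # {'msg_id': 7, 'timestamp': 123456, 'voltage': 27.5}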

  13. C Language Integrated Production System, Ada Version

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.; Melebeck, Clovis J.; White, Wesley A.; Mcgregor, Terry L.; Ferguson, Melisa; Razavipour, Reza

    1992-01-01

    CLIPS/Ada provides capabilities of CLIPS v4.3 but uses Ada as source language for CLIPS executable code. Implements forward-chaining rule-based language. Program contains inference engine and language syntax providing framework for construction of expert-system program. Also includes features for debugging application program. Based on Rete algorithm which provides efficient method for performing repeated matching of patterns. Written in Ada.
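
    The forward-chaining semantics can be illustrated with a toy inference loop; this naive sketch re-matches every rule on every cycle, which is exactly the repeated work the Rete algorithm avoids.

    # Toy forward-chaining loop (illustration only; not CLIPS or Rete).
    facts = {"duck"}
    rules = [                     # (antecedent facts, consequent fact)
        ({"duck"}, "quacks"),
        ({"duck"}, "feathers"),
        ({"quacks", "feathers"}, "bird"),
    ]

    changed = True
    while changed:                # fire rules until no new facts appear
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True

    print(sorted(facts))          # ['bird', 'duck', 'feathers', 'quacks']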

  14. Update of GRASP/Ada reverse engineering tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1993-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) successfully created and prototyped a new algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis was on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. In Phase 1 of the GRASP/Ada project, the CSD graphical constructs were created and applied manually to several small Ada programs. A prototype CSD generator (Version 1) was designed and implemented using FLEX and BISON running under VMS on a VAX 11-780. In Phase 2, the prototype was improved and ported to the Sun 4 platform under UNIX. A user interface was designed and partially implemented using the HP widget toolkit and the X Windows System. In Phase 3, the user interface was extensively reworked using the Athena widget toolkit and X Windows. The prototype was applied successfully to numerous Ada programs ranging in size from several hundred to several thousand lines of source code. Following Phase 3, two update phases were completed. Update '92 focused on the initial analysis of evaluation data collected from software engineering students at Auburn University and the addition of significant enhancements to the user interface. Update '93 (the current update) focused on the statistical analysis of the data collected in the previous update and preparation of Version 3.4 of the prototype for limited distribution to facilitate further evaluation. The current prototype provides the capability for the user to generate CSDs from Ada PDL or source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  15. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
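
    The tilted-edge MTF calculation mentioned above follows a standard recipe, sketched here with a synthetic Gaussian-blurred edge standing in for the simulated data:

    # MTF from an edge: differentiate the edge spread function (ESF) to get
    # the line spread function (LSF), then take the normalized FFT magnitude.
    import numpy as np

    dx = 1.0                                  # sample pitch, arbitrary units
    x = np.arange(-128, 128) * dx
    sigma = 2.0                               # blur of the synthetic edge
    psf = np.exp(-x**2 / (2 * sigma**2))
    esf = np.cumsum(psf)                      # edge = integral of the PSF
    esf /= esf[-1]

    lsf = np.gradient(esf, dx)                # LSF = d(ESF)/dx
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                             # unity at zero spatial frequency
    freqs = np.fft.rfftfreq(len(lsf), d=dx)
    print(f"MTF at f = {freqs[10]:.4f} cycles/unit: {mtf[10]:.3f}")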

  16. GRASP/Ada 95: Reverse Engineering Tools for Ada

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1996-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped an algorithmic level graphical representation for Ada software, the Control Structure Diagram (CSD), and a new visualization for a fine-grained complexity metric called the Complexity Profile Graph (CPG). By synchronizing the CSD and the CPG, the CSD view of control structure, nesting, and source code is directly linked to the corresponding visualization of statement level complexity in the CPG. GRASP has been integrated with GNAT, the GNU Ada 95 Translator, to provide a comprehensive graphical user interface and development environment for Ada 95. The user may view, edit, print, and compile source code as a CSD with no discernible addition to storage or computational overhead. The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada 95 source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. The current update has focused on the design and implementation of a new Motif compliant user interface, and a new CSD generator consisting of a tagger and renderer. The Complexity Profile Graph (CPG) is based on a set of functions that describes the context, content, and scaling for complexity on a statement by statement basis. When combined graphically, the result is a composite profile of complexity for the program unit. Ongoing research includes the development and refinement of the associated functions, and the development of the CPG generator prototype. The current Version 5.0 prototype provides the capability for the user to generate CSDs and CPGs from Ada 95 source code in a reverse engineering as well as forward engineering mode with a level of flexibility suitable for practical application.

  17. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  18. An Ada programming support environment

    NASA Technical Reports Server (NTRS)

    Tyrrill, AL; Chan, A. David

    1986-01-01

    The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example, pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit level testing. A Code Auditor/Static Analyzer permits the Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.

  19. Ada/POSIX binding: A focused Ada investigation

    NASA Technical Reports Server (NTRS)

    Legrand, Sue

    1988-01-01

    NASA is seeking an operating system interface definition (OSID) for the Space Station Program (SSP) in order to take advantage of the commercial off-the-shelf (COTS) products available today and the many that are expected in the future. NASA would also like to avoid reliance on any one source for operating systems, information systems, communication systems, or instruction set architectures. The use of the Portable Operating System Interface for Computer Environments (POSIX) is examined as a possible solution to this problem. Since Ada is already the language of choice for SSP, the question of an Ada/POSIX binding is addressed. The intent of the binding is to provide access to the POSIX standard operating system (OS) interface and environment, by which portability of Ada applications will be supported at the source code level. A guiding principle of Ada/POSIX binding development is clear conformance of the Ada interface with the functional definition of POSIX. The interface is intended to be used by both application developers and system implementors. The objective is to provide a standard such that a strictly conforming application source program can be compiled to execute on any conforming implementation. Special emphasis is placed on first providing those functions and facilities that are needed in a wide variety of commercial applications.

  20. Managing Ada development

    NASA Technical Reports Server (NTRS)

    Green, James R.

    1986-01-01

    The Ada programming language was developed under the sponsorship of the Department of Defense to address the soaring costs associated with software development and maintenance. Ada is powerful, and yet to take full advantage of its power, it is sufficiently complex and different from current programming approaches that there is considerable risk associated with committing a program to be done in Ada. There are also few programs of any substantial size that have been implemented using Ada that may be studied to determine those management methods that resulted in a successful Ada project. The items presented are the author's opinions, formed as a result of going through a software development experience. The difficulties faced, risks assumed, management methods applied, lessons learned, and, most importantly, the techniques that were successful are all valuable sources of management information for those managers ready to assume major Ada development projects.

  1. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  2. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
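
    The flavor of this statistics gathering can be sketched in a few lines; the keywords and weights below are invented for illustration, and SAP's real statistic set is far more extensive.

    # Rough sketch of weighted statement statistics for fixed-form FORTRAN.
    import re

    WEIGHTS = {"IF": 3, "GOTO": 4, "DO": 2, "CALL": 1}

    def complexity(source):
        stats = {kw: 0 for kw in WEIGHTS}
        for line in source.upper().splitlines():
            if line[:1] in ("C", "*"):        # fixed-form comment line
                continue
            for kw in WEIGHTS:
                stats[kw] += len(re.findall(r"\b" + kw + r"\b", line))
        figure = sum(WEIGHTS[kw] * n for kw, n in stats.items())
        return stats, figure

    src = """      DO 10 I = 1, N
          IF (A(I) .GT. 0) CALL POS(A(I))
       10 CONTINUE
    """
    print(complexity(src))   # IF=1, DO=1, CALL=1; weighted figure = 6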

  3. Source-Code-Analyzing Program

    NASA Technical Reports Server (NTRS)

    Manteufel, Thomas; Jun, Linda

    1991-01-01

    FORTRAN Static Source Code Analyzer program, SAP, developed to gather statistics automatically on occurrences of statements and structures within FORTRAN program and provide for reporting of those statistics. Provisions made to weight each statistic and provide overall figure of complexity. Statistics, as well as figures of complexity, gathered on module-by-module basis. Overall summed statistics also accumulated for complete input source file. Written in FORTRAN IV.

  4. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

    FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within FORTRAN program and provides reports of those statistics. Provisions made for weighting each statistic and provide an overall figure of complexity.

  5. GRASP/Ada: Graphical Representations of Algorithms, Structures, and Processes for Ada. The development of a program analysis environment for Ada: Reverse engineering tools for Ada, task 2, phase 3

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1991-01-01

    The main objective is the investigation, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada). The presented task, in which various graphical representations that can be extracted or generated from source code are described and categorized, is focused on reverse engineering. The following subject areas are covered: the system model; control structure diagram generator; object oriented design diagram generator; user interface; and the GRASP library.

  6. Syndrome source coding and its universal generalization

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1975-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A universal generalization of syndrome-source-coding is formulated which provides robustly effective, distortionless coding of source ensembles.
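
    A small worked instance of the scheme, assuming the (7,4) Hamming code as the error-correcting code (the paper's construction is general): a sparse length-7 source block is compressed to its 3-bit syndrome, and the decoder returns the minimum-weight pattern in that syndrome's coset.

    # Syndrome source coding with the (7,4) Hamming code: 7 bits -> 3 bits,
    # lossless whenever a source block contains at most one 1.
    import numpy as np

    H = np.array([[1, 0, 1, 0, 1, 0, 1],    # parity-check matrix of the
                  [0, 1, 1, 0, 0, 1, 1],    # (7,4) Hamming code
                  [0, 0, 0, 1, 1, 1, 1]])

    def encode(x):                  # x: length-7 binary source block
        return H @ x % 2            # the 3-bit syndrome is the compressed data

    # Coset leaders: syndrome -> minimum-weight pattern with that syndrome.
    leaders = {(0, 0, 0): np.zeros(7, dtype=int)}
    for i in range(7):
        e = np.zeros(7, dtype=int)
        e[i] = 1
        leaders[tuple(H @ e % 2)] = e

    def decode(s):
        return leaders[tuple(s)]

    x = np.array([0, 0, 0, 0, 1, 0, 0])     # sparse source block
    assert (decode(encode(x)) == x).all()   # recovered exactly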

  7. Applying Ada to Beech Starship avionics

    NASA Technical Reports Server (NTRS)

    Funk, David W.

    1986-01-01

    As Ada solidified in its development, it became evident that it offered advantages for avionics systems because of its support for modern software engineering principles and real-time applications. An Ada programming support environment was developed for two major avionics subsystems in the Beech Starship. The two subsystems include the electronic flight instrument displays and the flight management computer system. Both of these systems use multiple Intel 80186 microprocessors. The flight management computer provides flight planning, navigation displays, and primary flight display of checklists and other pilot advisory information. Together these systems represent nearly 80,000 lines of Ada source code and, to date, approximately 30 man-years of effort. The Beech Starship avionics systems are in flight testing.

  8. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow. In 2011, the ASCL has on average added 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  9. A proposed classification scheme for Ada-based software products

    NASA Technical Reports Server (NTRS)

    Cernosek, Gary J.

    1986-01-01

    As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing the potential for varying levels of quality in Ada programs, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose this potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. Exact criteria for each class are not presented, nor are any detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced and a set of requirements from which to base further research and development is suggested.

  10. Ada style guide (version 1.1)

    NASA Technical Reports Server (NTRS)

    Seidewitz, Edwin V.; Agresti, William; Ferry, Daniel; Lavallee, David; Maresca, Paul; Nelson, Robert; Quimby, Kelvin; Rosenberg, Jacob; Roy, Daniel; Shell, Allyn

    1987-01-01

    Ada is a programming language of considerable expressive power. The Ada Language Reference Manual provides a thorough definition of the language. However, it does not offer sufficient guidance on the appropriate use of Ada's powerful features. For this reason, the Goddard Space Flight Center Ada User's Group has produced this style guide which addresses such program style issues. The guide covers three areas of Ada program style: the structural decomposition of a program; the coding and the use of specific Ada features; and the textual formatting of a program.

  11. GRASP/Ada (Graphical Representations of Algorithms, Structures, and Processes for Ada): The development of a program analysis environment for Ada. Reverse engineering tools for Ada, task 1, phase 2

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1990-01-01

    The study, formulation, and generation of graphical representations of algorithms, structures, and processes for Ada (GRASP/Ada) are discussed in this second phase report of a three-phase effort. Various graphical representations that can be extracted or generated from source code are described and categorized with focus on reverse engineering. The overall goal is to provide the foundation for a CASE (computer-aided software engineering) environment in which reverse engineering and forward engineering (development) are tightly coupled. Emphasis is on a subset of architectural diagrams that can be generated automatically from source code, with the control structure diagram (CSD) included for completeness.

  12. Ada software productivity prototypes: A case study

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan

    1988-01-01

    A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study was collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivities between subtasks. Factors considered are percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use Ada is just a large complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.

  13. The Astrophysics Source Code Library: An Update

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, R. J.; Shamir, L.; Teuben, P. J.

    2012-01-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, takes an active approach to sharing astrophysical source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL moved to a new location in 2010; it now has over 300 codes and continues to grow. In 2011, the ASCL (http://asterisk.apod.com/viewforum.php?f=35) has on average added 19 new codes per month; we encourage scientists to submit their codes for inclusion. An advisory committee has been established to provide input and guide the development and expansion of its new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This presentation covers the history of the ASCL and examines the current state and benefits of the ASCL, the means of and requirements for including codes, and outlines its future plans.

  14. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
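
    The DPCM loop underlying the scheme can be sketched briefly (generic DPCM only; the edge-preserving modification itself is not reproduced here). With step size q = 1 on integer data the loop is lossless, matching the lossy/lossless duality described above.

    # Minimal DPCM: predict each sample from the previous reconstruction and
    # quantize the prediction error with step q.
    def dpcm_encode(samples, q):
        pred, residuals = 0, []
        for s in samples:
            r = int(round((s - pred) / q))   # quantized prediction error
            residuals.append(r)
            pred += r * q                    # track decoder's reconstruction
        return residuals

    def dpcm_decode(residuals, q):
        pred, out = 0, []
        for r in residuals:
            pred += r * q
            out.append(pred)
        return out

    signal = [10, 12, 15, 15, 14, 200, 201]  # toy scan line with an "edge"
    assert dpcm_decode(dpcm_encode(signal, q=1), q=1) == signal  # lossless
    print(dpcm_decode(dpcm_encode(signal, q=4), q=4))            # lossy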

  15. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  16. Initial Ada components evaluation

    NASA Technical Reports Server (NTRS)

    Moebes, Travis

    1989-01-01

    SAIC has responsibility for independent test and validation of the SSE. They have been using a mathematical functions library package implemented in Ada to test the SSE IV and V process. The library package consists of elementary mathematical functions and is both machine and accuracy independent. The SSE Ada components evaluation includes code complexity metrics based on Halstead's software science metrics and McCabe's measure of cyclomatic complexity. Halstead's metrics are based on the number of operators and operands in a logical unit of code and are compiled from the number of distinct operators, distinct operands, and the total number of occurrences of operators and operands. These metrics give an indication of the physical size of a program in terms of operators and operands and are used diagnostically to point to potential problems. McCabe's Cyclomatic Complexity Metrics (CCM) are compiled from flow charts transformed to equivalent directed graphs. The CCM is a measure of the total number of linearly independent paths through the code's control structure. These metrics were computed for the Ada mathematical functions library using Software Automated Verification and Validation (SAVVAS), the SSE IV and V tool. A table with selected results was shown, indicating that most of these routines are of good quality. Thresholds for the Halstead measures indicate poor quality if the length metric exceeds 260 or difficulty is greater than 190. The McCabe CCM indicated high quality in the software products.
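
    For reference, both metric families reduce to simple formulas once the counts are in hand (SAVVAS extracts the counts from the Ada source itself; the counts below are made up).

    # Halstead measures from operator/operand counts, and McCabe's V(G) as
    # decision points + 1 for a single-entry, single-exit routine.
    from math import log2

    def halstead(n1, n2, N1, N2):
        """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
        length = N1 + N2
        volume = length * log2(n1 + n2)
        difficulty = (n1 / 2) * (N2 / n2)
        return {"length": length, "volume": volume,
                "difficulty": difficulty, "effort": difficulty * volume}

    def cyclomatic(decision_points):
        return decision_points + 1

    print(halstead(n1=12, n2=7, N1=27, N2=15), "V(G) =", cyclomatic(4))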

  17. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  18. Distributed systems and Ada

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Viewgraphs of two briefings designed to provide information to the Software I and V Study Group to help complete the I and V study task are given. The information is taken from the areas of Ada real-time processing support, Ada run-time environments, Ada program construction, object oriented design, and Ada/UNIX/POSIX interfaces. Also covered are the development, integration, and verification of Ada systems; fault tolerance and Ada; and Ada programming standards, techniques, and tools.

  19. Software unit testing in Ada environment

    NASA Technical Reports Server (NTRS)

    Warnock, Glenn

    1986-01-01

    A validation procedure for the Ada binding of the Graphical Kernel System (GKS) is being developed. PRIOR Data Sciences is also producing a version of the GKS written in Ada. These major software engineering projects will provide an opportunity to demonstrate a sound approach for software testing in an Ada environment. The GKS/Ada validation capability will be a collection of test programs and data, and test management guidelines. These products will be used to assess the correctness, completeness, and efficiency of any GKS/Ada implementation. The GKS/Ada developers will be able to obtain the validation software for their own use. It is anticipated that this validation software will eventually be taken over by an independent standards body to provide objective assessments of GKS/Ada implementations, using an approach similar to the validation testing currently applied to Ada compilers. In the meantime, if requested, this validation software will be used to assess GKS/Ada products. The second project, implementation of GKS using the Ada language, is a conventional software engineering task. It represents a large body of Ada code and has some interesting testing problems associated with automatic testing of graphics routines. Here the normal test practices, which include automated regression testing, independent quality assurance, test configuration management, and the application of software quality metrics, will be employed. The software testing methods emphasize quality enhancement and automated procedures. Ada makes some aspects of testing easier, and introduces some concerns. These issues are addressed.

  20. Ada Structure Design Language (ASDL)

    NASA Technical Reports Server (NTRS)

    Chedrawi, Lutfi

    1986-01-01

    An artist acquires all the necessary tools before painting a scene. By the same analogy, a software engineer needs the necessary tools to provide a design with the proper means for implementation. Ada provides these tools. Yet, as an artist's painting needs a brochure to accompany it for further explanation of the scene, an Ada design also needs a document along with it to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code in length. But in a large environment, ranging from thousands of lines and above, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called the Ada Structure Design Language (ASDL). This language sets some rules to help derive a well formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer, and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.

  1. QUEST/Ada (Query Utility Environment for Software Testing of Ada): The development of a program analysis environment for Ada, task 1, phase 2

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1990-01-01

    The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
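
    A toy illustration of the condition-coverage bookkeeping involved (the instrumentation scheme and names here are hypothetical, not QUEST/Ada's actual mechanism):

    # Each instrumented predicate reports its outcome, so a test generator
    # can see which conditions have not yet evaluated both true and false.
    coverage = {}   # condition id -> set of observed outcomes

    def cond(cid, outcome):
        coverage.setdefault(cid, set()).add(outcome)
        return outcome

    def classify(x):                       # the "unit under test"
        if cond("C1", x > 0):
            return "positive"
        if cond("C2", x % 2 == 0):
            return "even nonpositive"
        return "odd nonpositive"

    for test_input in (5, -4):             # a partial test suite
        classify(test_input)

    gaps = [c for c, seen in coverage.items() if len(seen) < 2]
    print(gaps)   # ['C2']: still needs an odd nonpositive input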

  2. Astrophysics Source Code Library: Incite to Cite!

    NASA Astrophysics Data System (ADS)

    DuPrie, K.; Allen, A.; Berriman, B.; Hanisch, R. J.; Mink, J.; Nemiroff, R. J.; Shamir, L.; Shortridge, K.; Taylor, M. B.; Teuben, P.; Wallen, J. F.

    2014-05-01

    The Astrophysics Source Code Library (ASCL, http://ascl.net/) is an on-line registry of over 700 source codes that are of interest to astrophysicists, with more being added regularly. The ASCL actively seeks out codes as well as accepting submissions from the code authors, and all entries are citable and indexed by ADS. All codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. In addition to being the largest directory of scientist-written astrophysics programs available, the ASCL is also an active participant in the reproducible research movement with presentations at various conferences, numerous blog posts and a journal article. This poster provides a description of the ASCL and the changes that we are starting to see in the astrophysics community as a result of the work we are doing.

  3. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two-terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach, however, requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
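
    For context, the conventional baseline that SP-DTC is compared against, independent transform coding of a correlated Gaussian pair, can be sketched as follows (plain KLT-based TC, not the source-splitting construction itself):

    # Decorrelate with the KLT (eigenvectors of the covariance), then apply
    # uniform scalar quantization to the transform coefficients.
    import numpy as np

    rng = np.random.default_rng(0)
    C = np.array([[1.0, 0.9], [0.9, 1.0]])    # strongly correlated pair
    x = rng.multivariate_normal([0.0, 0.0], C, size=10000)

    evals, T = np.linalg.eigh(C)              # KLT basis
    y = x @ T                                 # decorrelated coefficients
    step = 0.5
    y_hat = step * np.round(y / step)         # uniform scalar quantization
    x_hat = y_hat @ T.T                       # inverse transform

    print("coefficient variances:", evals)
    print("reconstruction MSE:", np.mean((x - x_hat) ** 2))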

  4. Transforming AdaPT to Ada

    NASA Technical Reports Server (NTRS)

    Goldsack, Stephen J.; Holzbach-Valero, A. A.; Waldrop, Raymond S.; Volz, Richard A.

    1991-01-01

    This paper describes how the main features of the proposed Ada language extensions intended to support distribution, offered as possible solutions for Ada 9X, can be implemented by transformation into standard Ada 83. We start by summarizing the features proposed in the paper (Gargaro et al., 1990) which constitutes the definition of the extensions. For convenience we have called the language in its modified form AdaPT, which might be interpreted as Ada with partitions. These features were carefully chosen to provide support for the construction of executable modules for execution in nodes of a network of loosely coupled computers, but flexibly configurable for different network architectures and for recovery following failure, or adapting to mode changes. The intention in their design was to provide extensions which would not impact adversely on the normal use of Ada, and would fit well in style and feel with the existing standard. We begin by summarizing the features introduced in AdaPT.

  5. Maximum aposteriori joint source/channel coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Gibson, Jerry D.

    1991-01-01

    A maximum a posteriori probability (MAP) approach to joint source/channel coder design is presented in this paper. This method explores a technique for designing joint source/channel codes, rather than ways of distributing bits between source coders and channel coders. For a nonideal source coder, MAP arguments are used to design a decoder which takes advantage of redundancy in the source coder output to perform error correction. Once the decoder is obtained, it is analyzed with the purpose of obtaining 'desirable properties' of the channel input sequence for improving overall system performance. Finally, an encoder design which incorporates these properties is proposed.
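
    The decoder-side idea can be illustrated by brute force on short blocks, assuming a binary Markov source sent over a binary symmetric channel (illustrative assumptions, not the system in the paper):

    # MAP decoding: choose the x maximizing log P(y|x) + log P(x), exploiting
    # residual source redundancy (here, a Markov "stay" probability of 0.9).
    from itertools import product
    from math import log

    eps = 0.1      # BSC crossover probability
    p_stay = 0.9   # P(x[t] == x[t-1]) for the Markov source

    def log_prior(x):
        return log(0.5) + sum(log(p_stay if a == b else 1 - p_stay)
                              for a, b in zip(x, x[1:]))

    def log_likelihood(y, x):
        return sum(log(1 - eps) if yi == xi else log(eps)
                   for yi, xi in zip(y, x))

    def map_decode(y):
        return max(product((0, 1), repeat=len(y)),
                   key=lambda x: log_likelihood(y, x) + log_prior(x))

    y = (0, 0, 1, 0, 0, 0)       # received block; the lone 1 is suspect
    print(map_decode(y))         # (0, 0, 0, 0, 0, 0): the 1 is corrected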

  6. Astrophysics Source Code Library -- Now even better!

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2015-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!

  7. GSFC Ada programming guidelines

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.; Nelson, Robert W.

    1986-01-01

    A significant Ada effort has been under way at Goddard for the last two years. To ease the center's transition toward Ada (notably for future space station projects), a cooperative effort of half a dozen companies and NASA personnel was started in 1985 to produce programming standards and guidelines for the Ada language. The great richness of the Ada language and the need of programmers for good style examples make Ada programming guidelines an important tool for smoothing the Ada transition. Because of the natural divergence of technical opinions, the great diversity of our government and private organizations, and the novelty of the Ada technology, the creation of an Ada programming guidelines document is a difficult and time consuming task. It is also a vital one. Steps must now be taken to ensure that the guide is refined in an organized but timely manner to reflect the growing level of expertise of the Ada community.

  8. Source code management with version control software

    NASA Astrophysics Data System (ADS)

    Arraki, Kenza S.

    2016-01-01

    Developing and maintaining software is an important part of astronomy research. As time progresses, projects can move in unexpected directions or simply last longer than expected. Making changes to software can quickly result in many different versions of the code, the desire to return to a previous, lost version, and problems sharing updated code with others. It is simple to update and collaboratively edit source code when you use version control software. This short talk will highlight the version control systems svn, git, and hg for use with local and remote software repositories. In addition, I will touch on using GitHub and BitBucket as excellent ways to share your code using an online interface.
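
    A minimal sketch of the basic workflow, assuming git is installed and on the PATH (svn and hg are analogous), driven from Python:

    # Create a repository, record one commit, and list the history.
    import pathlib, subprocess, tempfile

    repo = pathlib.Path(tempfile.mkdtemp())

    def git(*args):
        return subprocess.run(
            ["git", "-c", "user.name=Demo",
             "-c", "user.email=demo@example.com", *args],
            cwd=repo, check=True, capture_output=True, text=True).stdout

    git("init")
    (repo / "analysis.py").write_text("print('v1')\n")
    git("add", "analysis.py")
    git("commit", "-m", "Initial version of analysis script")
    print(git("log", "--oneline"))    # one line per recorded version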

  9. Using the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Teuben, P. J.; Berriman, G. B.; DuPrie, K.; Hanisch, R. J.; Mink, J. D.; Nemiroff, R. J.; Shamir, L.; Wallin, J. F.

    2013-01-01

    The Astrophysics Source Code Library (ASCL) is a free on-line registry of source codes that are of interest to astrophysicists; with over 500 codes, it is the largest collection of scientist-written astrophysics programs in existence. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are available either via a download site or from an identified source. An advisory committee formed in 2011 provides input and guides the development and expansion of the ASCL, and since January 2012, all accepted ASCL entries are indexed by ADS. Though software is increasingly important for the advancement of science in astrophysics, these methods are still often hidden from view or difficult to find. The ASCL (ascl.net/) seeks to improve the transparency and reproducibility of research by making these vital methods discoverable, and to provide recognition and incentive to those who write and release programs useful for astrophysics research. This poster provides a description of the ASCL, an update on recent additions, and the changes in the astrophysics community we are starting to see because of the ASCL.

  10. Multimedia Multicast Based on Multiterminal Source Coding

    NASA Astrophysics Data System (ADS)

    Aghagolzadeh, Ali; Nooshyar, Mahdi; Rabiee, Hamid R.; Mikaili, Elhameh

    Multimedia multicast with two servers based on multiterminal source coding has been studied in previous research. Because it offers an approach to practical code design for more than two correlated sources in the IMTSC/CEO setup, in this paper the framework of Slepian-Wolf coded quantization is extended and a practical code design is presented for IMTSC/CEO with more than two encoders. The multicast system based on the IMTSC/CEO is then applied to cases with three, four and five servers. Since the underlying code design approach for the IMTSC/CEO problem can be applied to an arbitrary number of active encoders, the proposed MMBMSC method can also be used with an arbitrary number of servers. Explicit expressions for the expected distortion with an arbitrary number of servers in the MMBMSC system are also presented. Experimental results with data, image and video signals show the superiority of the proposed method over conventional solutions and over the two-server MMBMSC system.

  11. Lessons learned from an Ada conversion project

    NASA Technical Reports Server (NTRS)

    Porter, Tim

    1988-01-01

    The following topics are outlined: background; SAVVAS architecture; software portability; history of Ada; isolation of non-portable code; a simple terminal interface package; constraints of language features; and virtual interfaces. This presentation is represented by viewgraphs only.

  12. Debugging tasked Ada programs

    NASA Technical Reports Server (NTRS)

    Fainter, R. G.; Lindquist, T. E.

    1986-01-01

    The applications for which Ada was developed require distributed implementations of the language and extensive use of tasking facilities. Debugging and testing technology as it applies to the parallel features of languages currently falls short of needs. Thus, the development of embedded systems using Ada poses special challenges to the software engineer. Techniques for distributing Ada programs, support for simulating distributed target machines, testing facilities for tasked programs, and debugging support applicable to simulated and to real targets all need to be addressed. A technique is presented for debugging Ada programs that use tasking, and a debugger, called AdaTAD, that supports the technique is described. The debugging technique is presented together with the user interface to AdaTAD. The component of AdaTAD that monitors and controls communication among tasks was designed in Ada and is presented through an example with a simple tasked program.

  13. Magnified Neutron Radiography with Coded Sources

    NASA Astrophysics Data System (ADS)

    Bingham, P.; Santos-Villalobos, H.; Lavrik, N.; Gregor, J.; Bilheux, H.

    A coded source imaging (CSI) system has been developed and tested at the High Flux Isotope Reactor (HFIR) CG-1D beamline at Oak Ridge National Laboratory (ORNL). The goal of this system is to use magnification to improve the resolution of the imaging system beyond the detector resolution. For this system, coded masks have been manufactured at 10 μm resolution with 9 μm thick Gd patterned on Si wafers, a system-model-based iterative reconstruction code has been developed, and experiments have been performed at resolutions of 200 μm, 100 μm, 50 μm, 20 μm, and 10 μm with the object placed more than 5.5 m from the detector, giving magnifications of up to 25 times.

  14. ART/Ada and CLIPS/Ada

    NASA Technical Reports Server (NTRS)

    Culbert, Chris

    1990-01-01

    Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert systems building tools (ESBT's). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality were completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. Since production quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA.

  15. Testing and Troubleshooting Automatically Generated Source Code

    NASA Technical Reports Server (NTRS)

    Henry, Joel

    1998-01-01

    Tools allowing engineers to model the real-time behavior of systems that control many types of NASA systems have become widespread. These tools automatically generate source code that is compiled, linked, then downloaded into computers controlling everything from wind tunnels to space flight systems. These tools save hundreds of hours of software development time and allow engineers with thorough application area knowledge but little software development experience to generate software to control the systems they use daily. These systems are verified and validated by simulating the real-time models, and by other techniques that focus on the model or the hardware. The automatically generated source code is typically not subjected to rigorous testing using conventional software testing techniques. Given the criticality and safety issues surrounding these systems, the application of conventional and new software testing and troubleshooting techniques to the automatically generated code will improve the reliability of the resulting systems.

  16. Iterative Reconstruction of Coded Source Neutron Radiographs

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2012-01-01

    Use of a coded source facilitates high-resolution neutron imaging but requires that the radiographic data be deconvolved. In this paper, we compare direct deconvolution with two different iterative algorithms, namely, one based on direct deconvolution embedded in an MLE-like framework and one based on a geometric model of the neutron beam and a least squares formulation of the inverse imaging problem.

  17. ADAS Update and Maintainability

    NASA Technical Reports Server (NTRS)

    Watson, Leela R.

    2010-01-01

    Since 2000, both the National Weather Service Melbourne (NWS MLB) and the Spaceflight Meteorology Group (SMG) have used a local data integration system (LDIS) as part of their forecast and warning operations. The original LDIS was developed by the Applied Meteorology Unit (AMU) in 1998 (Manobianco and Case 1998) and has undergone subsequent improvements. Each has benefited from three-dimensional (3-D) analyses that are delivered to forecasters every 15 minutes across the peninsula of Florida. The intent is to generate products that enhance short-range weather forecasts issued in support of NWS MLB and SMG operational requirements within East Central Florida. The current LDIS uses the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) package as its core, which integrates a wide variety of national, regional, and local observational data sets. It assimilates all available real-time data within its domain and is run at a finer spatial and temporal resolution than current national or regional-scale analysis packages. As such, it provides local forecasters with a more comprehensive understanding of evolving fine-scale weather features. Over the years, the LDIS has become problematic to maintain since it depends on AMU-developed shell scripts that were written for an earlier version of the ADAS software. The goals of this task were to update the NWS MLB/SMG LDIS with the latest version of ADAS, incorporate new sources of observational data, and upgrade and modify the AMU-developed shell scripts written to govern the system. In addition, the previously developed ADAS graphical user interface (GUI) was updated. Operationally, these upgrades will result in more accurate depictions of the current local environment to help with short-range weather forecasting applications, while also offering an improved initialization for local versions of the Weather Research and Forecasting (WRF) model used by both groups.

  18. Documentation generator application for VHDL source codes

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    The UML, which is a complex system modeling and description technology, has recently been expanding its uses in the field of formalization and algorithmic approaches to such systems as multiprocessor photonic, optoelectronic and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes the realization of an application for documenting VHDL source codes. A novel solution is presented, based on the Doxygen program, which is freely licensed and has accessible source code. Bison and Flex were used as supporting tools for building the parser. Practical results from the documentation generator are presented; the program was applied to exemplary VHDL codes. The documentation generator application is used for the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts which describe the following components of the documentation generator for photonic and electronic systems: the concept, the MatLab application and the VHDL application. This is part three, which describes the VHDL application. VHDL is used for behavioral description of the optoelectronic system.

  19. A small evaluation suite for Ada compilers

    NASA Technical Reports Server (NTRS)

    Wilke, Randy; Roy, Daniel M.

    1986-01-01

    After completing a small Ada pilot project (OCC simulator) for the Multi Satellite Operations Control Center (MSOCC) at Goddard last year, the use of Ada to develop OCCs was recommended. To help MSOCC transition toward Ada, a suite of about 100 evaluation programs was developed which can be used to assess Ada compilers. These programs compare the overall quality of the compilation system, compare the relative efficiencies of the compilers and the environments in which they work, and compare the size and execution speed of generated machine code. Another goal of the benchmark software was to provide MSOCC system developers with rough timing estimates for the purpose of predicting performance of future systems written in Ada.

  20. Iterative Reconstruction of Coded Source Neutron Radiographs

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2013-01-01

    Use of a coded source facilitates high-resolution neutron imaging through magnification but requires that the radiographic data be deconvolved. A comparison of direct deconvolution with two different iterative algorithms has been performed. One iterative algorithm is based on a maximum likelihood estimation (MLE)-like framework and the second is based on a geometric model of the neutron beam within a least squares formulation of the inverse imaging problem. Simulated data for both uniform and Gaussian-shaped source distributions were used for testing to understand the impact of non-uniformities present in neutron beam distributions on the reconstructed images. Results indicate that the model-based reconstruction method matches the resolution of, and improves the contrast over, convolution methods in the presence of non-uniform sources. Additionally, the model-based iterative algorithm provides direct calculation of quantitative transmission values, while the convolution-based methods must be normalized based on known values.

  1. Coded source imaging simulation with visible light

    NASA Astrophysics Data System (ADS)

    Wang, Sheng; Zou, Yubin; Zhang, Xueshuang; Lu, Yuanrong; Guo, Zhiyu

    2011-09-01

    A coded source can increase the neutron flux while keeping a high L/D ratio, which may benefit a neutron imaging system with a low-yield neutron source. Visible light CSI experiments were carried out to test the physical design and reconstruction algorithm. We used a non-mosaic Modified Uniformly Redundant Array (MURA) mask to project the shadow of black/white samples on a screen. A cooled-CCD camera was used to record the image on the screen. Different mask sizes and amplification factors were tested. The correlation, Wiener filter deconvolution and Richardson-Lucy maximum likelihood iteration algorithms were employed to reconstruct the object image from the original projection. The results show that CSI can benefit low-flux neutron imaging with high background noise.
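
    For the reconstruction step, here is a minimal sketch of the Richardson-Lucy approach mentioned above, using scikit-image (a sketch only, not the authors' code; the file names and input arrays are hypothetical stand-ins for the recorded projection and the MURA mask):

      import numpy as np
      from skimage import restoration

      projection = np.load("projection.npy")   # recorded coded image (hypothetical file)
      mask = np.load("mura_mask.npy")          # MURA pattern, 1 = open hole
      psf = mask / mask.sum()                  # treat the normalized mask as the PSF

      # Richardson-Lucy maximum likelihood iteration (30 iterations)
      estimate = restoration.richardson_lucy(projection, psf, 30)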

  2. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  3. Syndrome-source-coding and its universal generalization. [error correcting codes for data compression

    NASA Technical Reports Server (NTRS)

    Ancheta, T. C., Jr.

    1976-01-01

    A method of using error-correcting codes to obtain data compression, called syndrome-source-coding, is described in which the source sequence is treated as an error pattern whose syndrome forms the compressed data. It is shown that syndrome-source-coding can achieve arbitrarily small distortion with the number of compressed digits per source digit arbitrarily close to the entropy of a binary memoryless source. A 'universal' generalization of syndrome-source-coding is formulated which provides robustly effective distortionless coding of source ensembles. Two examples are given, comparing the performance of noiseless universal syndrome-source-coding to (1) run-length coding and (2) Lynch-Davisson-Schalkwijk-Cover universal coding for an ensemble of binary memoryless sources.
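
    To make the mechanism concrete, here is a minimal sketch (using the standard Hamming (7,4) parity-check matrix as the example code, which is an assumption, not one of the codes from the paper): a 7-digit source block is compressed to its 3-digit syndrome, and a decoder outputs the minimum-weight sequence (the coset leader) having that syndrome.

      import numpy as np

      # parity-check matrix of the Hamming (7,4) code over GF(2)
      H = np.array([[1, 0, 1, 0, 1, 0, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])

      x = np.array([0, 0, 0, 0, 1, 0, 0])  # sparse source block, viewed as an error pattern
      syndrome = H @ x % 2                 # 3 compressed digits for 7 source digits
      print(syndrome)                      # [1 0 1]; any block of weight <= 1 is
                                           # recovered exactly from its syndrome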

  4. A distributed programming environment for Ada

    NASA Technical Reports Server (NTRS)

    Brennan, Peter; Mcdonnell, Tom; Mcfarland, Gregory; Timmins, Lawrence J.; Litke, John D.

    1986-01-01

    Despite considerable commercial exploitation of fault tolerance systems, significant and difficult research problems remain in such areas as fault detection and correction. A research project is described which constructs a distributed computing test bed for loosely coupled computers. The project is constructing a tool kit to support research into distributed control algorithms, including a distributed Ada compiler, distributed debugger, test harnesses, and environment monitors. The Ada compiler is being written in Ada and will implement distributed computing at the subsystem level. The design goal is to provide a variety of control mechanisms for distributed programming while retaining total transparency at the code level.

  5. Software development in Ada

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Katz, E. E.

    1985-01-01

    Ada will soon become a part of systems developed for the US Department of Defense. NASA must determine whether it will become part of its environment and particularly whether it will become a part of the space station development. However, there are several issues about Ada which should be considered before this decision is made. One means of considering these issues is the examination of other developments in Ada. Unfortunately, few full scale developments have been completed or made publicly available for observation. Therefore, it will probably be necessary to study an Ada development in a NASA environment. Another means related to the first is the development of Ada metrics which can be used to characterize and evaluate Ada developments. These metrics need not be confined to full scale developments and could be used to evaluate ongoing projects as well. An early development in Ada, some observations from that development, metrics which were developed for use with Ada, and future directions for research into the use of Ada in software development in general and in the NASA Goddard environment in particular are described.

  6. Development of an Ada programming support environment database SEAD (Software Engineering and Ada Database) administration manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.

  7. Ada & the Analytical Engine.

    ERIC Educational Resources Information Center

    Freeman, Elisabeth

    1996-01-01

    Presents a brief history of Ada Byron King, Countess of Lovelace, focusing on her primary role in the development of the Analytical Engine--the world's first computer. Describes the Ada Project (TAP), a centralized World Wide Web site that serves as a clearinghouse for information related to women in computing, and provides a Web address for…

  8. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  9. Adaptive Source Coding Schemes for Geometrically Distributed Integer Alphabets

    NASA Technical Reports Server (NTRS)

    Cheung, K-M.; Smyth, P.

    1993-01-01

    The Gallager and van Voorhis optimal source coding scheme for geometrically distributed non-negative integer alphabets is revisited, and it is shown that the various subcodes in the popular Rice algorithm can be derived from the Gallager and van Voorhis code.
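
    For illustration, a minimal sketch of a Rice subcode with parameter k, the m = 2**k special case of the Golomb codes that the Gallager and van Voorhis scheme optimizes (an illustrative sketch, not code from the paper):

      def rice_encode(n, k):
          """Encode a non-negative integer: unary quotient, then k-bit remainder."""
          q, r = n >> k, n & ((1 << k) - 1)
          return "1" * q + "0" + format(r, "0{}b".format(k))

      # a geometric source favors small n, so most codewords stay short
      print([rice_encode(n, 2) for n in range(6)])
      # ['000', '001', '010', '011', '1000', '1001']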

  10. A Construction of Lossy Source Code Using LDPC Matrices

    NASA Astrophysics Data System (ADS)

    Miyake, Shigeki; Muramatsu, Jun

    Research into applying LDPC code theory, which is used for channel coding, to source coding has received a lot of attention in several research fields such as distributed source coding. In this paper, a source coding problem with a fidelity criterion is considered. Matsunaga et al. and Martinian et al. constructed a lossy code under the conditions of a binary alphabet, a uniform distribution, and a Hamming measure of fidelity criterion. We extend their results and construct a lossy code under the extended conditions of a binary alphabet, a distribution that is not necessarily uniform, and a fidelity measure that is bounded and additive, and show that the code can achieve the optimal rate, the rate-distortion function. By applying a formula for the random walk on a lattice to the analysis of LDPC matrices on Zq, where q is a prime number, we show that results similar to those for the binary alphabet condition also hold for Zq, i.e., for the multiple-alphabet condition.
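
    For reference, the optimal rate mentioned above is the standard rate-distortion function; in the binary special case discussed in the abstract, a Bernoulli(p) source under Hamming distortion has

      R(D) = h(p) - h(D)  for 0 <= D <= min(p, 1 - p),  and  R(D) = 0 otherwise,

    where h(x) = -x log2(x) - (1 - x) log2(1 - x) is the binary entropy function.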

  11. ART/Ada design project, phase 1: Project plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan and schedule for Phase 1 of the Ada based ESBT Design Research Project is described. The main platform for the project is a DEC Ada compiler on VAX mini-computers and VAXstations running the Virtual Memory System (VMS) operating system. The Ada effort and lines of code are given in tabular form. A chart is given of the entire project life cycle.

  12. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1990-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in an increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is explained. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom program and the U.S. Air Force.

  13. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.
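
    The setting can be sketched as follows (illustrative only; the actual ARH construction, which alternates between two Huffman codes, is not reproduced here): for a source with a dominant symbol, runs of that symbol are collapsed into run lengths, which an entropy code then encodes compactly.

      def run_lengths(seq, dominant):
          """Collapse runs of the dominant symbol into (run length, literal) pairs."""
          out, run = [], 0
          for s in seq:
              if s == dominant:
                  run += 1
              else:
                  out.append((run, s))  # run of dominant symbols, then one literal
                  run = 0
          out.append((run, None))       # trailing run, no literal
          return out

      print(run_lengths("aaabaaaaacaa", "a"))
      # [(3, 'b'), (5, 'c'), (2, None)]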

  14. Some techniques in universal source coding and coding for composite sources

    NASA Astrophysics Data System (ADS)

    Wallace, M. S.

    1981-12-01

    We consider three problems in source coding. First, we consider the composite source model. A composite source has a switch, driven by a random process, which selects one of a possible set of subsources. We derive some convergence results for estimation of the switching process, and use these to prove that the entropy of some composite sources may be computed. Some coding techniques for composite sources are also presented and their performance is bounded. Next, we construct a variable-length-to-fixed-length (VL-FL) universal code for a class of unifilar Markov sources. A VL-FL code maps strings of source outputs into fixed-length codewords. We show that the redundancy of the code converges to zero uniformly over the class of sources as the blocklength increases. The code is also universal with respect to the initial state of the source. We compare the performance of this code to FL-VL universal codes. We then consider universal coding for real-valued sources. We show that, given some coding technique for a known source, we may construct a code for a class of sources. We show that this technique works for some classes of memoryless sources, and also for a compact subset of the class of k-th order Gaussian autoregressive sources.

  15. Ada and cyclic runtime scheduling

    NASA Technical Reports Server (NTRS)

    Hood, Philip E.

    1986-01-01

    An important issue that must be faced while introducing Ada into the real time world is efficient and predictable runtime behavior. One of the most effective methods employed during the traditional design of a real time system is the cyclic executive. The role cyclic scheduling might play in an Ada application is examined, in terms of currently available implementations and in terms of implementations that might be developed especially to support real time system development. The cyclic executive solves many of the problems faced by real time designers, resulting in a system for which it is relatively easy to achieve appropriate timing behavior. Unfortunately, a cyclic executive carries with it a very high maintenance penalty over the lifetime of the software that it schedules. Additionally, these cyclic systems tend to be quite fragile when any aspect of the system changes. The findings are presented of an ongoing SofTech investigation into Ada methods for real time system development. The topics covered include a description of the costs involved in using cyclic schedulers, the sources of these costs, and measures for future systems to avoid these costs without giving up the runtime performance of a cyclic system.
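
    For readers unfamiliar with the approach, a minimal sketch of a cyclic executive (frame-based with a fixed schedule; overrun handling is omitted, and the task names are hypothetical):

      import time

      def cyclic_executive(frames, minor_cycle):
          """Run each frame's tasks in order; one pass over frames = one major cycle."""
          while True:
              for frame in frames:
                  start = time.monotonic()
                  for task in frame:        # tasks are statically assigned to frames
                      task()
                  slack = minor_cycle - (time.monotonic() - start)
                  if slack > 0:
                      time.sleep(slack)     # idle until the next minor cycle boundary

      # hypothetical schedule: sensors read every frame, telemetry every other frame
      # cyclic_executive([[read_sensors, control], [read_sensors, telemetry]], 0.05)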

  16. Data processing with microcode designed with source coding

    DOEpatents

    McCoy, James A; Morrison, Steven E

    2013-05-07

    Programming for a data processor to execute a data processing application is provided using microcode source code. The microcode source code is assembled to produce microcode that includes digital microcode instructions with which to signal the data processor to execute the data processing application.

  17. The Astrophysics Source Code Library: http://www.ascl.net/

    NASA Astrophysics Data System (ADS)

    Nemiroff, R. J.; Wallin, J. F.

    1999-05-01

    Submissions are invited to the newly formed Astrophysics Source Code Library (ASCL). Original codes that have generated significant results for any paper published in a refereed astronomy or astrophysics journal are eligible for inclusion in the ASCL. All submissions and personalized correspondence will be handled electronically. The ASCL will not claim copyright on any of its archived codes, but will not archive codes without permission from the copyright owners. ASCL-archived source codes will be indexed on the World Wide Web and made freely available for non-commercial purposes. Many results reported in astrophysics are derived through the writing and implementation of source codes. Small or large, few source codes are ever made publicly available. Because of the effort involved in the creation of scientific codes and their impact in astrophysics, we have created a site which archives and distributes codes which were used in astrophysical publications. Goals in the creation of the ASCL include increasing the availability, falsifiability, and utility of source codes important to astrophysicists. The ASCL is an experimental concept in its formative year; its value will be assessed from author response and user feedback in one year's time.

  18. ART-Ada: An Ada-based expert system tool

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel; Allen, Bradley P.

    1991-01-01

    The Department of Defense mandate to standardize on Ada as the language for software systems development has resulted in increased interest in making expert systems technology readily available in Ada environments. NASA's Space Station Freedom is an example of the large Ada software development projects that will require expert systems in the 1990's. Another large scale application that can benefit from Ada based expert system tool technology is the Pilot's Associate (PA) expert system project for military combat aircraft. The Automated Reasoning Tool-Ada (ART-Ada), an Ada expert system tool, is described. ART-Ada allows applications of a C-based expert system tool called ART-IM to be deployed in various Ada environments. ART-Ada is being used to implement several prototype expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.

  19. Merged Source Word Codes for Efficient, High-Speed Entropy Coding

    SciTech Connect

    Senecal, J; Joy, K; Duchaineau, M

    2002-12-05

    We present our work on fast entropy coders for binary messages utilizing only bit shifts and table lookups. To minimize code table size we limit our code lengths with a novel type of variable-to-variable (VV) length code created from source word merging. We refer to these codes as merged codes. With merged codes it is possible to achieve a desired level of efficiency by adjusting the number of bits read from the source at each step. The most efficient merged codes yield a coder with an inefficiency of 0.4%, relative to the Shannon entropy, in the worst case. On one of our test systems a current implementation of a coder using merged codes has a throughput of 35 Mbytes/sec.

  20. Using ADA Tasks to Simulate Operating Equipment

    NASA Technical Reports Server (NTRS)

    DeAcetis, Louis A.; Schmidt, Oron; Krishen, Kumar

    1990-01-01

    A method of simulating equipment using ADA tasks is discussed. Individual units of equipment are coded as concurrently running tasks that monitor and respond to input signals. This technique has been used in a simulation of the space-to-ground Communications and Tracking subsystem of Space Station Freedom.
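
    A rough analogue of the technique, sketched here in Python threads rather than the Ada tasking the paper actually uses (the equipment name and signals are hypothetical):

      import queue
      import threading

      class Equipment(threading.Thread):
          """One unit of simulated equipment: monitors and responds to input signals."""
          def __init__(self, name):
              super().__init__(daemon=True)
              self.name = name
              self.inbox = queue.Queue()

          def run(self):
              while True:
                  signal = self.inbox.get()   # block until a signal arrives
                  if signal is None:          # shutdown sentinel
                      return
                  print(self.name, "handled", signal)

      transmitter = Equipment("ku_band_transmitter")
      transmitter.start()
      transmitter.inbox.put("carrier_on")
      transmitter.inbox.put(None)
      transmitter.join()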

  1. Coded source neutron imaging with a MURA mask

    NASA Astrophysics Data System (ADS)

    Zou, Y. B.; Schillinger, B.; Wang, S.; Zhang, X. S.; Guo, Z. Y.; Lu, Y. R.

    2011-09-01

    In coded source neutron imaging, the single aperture commonly used in neutron radiography is replaced with a coded mask. Using a coded source can improve the neutron flux at the sample plane when a very high L/D ratio is needed. Coded source imaging is a possible way to reduce the exposure time needed to get a neutron image with a very high L/D ratio. A 17×17 modified uniformly redundant array coded source was tested in this work. There are 144 holes of 0.8 mm diameter on the coded source. The neutron flux from the coded source is as high as from a single 9.6 mm aperture, while its effective L/D is the same as in the case of a 0.8 mm aperture. The Richardson-Lucy maximum likelihood algorithm was used for image reconstruction. Compared to an in-line phase contrast neutron image taken with a 1 mm aperture, it takes much less time for the coded source to get an image of similar quality.
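
    For reference, the standard quadratic-residue construction of a MURA mask reproduces the geometry described above; the following sketch (not the authors' code) builds a p × p MURA for prime p, and for p = 17 it yields exactly the 144 open holes quoted:

      import numpy as np

      def mura(p):
          """MURA mask of prime order p (1 = open hole), by the quadratic-residue rule."""
          qr = {(k * k) % p for k in range(1, p)}        # quadratic residues mod p
          c = [1 if i in qr else -1 for i in range(p)]
          a = np.zeros((p, p), dtype=int)
          for i in range(p):
              for j in range(p):
                  if i == 0:
                      a[i, j] = 0                        # top row closed
                  elif j == 0:
                      a[i, j] = 1                        # left column open
                  else:
                      a[i, j] = 1 if c[i] * c[j] == 1 else 0
          return a

      print(mura(17).sum())   # -> 144 open holes, matching the mask described above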

  2. Source Term Code Package: a user's guide (Mod 1)

    SciTech Connect

    Gieseke, J.A.; Cybulskis, P.; Jordan, H.; Lee, K.W.; Schumacher, P.M.; Curtis, L.A.; Wooton, R.O.; Quayle, S.F.; Kogan, V.

    1986-07-01

    As part of a major reassessment of the release of radioactive materials to the environment (source terms) in severe reactor accidents, a group of state-of-the-art computer codes was utilized to perform extensive analyses. A major product of this source term reassessment effort was a demonstrated methodology for analyzing specific accident situations to provide source term predictions. The computer codes forming this methodology have been upgraded and modified for release and further use. This system of codes has been named the Source Term Code Package (STCP) and is the subject of this user's guide. The guide is intended to provide an understanding of the STCP structure and to facilitate STCP use. The STCP was prepared for operation on a CDC system but is written in FORTRAN-77 to permit transportability. In the current version (Mod 1) of the STCP, the various calculational elements fall into four major categories represented by the codes MARCH3, TRAP-MELT3, VANESA, and NAUA/SPARC/ICEDF. The MARCH3 code is a combination of the MARCH2, CORSOR-M, and CORCON-Mod 2 codes. The TRAP-MELT3 code is a combination of the TRAP-MELT2.0 and MERGE codes.

  3. Procedures and tools for building large Ada systems

    NASA Technical Reports Server (NTRS)

    Hyde, Ben

    1986-01-01

    Some of the problems unique to building a very large Ada system are addressed. This is done through examples from experience. In the winter of 1985 and 1986, Intermetrics bootstrapped the Ada compiler, which was being built over the last few years. This system consists of about one million lines of full Ada. Over the last few years a number of procedures and tools were adopted for managing the life cycle of each of the many parts of an Ada system. Many of these procedures are well known to most system builders: release management, quality assurance testing, and source file revision control. Others are unique to working in an Ada language environment, i.e., recompilation management, Ada program library management, and managing multiple implementations. First a look is taken at how a large Ada system is broken down into pieces. The Ada definition leaves unspecified a number of issues that the system builder must address: versions, subsystems, multiple implementations, and synchronization of branched development paths. Having introduced how the Ada systems are decomposed, a look is taken, via a series of examples, at how the life cycles of those parts are managed. The procedures and tools used to manage the evolution of the system are examined. It is hoped that other Ada system builders can build upon the experience of the last few years.

  4. MATLAB tensor classes for fast algorithm prototyping : source code.

    SciTech Connect

    Bader, Brett William; Kolda, Tamara Gibson

    2004-10-01

    We present the source code for three MATLAB classes for manipulating tensors in order to allow fast algorithm prototyping. A tensor is a multidimensional or N-way array. This is a supplementary report; details on using this code are provided separately in SAND-XXXX.

  5. A LISP-Ada connection

    NASA Technical Reports Server (NTRS)

    Jaworski, Allan; Lavallee, David; Zoch, David

    1987-01-01

    The prototype demonstrates the feasibility of using Ada for expert systems and the implementation of an expert-friendly interface which supports knowledge entry. In the Ford LISP-Ada Connection (FLAC) system LISP and Ada are used in ways which complement their respective capabilities. Future investigation will concentrate on the enhancement of the expert knowledge entry/debugging interface and on the issues associated with multitasking and real-time expert systems implementation in Ada.

  6. Statistical physics, optimization and source coding

    NASA Astrophysics Data System (ADS)

    Zechhina, Riccardo

    2005-06-01

    The combinatorial problem of satisfying a given set of constraints that depend on N discrete variables is a fundamental one in optimization and coding theory. Even for instances of randomly generated problems, the question ``does there exist an assignment to the variables that satisfies all constraints?'' may become extraordinarily difficult to solve in some range of parameters where a glass phase sets in. We shall provide a brief review of the recent advances in the statistical mechanics approach to these satisfiability problems and show how the analytic results have helped to design a new class of message-passing algorithms -- the survey propagation (SP) algorithms -- that can efficiently solve some combinatorial problems considered intractable. As an application, we discuss how the packing properties of clusters of solutions in randomly generated satisfiability problems can be exploited in the design of simple lossy data compression algorithms.

  7. The FORTRAN static source code analyzer program (SAP) system description

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Merwarth, P.; Oneill, M.; Goorevich, C.; Waligora, S.

    1982-01-01

    A source code analyzer program (SAP) designed to assist personnel in conducting studies of FORTRAN programs is described. The SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. The processing performed by SAP and the routines, COMMON blocks, and files used by SAP are described. The system generation procedure for SAP is also presented.

  8. AN ADA NAMELIST PACKAGE

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    The Ada Namelist Package, developed for the Ada programming language, enables a calling program to read and write FORTRAN-style namelist files. A namelist file consists of any number of assignment statements in any order. Features of the Ada Namelist Package are: the handling of any combination of user-defined types; the ability to read vectors, matrices, and slices of vectors and matrices; the handling of mismatches between variables in the namelist file and those in the programmed list of namelist variables; and the ability to avoid searching the entire input file for each variable. The principal user benefits of this software are the following: the ability to write namelist-readable files, the ability to detect most file errors in the initialization phase, a package organization that reduces the number of instantiated units to a few packages rather than to many subprograms, a reduced number of restrictions, and an increased execution speed. The Ada Namelist reads data from an input file into variables declared within a user program. It then writes data from the user program to an output file, printer, or display. The input file contains a sequence of assignment statements in arbitrary order. The output is in namelist-readable form. There is a one-to-one correspondence between namelist I/O statements executed in the user program and variables read or written. Nevertheless, in the input file, mismatches are allowed between assignment statements in the file and the namelist read procedure statements in the user program. The Ada Namelist Package itself is non-generic. However, it has a group of nested generic packages following the non-generic opening portion. The opening portion declares a variety of user-accessible constants, variables and subprograms. The subprograms include procedures for initializing namelists for reading and for reading and writing strings, as well as functions for analyzing the content of the current dataset and diagnosing errors. Two nested
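
    Since a namelist file is just assignment statements in arbitrary order, the reading side can be sketched in a few lines (illustrative only; the actual package is written in Ada and also handles user-defined types, vectors, matrices, and slices):

      import re

      def read_namelist(text):
          """Parse 'name = value' lines, in any order, into a dictionary."""
          values = {}
          for line in text.splitlines():
              m = re.match(r"\s*(\w+)\s*=\s*(.+?)\s*$", line)
              if m:
                  values[m.group(1)] = m.group(2)
          return values

      print(read_namelist("dt = 0.25\nmode = 'FINE'"))
      # {'dt': '0.25', 'mode': "'FINE'"}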

  9. Toward the Automated Generation of Components from Existing Source Code

    SciTech Connect

    Quinlan, D; Yi, Q; Kumfert, G; Epperly, T; Dahlgren, T; Schordan, M; White, B

    2004-12-02

    A major challenge to achieving widespread use of software component technology in scientific computing is an effective migration strategy for existing, or legacy, source code. This paper describes initial work and challenges in automating the identification and generation of components using the ROSE compiler infrastructure and the Babel language interoperability tool. Babel enables calling interfaces expressed in the Scientific Interface Definition Language (SIDL) to be implemented in, and called from, an arbitrary combination of supported languages. ROSE is used to build specialized source-to-source translators that (1) extract a SIDL interface specification from information implicit in existing C++ source code and (2) transform Babel's output to include dispatches to the legacy code.

  10. Simulation of the space station information system in Ada

    NASA Technical Reports Server (NTRS)

    Spiegel, James R.

    1986-01-01

    The Flexible Ada Simulation Tool (FAST) is a discrete event simulation language which is written in Ada. FAST has been used to simulate a number of options for ground data distribution of Space Station payload data. The fact that the Ada language is used for implementation has allowed a number of useful interactive features to be built into FAST and has facilitated quick enhancement of its capabilities to support new modeling requirements. General simulation concepts are discussed, along with how these concepts are implemented in FAST. The FAST design is discussed, and it is pointed out how the use of the Ada language enabled the development of some significant advantages over classical FORTRAN-based simulation languages. The advantages discussed are in the areas of efficiency, ease of debugging, and ease of integrating user code. The specific Ada language features which enable these advances are discussed.
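
    The core of any discrete event simulator of this kind is an event list ordered by simulated time; a minimal sketch of that loop in Python (FAST itself is written in Ada, and the example event is hypothetical):

      import heapq
      import itertools

      events = []
      tiebreak = itertools.count()   # keeps heap entries comparable at equal times

      def schedule(time, action):
          heapq.heappush(events, (time, next(tiebreak), action))

      def run(until):
          while events and events[0][0] <= until:
              now, _, action = heapq.heappop(events)
              action(now)            # an action may schedule further events

      schedule(2.0, lambda t: print("payload packet delivered at t =", t))
      run(until=10.0)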

  11. Alma Flor Ada: Writer, Translator, Storyteller.

    ERIC Educational Resources Information Center

    Brodie, Carolyn S.

    2003-01-01

    Discusses the work of children's author Alma Flor Ada, a Cuban native who has won awards honoring Latino writers and illustrators. Includes part of an interview that explores her background, describes activity ideas, and presents a bibliography of works written by her (several title published in both English and Spanish) as well as sources of…

  12. A preprocessor for FORTRAN source code produced by reduce

    NASA Astrophysics Data System (ADS)

    Kaneko, Toshiaki; Kawabata, Setsuya

    1989-09-01

    For estimating total cross sections and various spectra for complicated processes in high energy physics, the most time consuming part is numerical integration over the phase volume. When a FORTRAN source code for the integrand is produced by REDUCE, it is often not only too long but also insufficiently reduced to be optimized by a FORTRAN compiler. A program package called SPROC has been developed to convert FORTRAN source code to a more optimized form and to divide the code into subroutines whose lengths are short enough for FORTRAN compilers. It can also generate vectorizable code, which can achieve high efficiency on vector computers. The output is given in a suitable form for the numerical integration package BASES and its vector computer version VBASES. With this improvement, the CPU time for integration is shortened by a factor of about two on a scalar computer and by several times on a vector computer.

  13. Streamlined Genome Sequence Compression using Distributed Source Coding

    PubMed Central

    Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel

    2014-01-01

    We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require a heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol adaptively picks either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552

  14. AdaNET phase 0 support for the AdaNET Dynamic Software Inventory (DSI) management system prototype. Catalog of available reusable software components

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The Ada Software Repository is a public-domain collection of Ada software and information. The Ada Software Repository is one of several repositories located on the SIMTEL20 Defense Data Network host computer at White Sands Missile Range, and has been available to any host computer on the network since 26 November 1984. This repository provides a free source for Ada programs and information. The Ada Software Repository is divided into several subdirectories. These directories are organized by topic, and their names and a brief overview of their topics are provided. The Ada Software Repository on SIMTEL20 serves two basic roles: to promote the exchange and use (reusability) of Ada programs and tools (including components) and to promote Ada education.

  15. Encoding of multi-alphabet sources by binary arithmetic coding

    NASA Astrophysics Data System (ADS)

    Guo, Muling; Oka, Takahumi; Kato, Shigeo; Kajiwara, Hiroshi; Kawamura, Naoto

    1998-12-01

    When encoding a multi-alphabet source, the symbol sequence can be encoded directly by a multi-alphabet arithmetic encoder, or it can first be converted into several binary sequences, each of which is then encoded by a binary arithmetic encoder such as the L-R arithmetic coder. Arithmetic coding, however, requires arithmetic operations for each symbol and is computationally heavy. In this paper, a binary representation method using a Huffman tree is introduced to reduce the number of arithmetic operations, and a new probability approximation for L-R arithmetic coding is further proposed to improve the coding efficiency when the probability of the LPS (Least Probable Symbol) is near 0.5. Simulation results show that the proposed scheme has high coding efficiency and can reduce the number of coding symbols.
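
    The binarization step can be sketched as follows (an illustrative sketch; the paper pairs it with an L-R binary arithmetic coder, which is omitted here): build a Huffman tree over the multi-alphabet symbols, then hand each symbol's 0/1 branch path to the binary coder.

      import heapq
      import itertools

      def huffman_paths(freqs):
          """Map each symbol to its 0/1 path in a Huffman tree built from frequencies."""
          tiebreak = itertools.count()
          heap = [(f, next(tiebreak), {s: ""}) for s, f in freqs.items()]
          heapq.heapify(heap)
          while len(heap) > 1:
              f0, _, left = heapq.heappop(heap)
              f1, _, right = heapq.heappop(heap)
              merged = {s: "0" + p for s, p in left.items()}
              merged.update({s: "1" + p for s, p in right.items()})
              heapq.heappush(heap, (f0 + f1, next(tiebreak), merged))
          return heap[0][2]

      print(huffman_paths({"a": 5, "b": 2, "c": 1, "d": 1}))
      # {'b': '00', 'c': '010', 'd': '011', 'a': '1'} -- path lengths 2, 3, 3, 1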

  16. Techniques and implementation of the embedded rule-based expert system using Ada

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Jones, Robert E.

    1991-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  17. Integrity and security in an Ada runtime environment

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    A review is provided of the Formal Methods group discussions. It was stated that integrity is not a pure mathematical dual of security. The input data is part of the integrity domain. The group provided a roadmap for research. One item of the roadmap and the final position statement are closely related to the space shuttle and space station. The group's position is to use a safe subset of Ada. Examples of safe subsets include the Army Secure Operating System and the Penelope Ada verification tool. It is recommended that a conservative attitude is required when writing Ada code for life and property critical systems.

  18. AdaNET executive summary

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The goal of AdaNET is to transfer existing and emerging software engineering technology from the Federal government to the private sector. The views and perspectives of the current project participants on long and short term goals for AdaNET; organizational structure; resources and returns; summary of identified AdaNET services; and the summary of the organizational model currently under discussion are presented.

  19. Ada training evaluation and recommendation

    NASA Technical Reports Server (NTRS)

    Murphy, Robert; Stark, Michael

    1987-01-01

    This paper documents the Ada training experiences and recommendations of the Gamma Ray Observatory dynamics simulator Ada development team. A two month Ada training program for software developers is recommended which stresses the importance of teaching design methodologies early, as well as the use of certain training aids such as videotaped lectures and computer-aided instruction. Furthermore, a separate training program for managers is recommended, so that they may gain a better understanding of modified review products and resource allocation associated with Ada projects.

  20. AdaNET research project

    NASA Technical Reports Server (NTRS)

    Digman, R. Michael

    1988-01-01

    The components necessary for the success of the commercialization of an Ada Technology Transition Network are reported in detail. The organizational plan presents the planned structure for services development and technical transition of AdaNET services to potential user communities. The Business Plan is the operational plan for the AdaNET service as a commercial venture. The Technical Plan is the plan from which the AdaNET can be designed including detailed requirements analysis. Also contained is an analysis of user fees and charges, and a proposed user fee schedule.

  1. Using cryptology models for protecting PHP source code

    NASA Astrophysics Data System (ADS)

    Jevremović, Aleksandar; Ristić, Nenad; Veinović, Mladen

    2013-10-01

    Protecting PHP scripts from unwanted use, copying and modification is a big issue today. Existing solutions at the source code level mostly work as obfuscators; they are free, but they do not provide any serious protection. Solutions that encode opcode are more secure, but they are commercial and require a closed-source, proprietary PHP interpreter extension. Additionally, encoded opcode is not compatible with future versions of interpreters, which implies re-buying encoders from the authors. Finally, if the extension source code is compromised, all scripts encoded with that solution are compromised too. In this paper, we present a new model for a free and open-source PHP script protection solution. The protection level provided by the proposed solution is equal to that of commercial solutions. The model is based on conclusions drawn from using standard cryptology models to analyze the strengths and weaknesses of the existing solutions, when script protection is viewed as a secure communication channel in cryptology.

  2. Coded source neutron imaging at the PULSTAR reactor

    SciTech Connect

    Xiao, Ziyu; Mishra, Kaushal; Hawari, Ayman; Bingham, Philip R; Bilheux, Hassina Z; Tobin Jr, Kenneth William

    2011-01-01

    A neutron imaging facility is located on beam-tube No.5 of the 1-MW PULSTAR reactor at North Carolina State University. An investigation of high resolution imaging using the coded source imaging technique has been initiated at the facility. Coded imaging uses a mosaic of pinholes to encode an aperture, thus generating an encoded image of the object at the detector. To reconstruct the image data received by the detector, the corresponding decoding patterns are used. The optimized design of the coded mask is critical for the performance of this technique and will depend on the characteristics of the imaging beam. In this work, a 34 x 38 uniformly redundant array (URA) coded aperture system is studied for application at the PULSTAR reactor neutron imaging facility. The URA pattern was fabricated on a 500 μm gadolinium sheet. Simulations and experiments with a pinhole object have been conducted using the Gd URA and the optimized beam line.

  3. Distributed and parallel Ada and the Ada 9X recommendations

    NASA Astrophysics Data System (ADS)

    Volz, R. A.; Theriault, R.; Waldrop, R.; Goldsack, S. J.; Holzbacher-Valero, A.

    1994-06-01

    In modern software systems development, distributed and parallel systems are of increasing importance. Much research has been done to investigate the distribution of Ada programs across a set of processors, both in loosely-coupled distributed systems and in more tightly-coupled parallel systems. To this point, however, there has been something of an idea that the support needed for distributed systems differs from that required for parallel systems. In this paper, the authors first discuss the support requirements for distributed and parallel Ada programs, and point out that the requirements for these two areas have more in common than may have been previously thought. Next, the authors discuss AdaPT (Ada plus ParTitions), a set of extensions to Ada to support distributed and fault-tolerant systems. AdaPT is used as a reference in the further discussion of the previously identified requirements for distributed systems. After this, the authors provide an in-depth discussion of the Ada 9X Distributed Systems Annex, as presented by the Ada 9X mapping/revision team in the version 5.0 draft Language Reference Manual, and the extent to which this annex fulfils the previously identified requirements.

  4. Transforming AdaPT to Ada9x

    NASA Technical Reports Server (NTRS)

    Goldsack, Stephen J.; Holzbach-Valero, A. A.; Volz, Richard A.; Waldrop, Raymond S.

    1993-01-01

    How the concepts of AdaPT can be transformed into programs using the object oriented features proposed in the preliminary mapping for Ada9x is described. Emphasizing, as they do, the importance of data types as units of program, these features match well with the development of partitions as translations into Abstract Data Types, which was exploited in the Ada83 translation covered in report R3. By providing a form of polymorphic type, the Ada9x version also gives support for the conformant partition idea, which could be achieved in Ada83 only by using UNCHECKED CONVERSIONS. It is assumed that the reader understands AdaPT itself, but the translation into Ada83 is briefly reviewed by applying it to a small example. This is then used to show how the same translation would be achieved in the 9x version. It is important to appreciate that the distribution features which are proposed in the current mapping are not used or discussed in any detail, as they are not well matched to the AdaPT approach. Critical evaluation and comparison of these approaches are given in a separate report.

  5. A Comparison of Source Code Plagiarism Detection Engines

    ERIC Educational Resources Information Center

    Lancaster, Thomas; Culwin, Fintan

    2004-01-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and…

  6. MATHEMATICAL MODEL OF ELECTROSTATIC PRECIPITATION (REVISION 3): SOURCE CODE

    EPA Science Inventory

    This tape contains the source code (FORTRAN) for Revision 3 of the Mathematical Model of Electrostatic Precipitation. Improvements found in Revision 3 of the model include a new method of calculating the solutions to the electric field equations, a dynamic method for calculating ...

  7. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    ERIC Educational Resources Information Center

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

    Nowadays, computer programming is increasingly necessary in program design courses in college education. However, the trick of plagiarizing plus a little modification appears in some students' homework. It is not easy for teachers to judge whether source code has been plagiarized. Traditional detection algorithms cannot fit this…

  8. Secondary neutron source modelling using MCNPX and ALEPH codes

    NASA Astrophysics Data System (ADS)

    Trakas, Christos; Kerkar, Nordine

    2014-06-01

    Monitoring the subcritical state and divergence of reactors requires the presence of neutron sources. Mainly secondary neutrons from these sources feed the ex-core detectors (SRD, Source Range Detector), whose counting rate is correlated with the level of subcriticality of the reactor. In cycle 1, primary neutrons are provided by sources activated outside of the reactor (e.g. Cf252); part of this source can be used for the divergence of cycle 2 (not systematically). A second family of neutron sources is used for the second cycle: the spontaneous neutrons of actinides produced after irradiation of fuel in the first cycle. In most reactors, neither family of sources is sufficient to efficiently monitor the divergence of the second and subsequent cycles. The secondary sources cluster (SSC) fulfils this role. In the present case, the SSC [Sb, Be], after activation in the first cycle (production of Sb124, unstable), produces in subsequent cycles a photo-neutron source by gamma (from Sb124)-neutron (on Be9) reaction. This paper presents the model of the process between irradiation in cycle 1 and the cycle 2 results for SRD counting rate at the beginning of cycle 2, using the MCNPX code and the depletion chain ALEPH-V1 (a coupling of the MCNPX and ORIGEN codes). The results of this simulation are compared with two experimental results from the PWR 1450 MWe-N4 reactors. A good agreement is observed between these results and the simulations. The subcriticality of the reactors is about -15,000 pcm. Discrepancies in the SRD counting rate between calculations and measurements are on the order of 10%, lower than the combined uncertainty of measurements and code simulation. This comparison validates the AREVA methodology, which provides a best-estimate SRD counting rate for cycle 2 and subsequent cycles and allows optimizing the position of the SSC according to the geographic location of the sources, the main parameter for optimal monitoring of subcritical states.

  9. Distributed and parallel Ada and the Ada 9X recommendations

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Goldsack, Stephen J.; Theriault, R.; Waldrop, Raymond S.; Holzbacher-Valero, A. A.

    1992-01-01

    Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada 9X, will become the new standard sometime in the 1990s. It is intended that Ada 9X should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are composed of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.

  10. Codes for sound-source location in nontonotopic auditory cortex.

    PubMed

    Middlebrooks, J C; Xu, L; Eddins, A C; Green, D M

    1998-08-01

    We evaluated two hypothetical codes for sound-source location in the auditory cortex. The topographical code assumed that single neurons are selective for particular locations and that sound-source locations are coded by the cortical location of small populations of maximally activated neurons. The distributed code assumed that the responses of individual neurons can carry information about locations throughout 360 degrees of azimuth and that accurate sound localization derives from information that is distributed across large populations of such panoramic neurons. We recorded from single units in the anterior ectosylvian sulcus area (area AES) and in area A2 of alpha-chloralose-anesthetized cats. Results obtained in the two areas were essentially equivalent. Noise bursts were presented from loudspeakers spaced in 20 degrees intervals of azimuth throughout 360 degrees of the horizontal plane. Spike counts of the majority of units were modulated >50% by changes in sound-source azimuth. Nevertheless, sound-source locations that produced greater than half-maximal spike counts often spanned >180 degrees of azimuth. The spatial selectivity of units tended to broaden and, often, to shift in azimuth as sound pressure levels (SPLs) were increased to a moderate level. We sometimes saw systematic changes in spatial tuning along segments of electrode tracks as long as 1.5 mm but such progressions were not evident at higher sound levels. Moderate-level sounds presented anywhere in the contralateral hemifield produced greater than half-maximal activation of nearly all units. These results are not consistent with the hypothesis of a topographic code. We used an artificial-neural-network algorithm to recognize spike patterns and, thereby, infer the locations of sound sources. Network input consisted of spike density functions formed by averages of responses to eight stimulus repetitions. Information carried in the responses of single units permitted reasonable estimates of sound-source

  11. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  12. ADA Guide for Small Businesses.

    ERIC Educational Resources Information Center

    Department of Justice, Washington, DC. Civil Rights Div.

    This guide presents an informal overview of some basic Americans with Disabilities Act (ADA) requirements for small businesses that provide goods or services to the public. References to key sections of the regulations or other information are included. The first section describes the ADA briefly. Section two lists the 12 categories of public…

  13. ADAS: Asiago-DLR Asteroid Survey

    NASA Astrophysics Data System (ADS)

    Barbieri, C.; Bertini, I.; Magrin, S.; Salvadori, L.; Calvani, M.; Claudi, R.; Pignata, G.; Hahn, G.; Mottola, S.; Hoffmann, M.

    The Asiago-DLR Asteroid Survey (ADAS) is a joint program of the Department of Astronomy and Astronomical Observatory of Padova and DLR Berlin, dedicated to the search for asteroids. The Minor Planet Center has assigned ADAS the survey code 209. The project has been carried out since the end of December 2000 with the S67/92cm telescope at Asiago - Cima Ekar, equipped with the SCAM-1 camera of DLR, in Time Delay Integration mode, in a strip from -5° to +15° around the celestial equator. The camera has a front-illuminated Loral chip of 2048x2048 pixels of 15 μm each, covering a field of 49'x49' with a resolution of 1.4'' per pixel. This paper presents the main results obtained up to March 15, 2002, when the telescope was closed for a complete overhaul. ADAS will presumably resume at the end of June 2002.

  14. Source-Code Instrumentation and Quantification of Events

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.

  15. The Need for Vendor Source Code at NAS. Revised

    NASA Technical Reports Server (NTRS)

    Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    The Numerical Aerodynamic Simulation (NAS) Facility has a long-standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS needs to maintain support capabilities when vendors are not able; to diagnose and remedy hardware or software problems where applicable; and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.

  16. Verification test calculations for the Source Term Code Package

    SciTech Connect

    Denning, R S; Wooton, R O; Alexander, C A; Curtis, L A; Cybulskis, P; Gieseke, J A; Jordan, H; Lee, K W; Nicolosi, S L

    1986-07-01

    The purpose of this report is to demonstrate the reasonableness of the Source Term Code Package (STCP) results. Hand calculations have been performed spanning a wide variety of phenomena within the context of a single accident sequence, a loss of all ac power with late containment failure, in the Peach Bottom (BWR) plant, and compared with STCP results. The report identifies some of the limitations of the hand calculation effort. The processes involved in a core meltdown accident are complex and coupled. Hand calculations by their nature must deal with gross simplifications of these processes. Their greatest strength is as an indicator that a computer code contains an error, for example that it doesn't satisfy basic conservation laws, rather than in showing the analysis accurately represents reality. Hand calculations are an important element of verification but they do not satisfy the need for code validation. The code validation program for the STCP is a separate effort. In general the hand calculation results show that models used in the STCP codes (e.g., MARCH, TRAP-MELT, VANESA) obey basic conservation laws and produce reasonable results. The degree of agreement and significance of the comparisons differ among the models evaluated. 20 figs., 26 tabs.

  17. Interest and responsibility of ADA in dental licensure.

    PubMed

    Jones, T Howard; Neumann, Laura M; Haglund, Lois J

    2006-03-01

    Initial licensure is a critical milestone and point of entry to the profession; it should go without saying that the organization that represents more than 70% of professionally active dentists would care deeply about the process that determines the character of its future and defines its image in the eyes of the public. The American Dental Association's (ADA) documented history of activity and leadership on licensure issues and the organization's guiding documents (Strategic Plan, Current Policies, Principles of Ethics and Code of Professional Conduct and Constitution and Bylaws) all lend credence to the Association's role in the licensure process. ADA members, other dental organizations, private and governmental agencies, and the public recognize the ADA as an authority on matters relating to dentistry. These circumstances comprise the best available evidence supporting the important role of the ADA in facilitating communication, collaboration and consensus-building in the continuous enhancement of the licensure process to meet the needs of all stakeholders. PMID:17138418

  18. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
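
    As a rough illustration of the figure-of-complexity computation described above, the sketch below sums weighted statement counts for a single module. The statistic names, counts, and weights are invented stand-ins for the entries SAP actually reads from its keyword and statistical weight files.

      with Ada.Text_IO; use Ada.Text_IO;

      procedure Complexity_Sketch is
         type Statistic is (Assignments, Branches, Loops, Gotos, Comments);
         type Counts  is array (Statistic) of Natural;
         type Weights is array (Statistic) of Float;

         Module_Counts : constant Counts  := (120, 18, 9, 2, 45);
         Weight_Table  : constant Weights := (1.0, 3.0, 4.0, 10.0, 0.0);

         Figure : Float := 0.0;
      begin
         --  Figure of complexity: weighted sum over all gathered statistics.
         for S in Statistic loop
            Figure := Figure + Weight_Table (S) * Float (Module_Counts (S));
         end loop;
         Put_Line ("Figure of complexity:" & Float'Image (Figure));
      end Complexity_Sketch;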

  19. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  20. Praxis - An alternative to Ada

    SciTech Connect

    Holloway, F.W.; Sherman, T.A.

    1987-08-01

    This report describes Praxis, a modern, complete, block structured, strongly typed, programming language, comparable to Ada, and with special emphasis toward meeting systems programming requirements on all level machines. Praxis is considered as a possible alternative to Ada in certain applications. Praxis has been used since 1980 on the distributed control system for the Nova high energy laser at Lawrence Livermore National Laboratory. A description of the intended applications, the history of development, and an overview of the features of the language with comparisons to the Pascal and Ada languages, are given. The features are described in four categories: general appearance, power, insurance of freedom from errors, and manageability.

  1. Praxis: An alternative to Ada

    SciTech Connect

    Holloway, F.W.; Sherman, T.A.

    1987-05-13

    This report describes Praxis, a modern, complete, block structured, strongly typed, programming language, comparable to Ada, and with special emphasis toward meeting systems programming requirements on all level machines. Praxis is considered as a possible alternative to Ada in certain applications. Praxis has been used since 1980 on the distributed control system for the Nova high energy laser at Lawrence Livermore National Laboratory. A description of the intended applications, the history of development, and an overview of the features of the language with comparisons to the Pascal and Ada languages, are given. The features are described in four categories: general appearance, power, insurance of freedom from errors, and manageability.

  2. SDI satellite autonomy using AI and Ada

    NASA Technical Reports Server (NTRS)

    Fiala, Harvey E.

    1990-01-01

    The use of Artificial Intelligence (AI) and the programming language Ada to help a satellite recover from selected failures that could lead to mission failure is described. An unmanned satellite will have a separate AI subsystem running in parallel with the normal satellite subsystems. A satellite monitoring subsystem (SMS), under the control of a blackboard system, will continuously monitor selected satellite subsystems to become alert to any actual or potential problems. In the case of loss of communications with the earth or the home base, the satellite will go into a survival mode to reestablish communications with the earth. The use of an AI subsystem in this manner would have avoided the tragic loss of the two recent Soviet probes that were sent to investigate the planet Mars and its moons. The blackboard system works in conjunction with an SMS and a reconfiguration control subsystem (RCS). It can be shown to be an effective way for one central control subsystem to monitor and coordinate the activities and loads of many interacting subsystems that may or may not contain redundant and/or fault-tolerant elements. The blackboard system will be coded in Ada using tools such as the ABLE development system and the Ada Production system.

  3. Robust video transmission with distributed source coded auxiliary channel.

    PubMed

    Wang, Jiajun; Majumdar, Abhik; Ramchandran, Kannan

    2009-12-01

    We propose a novel solution to the problem of robust, low-latency video transmission over lossy channels. Predictive video codecs, such as MPEG and H.26x, are very susceptible to prediction mismatch between encoder and decoder or "drift" when there are packet losses. These mismatches lead to a significant degradation in the decoded quality. To address this problem, we propose an auxiliary codec system that sends additional information alongside an MPEG or H.26x compressed video stream to correct for errors in decoded frames and mitigate drift. The proposed system is based on the principles of distributed source coding and uses the (possibly erroneous) MPEG/H.26x decoder reconstruction as side information at the auxiliary decoder. The distributed source coding framework depends upon knowing the statistical dependency (or correlation) between the source and the side information. We propose a recursive algorithm to analytically track the correlation between the original source frame and the erroneous MPEG/H.26x decoded frame. Finally, we propose a rate-distortion optimization scheme to allocate the rate used by the auxiliary encoder among the encoding blocks within a video frame. We implement the proposed system and present extensive simulation results that demonstrate significant gains in performance both visually and objectively (on the order of 2 dB in PSNR over forward error correction based solutions and 1.5 dB in PSNR over intrarefresh based solutions for typical scenarios) under tight latency constraints. PMID:19703801

  4. Ada Linear-Algebra Program

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.; Lawson, C. L.

    1988-01-01

    Routines provided for common scalar, vector, matrix, and quaternion operations. Computer program extends Ada programming language to include linear-algebra capabilities similar to HAL/S programming language. Designed for such avionics applications as software for Space Station.
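
    The sketch below shows the kind of operation such a linear-algebra extension provides, using a quaternion product as the example. The type layout and operator style are illustrative assumptions, not the actual JPL routine interfaces.

      with Ada.Text_IO; use Ada.Text_IO;

      procedure Quaternion_Sketch is
         type Quaternion is record
            W, X, Y, Z : Float;
         end record;

         --  Hamilton product; a full library would add conjugate, norm,
         --  and vector-rotation operations in the same style.
         function "*" (L, R : Quaternion) return Quaternion is
           ((W => L.W * R.W - L.X * R.X - L.Y * R.Y - L.Z * R.Z,
             X => L.W * R.X + L.X * R.W + L.Y * R.Z - L.Z * R.Y,
             Y => L.W * R.Y - L.X * R.Z + L.Y * R.W + L.Z * R.X,
             Z => L.W * R.Z + L.X * R.Y - L.Y * R.X + L.Z * R.W));

         A : constant Quaternion := (0.0, 1.0, 0.0, 0.0);  --  i
         B : constant Quaternion := (0.0, 0.0, 1.0, 0.0);  --  j
         C : constant Quaternion := A * B;                 --  expect k
      begin
         Put_Line ("i*j = (" & Float'Image (C.W) & "," & Float'Image (C.X)
                   & "," & Float'Image (C.Y) & "," & Float'Image (C.Z) & ")");
      end Quaternion_Sketch;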

  5. Documentation generator application for MatLab source codes

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    The UML, which is a complex system modeling and description technology, has recently been expanding its uses in the field of formalization and algorithmic approaches to systems like multiprocessor photonic, optoelectronic and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes the realization of an application for documenting MatLab source codes. A novel solution is presented, based on the Doxygen program, which is available under a free license with accessible source code. Bison and Flex were used as supporting tools for parser building. Practical results of the documentation generator are presented; the program was applied to exemplary MatLab codes. The documentation generator application is used for the design of large optoelectronic and electronic measurement and control systems. The work consists of three parts, describing the following components of the documentation generator for photonic and electronic systems: the concept, the MatLab application and the VHDL application. This is part two, which describes the MatLab application. MatLab is used for description of the measured phenomena.

  6. Energy efficient wireless sensor networks using asymmetric distributed source coding

    NASA Astrophysics Data System (ADS)

    Rao, Abhishek; Kulkarni, Murlidhar

    2013-01-01

    Wireless Sensor Networks (WSNs) are networks of sensor nodes deployed over a geographical area to perform a specific task. WSNs pose many design challenges; energy conservation is one such design issue. In the literature, a wide range of solutions addressing this issue have been proposed. Generally WSNs are densely deployed, so nodes in close proximity are likely to sense the same data. Transmission of such non-aggregated data may lead to inefficient energy management. Hence data fusion has to be performed at the nodes so as to combine the redundant information into a single data unit. Distributed source coding is an efficient approach to achieving this task. In this paper an attempt has been made at modeling such a system. Various energy efficient codes were considered for the analysis, and system performance has been evaluated in terms of energy efficiency.

  7. Development of parallel DEM for the open source code MFIX

    SciTech Connect

    Gopalakrishnan, Pradeep; Tafti, Danesh

    2013-02-01

    The paper presents the development of a parallel Discrete Element Method (DEM) solver for the open source code, Multiphase Flow with Interphase eXchange (MFIX) based on the domain decomposition method. The performance of the code was evaluated by simulating a bubbling fluidized bed with 2.5 million particles. The DEM solver shows strong scalability up to 256 processors with an efficiency of 81%. Further, to analyze weak scaling, the static height of the fluidized bed was increased to hold 5 and 10 million particles. The results show that global communication cost increases with problem size while the computational cost remains constant. Further, the effects of static bed height on the bubble hydrodynamics and mixing characteristics are analyzed.
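
    The quoted strong-scaling figure follows directly from speedup divided by processor count. A minimal sketch, with invented wall-clock times chosen to reproduce the reported 81% efficiency at 256 processors:

      with Ada.Text_IO; use Ada.Text_IO;

      procedure Scaling_Sketch is
         T_Serial   : constant Float := 1000.0;  --  wall time on 1 processor (s)
         T_Parallel : constant Float := 4.82;    --  wall time on 256 processors (s)
         N          : constant Float := 256.0;
         --  Strong-scaling efficiency: speedup divided by processor count.
         Efficiency : constant Float := (T_Serial / T_Parallel) / N;
      begin
         Put_Line ("Strong-scaling efficiency:" & Float'Image (Efficiency));
      end Scaling_Sketch;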

  8. Users manual for doctext: Producing documentation from C source code

    SciTech Connect

    Gropp, W.

    1995-03-01

    One of the major problems that software library writers face, particularly in a research environment, is the generation of documentation. Producing good, professional-quality documentation is tedious and time consuming. Often, no documentation is produced. For many users, however, much of the need for documentation may be satisfied by a brief description of the purpose and use of the routines and their arguments. Even for more complete, hand-generated documentation, this information provides a convenient starting point. We describe here a tool that may be used to generate documentation about programs written in the C language. It uses a structured comment convention that preserves the original C source code and does not require any additional files. The markup language is designed to be an almost invisible structured comment in the C source code, retaining readability in the original source. Documentation in a form suitable for the Unix man program (nroff), LaTeX, and the World Wide Web can be produced.
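
    A minimal sketch of the extraction idea: scan a C file and emit only the text inside structured comments. The "/*D ... D*/" markers and file name used here are hypothetical stand-ins, not doctext's actual markup convention.

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Strings.Fixed; use Ada.Strings.Fixed;

      procedure Extract_Docs is
         F        : File_Type;
         In_Block : Boolean := False;
      begin
         Open (F, In_File, "example.c");
         while not End_Of_File (F) loop
            declare
               Line : constant String := Get_Line (F);
            begin
               if In_Block then
                  if Index (Line, "D*/") > 0 then
                     In_Block := False;       --  end of structured comment
                  else
                     Put_Line (Line);         --  emit documentation text
                  end if;
               elsif Index (Line, "/*D") > 0 then
                  In_Block := True;           --  start of structured comment
               end if;
            end;
         end loop;
         Close (F);
      end Extract_Docs;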

  9. Compiling knowledge-based systems from KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights into Ada as an artificial intelligence programming language, potential solutions to some of the engineering difficulties encountered in the early work, and inspiration for future system development.

  10. Utilities for master source code distribution: MAX and Friends

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    MAX is a program for the manipulation of FORTRAN master source code (MSC). This is a technique by which one maintains one and only one master copy of a FORTRAN program under a program development system, which for MAX is assumed to be VAX/VMS. The master copy is not intended to be directly compiled. Instead it must be pre-processed by MAX to produce compilable instances. These instances may correspond to different code versions (for example, double precision versus single precision), different machines (for example, IBM, CDC, Cray) or different operating systems (i.e., VAX/VMS versus VAX/UNIX). The advantage of using a master source is more pronounced in complex application programs that are developed and maintained over many years and are to be transported and executed on several computer environments. The version lag problem that plagues many such programs is avoided by this approach. MAX is complemented by several auxiliary programs that perform nonessential functions. The ensemble is collectively known as MAX and Friends. All of these programs, including MAX, are executed as foreign VAX/VMS commands and can easily be hidden in customized VMS command procedures.
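
    A minimal sketch of the master-source idea: the preprocessor copies through only the lines selected for the target instance. The "*IF name" / "*END" directives and the file name are hypothetical, not MAX's actual directive syntax.

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Strings.Fixed; use Ada.Strings.Fixed;

      procedure Preprocess_Sketch is
         Target   : constant String := "VAX";
         F        : File_Type;
         Emitting : Boolean := True;
      begin
         Open (F, In_File, "master.f");
         while not End_Of_File (F) loop
            declare
               Line : constant String := Get_Line (F);
            begin
               if Index (Line, "*IF ") = 1 then
                  --  Keep the guarded region only for the selected target.
                  Emitting := Index (Line, Target) > 0;
               elsif Index (Line, "*END") = 1 then
                  Emitting := True;
               elsif Emitting then
                  Put_Line (Line);   --  part of the compilable instance
               end if;
            end;
         end loop;
         Close (F);
      end Preprocess_Sketch;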

  11. Joint source-channel coding with allpass filtering source shaping for image transmission over noisy channels

    NASA Astrophysics Data System (ADS)

    Cai, Jianfei; Chen, Chang W.

    2000-04-01

    In this paper, we propose a fixed-length robust joint source-channel coding (JSCC) scheme for image transmission over noisy channels. Three channel models are studied: binary symmetric channels (BSC) and additive white Gaussian noise (AWGN) channels for memoryless channels, and Gilbert-Elliott channels (GEC) for bursty channels. We derive, in this research, an explicit operational rate-distortion (R-D) function, which represents an end-to-end error measurement that includes errors due to both quantization and channel noise. In particular, we are able to incorporate the channel transition probability and channel bit error rate into the R-D function in the case of bursty channels. With the operational R-D function, bits are allocated not only among different subsources, but also between source coding and channel coding so that, under a fixed transmission rate, an optimum tradeoff between source coding accuracy and channel error protection can be achieved. This JSCC scheme is also integrated with allpass filtering source shaping to further improve the robustness against channel errors. Experimental results show that the proposed scheme can achieve not only high PSNR performance, but also excellent perceptual quality. Compared with the state-of-the-art JSCC schemes, this proposed scheme outperforms most of them especially when the channel mismatch occurs.
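
    The allocation step can be sketched as a search over rate splits: under a fixed total rate, try each division between source bits and channel-protection bits and keep the one minimizing end-to-end distortion. The two exponential distortion terms below are toy models, not the paper's operational R-D function.

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Numerics.Elementary_Functions; use Ada.Numerics.Elementary_Functions;

      procedure Allocation_Sketch is
         Total_Rate : constant Natural := 64;   --  bits per block
         Best_Split : Natural := 0;
         Best_Cost  : Float := Float'Last;
      begin
         for Source_Bits in 1 .. Total_Rate - 1 loop
            declare
               Channel_Bits : constant Natural := Total_Rate - Source_Bits;
               --  Toy terms: quantization error falls with source bits;
               --  residual channel error falls with protection bits.
               Quant_Error   : constant Float := Exp (-0.1 * Float (Source_Bits));
               Channel_Error : constant Float := Exp (-0.2 * Float (Channel_Bits));
               Cost          : constant Float := Quant_Error + Channel_Error;
            begin
               if Cost < Best_Cost then
                  Best_Cost  := Cost;
                  Best_Split := Source_Bits;
               end if;
            end;
         end loop;
         Put_Line ("Source bits:" & Natural'Image (Best_Split) &
                   "  Channel bits:" & Natural'Image (Total_Rate - Best_Split));
      end Allocation_Sketch;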

  12. Multiprocessor performance modeling with ADAS

    NASA Technical Reports Server (NTRS)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  13. AdaNET research plan

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1990-01-01

    The mission of the AdaNET research effort is to determine how to increase the availability of reusable Ada components and associated software engineering technology to both private and Federal sectors. The effort is structured to define the requirements for transfer of Federally developed software technology, study feasible approaches to meeting the requirements, and to gain experience in applying various technologies and practices. The overall approach to the development of the AdaNET System Specification is presented. A work breakdown structure is presented with each research activity described in detail. The deliverables for each work area are summarized. The overall organization and responsibilities for each research area are described. The schedule and necessary resources are presented for each research activity. The estimated cost is summarized for each activity. The project plan is fully described in the Super Project Expert data file contained on the floppy disk attached to the back cover of this plan.

  14. Ada (R) assessment: An important issue within European Columbus Support Technology Programme

    NASA Technical Reports Server (NTRS)

    Vielcanet, P.

    1986-01-01

    Software will be more important and more critical for Columbus than for any previous ESA project. As a simple comparison, overall software size has been in the range of 100 K source statements for EXOSAT and 500 K for Spacelab, and will probably reach several million lines of code for Columbus (all elements together). Based on past experience, the total development cost of software can account for about 10 to 15 percent of the total space project development cost. The Ada technology may support the strong software engineering principles needed for Columbus, provided that the technology is sufficiently mature and industry plans meet the Columbus project schedule. Over the past 3 years, Informatique Internationale has conducted a coherent program based on Ada technology assessment studies and experiments for ESA and CNES. This specific research and development program benefits from 15 years' experience in the field of space software development and is supported by the overall software engineering expertise of the company. The assessment and experiments of Ada software engineering by Informatique Internationale are detailed.

  15. A Comparison of Source Code Plagiarism Detection Engines

    NASA Astrophysics Data System (ADS)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
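
    The paired structural metric described above can be sketched in a few lines: tokenise both submissions, then find the longest common run of tokens by dynamic programming. The token streams below are invented placeholders for the output of a real lexer.

      with Ada.Text_IO; use Ada.Text_IO;

      procedure LCS_Sketch is
         type Token_Array is array (Positive range <>) of Natural;
         --  Pretend these are token streams from two student submissions.
         A : constant Token_Array := (1, 2, 3, 4, 5, 2, 3, 4);
         B : constant Token_Array := (9, 2, 3, 4, 5, 7);
         Table : array (0 .. A'Length, 0 .. B'Length) of Natural :=
           (others => (others => 0));
         Best : Natural := 0;
      begin
         --  Table (I, J) = length of the common substring ending at A (I), B (J).
         for I in 1 .. A'Length loop
            for J in 1 .. B'Length loop
               if A (I) = B (J) then
                  Table (I, J) := Table (I - 1, J - 1) + 1;
                  Best := Natural'Max (Best, Table (I, J));
               end if;
            end loop;
         end loop;
         Put_Line ("Longest common token run:" & Natural'Image (Best));
      end LCS_Sketch;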

  16. Continuation of research into language concepts for the mission support environment: Source code

    NASA Technical Reports Server (NTRS)

    Barton, Timothy J.; Ratner, Jeremiah M.

    1991-01-01

    Research into language concepts for the Mission Control Center is presented, together with the accompanying source code. The file contains the routines that allow source code files to be created and compiled. The build process assumes that all elements and the COMP exist in the current directory, and it places as much code generation as possible on the preprocessor. A summary is given of the source files as used and/or manipulated by the build routine.

  17. Technology Infusion of CodeSonar into the Space Network Ground Segment

    NASA Technical Reports Server (NTRS)

    Benson, Markland J.

    2009-01-01

    This slide presentation reviews the applicability of CodeSonar to the Space Network software. CodeSonar is a commercial off the shelf system that analyzes programs written in C, C++ or Ada for defects in the code. Software engineers use CodeSonar results as an input to the existing source code inspection process. The study is focused on large scale software developed using formal processes. The systems studied are mission critical in nature but some use commodity computer systems.

  18. Storage management in Ada. Three reports. Volume 1: Storage management in Ada as a risk to the development of reliable software. Volume 2: Relevant aspects of language. Volume 3: Requirements of the language versus manifestations of current implementations

    NASA Technical Reports Server (NTRS)

    Auty, David

    1988-01-01

    The risk to the development of reliable software derives from the use of a new language and from the potential use of new storage management techniques. With Ada and associated support software, there is a lack of established guidelines and procedures, drawn from experience and common usage, which assure reliable behavior. The risk is identified and clarified. In order to provide a framework for future consideration of dynamic storage management in Ada, a description of the relevant aspects of the language is presented in two sections: program data sources, and declaration and allocation in Ada. Storage-management characteristics of the Ada language and storage-management characteristics of Ada implementations are differentiated. Terms that are used are defined in a narrow and precise sense. The storage-management implications of the Ada language are described. The storage-management options available to the Ada implementor and the implications of the implementor's choice for the Ada programmer are also described.

  19. SOURCES: a code for calculating (alpha,n), spontaneous fission, and delayed neutron sources and spectra.

    PubMed

    Wilson, W B; Perry, R T; Charlton, W S; Parish, T A; Shores, E F

    2005-01-01

    SOURCES is a computer code that determines neutron production rates and spectra from (alpha,n) reactions, spontaneous fission and delayed neutron emission owing to the decay of radionuclides in homogeneous media, interface problems and three-region interface problems. The code is also capable of calculating the neutron production rates due to (alpha,n) reactions induced by a monoenergetic beam of alpha particles incident on a slab of target material. The (alpha,n) spectra are calculated using an assumed isotropic angular distribution in the centre-of-mass system with a library of 107 nuclide decay alpha-particle spectra, 24 sets of measured and/or evaluated (alpha,n) cross sections and product nuclide level branching fractions, and functional alpha particle stopping cross sections for Z < 106. Spontaneous fission sources and spectra are calculated with evaluated half-life, spontaneous fission branching and Watt spectrum parameters for 44 actinides. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron sources. It also provides an analysis of the contributions to that source by each nuclide in the problem. PMID:16381695
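
    As an illustration of the spontaneous fission piece, the sketch below evaluates a Watt spectrum shape, N(E) proportional to exp(-E/a) * sinh(sqrt(b*E)). The a and b parameters are illustrative values, not the evaluated parameters from the code's library.

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Numerics.Elementary_Functions; use Ada.Numerics.Elementary_Functions;

      procedure Watt_Sketch is
         A : constant Float := 1.0;   --  spectrum parameter a (MeV)
         B : constant Float := 2.0;   --  spectrum parameter b (1/MeV)

         --  Unnormalized Watt fission spectrum shape.
         function Watt (E : Float) return Float is
           (Exp (-E / A) * Sinh (Sqrt (B * E)));
      begin
         for I in 1 .. 5 loop
            declare
               E : constant Float := Float (I) * 0.5;  --  MeV
            begin
               Put_Line ("E =" & Float'Image (E) &
                         "  N(E) ~" & Float'Image (Watt (E)));
            end;
         end loop;
      end Watt_Sketch;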

  20. Can space station software be specified through Ada?

    NASA Technical Reports Server (NTRS)

    Knoebel, Arthur

    1987-01-01

    Programming of the space station is to be done in the Ada programming language. A breadboard of selected parts of the work package for Marshall Space Flight Center is to be built, and programming this small part will be a good testing ground for Ada. One coding of the upper levels of the design brings out several problems with top-down design when it is to be carried out strictly within the language. Ada is evaluated on the basis of this experience, and the points raised are compared with other experience as related in the literature. Rapid prototyping is another approach to the initial programming; several different types of prototypes are discussed, and compared with the art of specification. Some solutions are proposed and a number of recommendations presented.

  1. Software reuse issues affecting AdaNET

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1989-01-01

    The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to develop AdaNet. A three phased strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, and administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. The AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based. This would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.

  2. Renovating To Meet ADA Standards.

    ERIC Educational Resources Information Center

    Huber, Judy; Jones, Garry

    2003-01-01

    Using the examples of Owen D. Young School in Van Hornesville, New York, and the Tonawanda City school district in Buffalo, New York, describes how school planners should take the accessibility standards mandated by the Americans with Disabilities Act (ADA) into account when renovating. (EV)

  3. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
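
    For a geometric distribution, the optimal codes of this kind are the Golomb codes; the sketch below implements the power-of-two (Rice) special case, a unary quotient followed by K binary remainder bits. The parameter choice is illustrative, and the paper's construction for general parameters is not reproduced here.

      with Ada.Text_IO; use Ada.Text_IO;

      procedure Rice_Sketch is
         K : constant Natural := 2;
         M : constant Natural := 2 ** K;

         function Encode (N : Natural) return String is
            Q    : constant Natural := N / M;
            R    : Natural := N mod M;
            Code : String (1 .. Q + 1 + K);
         begin
            Code (1 .. Q) := (others => '1');         --  unary quotient
            Code (Q + 1) := '0';                      --  terminator
            for I in reverse Q + 2 .. Code'Last loop  --  binary remainder
               Code (I) := (if R mod 2 = 1 then '1' else '0');
               R := R / 2;
            end loop;
            return Code;
         end Encode;
      begin
         for N in 0 .. 5 loop
            Put_Line (Natural'Image (N) & " -> " & Encode (N));
         end loop;
      end Rice_Sketch;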

  4. FLOWTRAN-TF v1.2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  5. FLOWTRAN-TF v1.2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III.

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  6. HELIOS: A new open-source radiative transfer code

    NASA Astrophysics Data System (ADS)

    Malik, Matej; Grosheintz, Luc; Lukas Grimm, Simon; Mendonça, João; Kitzmann, Daniel; Heng, Kevin

    2015-12-01

    I present the new open-source code HELIOS, developed to accurately describe radiative transfer in a wide variety of irradiated atmospheres. We employ a one-dimensional multi-wavelength two-stream approach with scattering. Written in CUDA C++, HELIOS uses the GPU's potential for massive parallelization and is able to compute the TP-profile of an atmosphere in radiative equilibrium and the subsequent emission spectrum in a few minutes on a single computer (for 60 layers and 1000 wavelength bins). The required molecular opacities are obtained with the recently published code HELIOS-K [1], which calculates the line shapes from an input line list and resamples the numerous line-by-line data into a manageable k-distribution format. Based on simple equilibrium chemistry theory [2] we combine the k-distribution functions of the molecules H2O, CO2, CO & CH4 to generate a k-table, which we then employ in HELIOS. I present our results of the following: (i) various numerical tests, e.g. isothermal vs. non-isothermal treatment of layers; (ii) comparison of iteratively determined TP-profiles with their analytical parametric prescriptions [3] and of the corresponding spectra; (iii) benchmarks of TP-profiles & spectra for various elemental abundances; (iv) benchmarks of averaged TP-profiles & spectra for the exoplanets GJ1214b, HD189733b & HD209458b; (v) comparison with secondary eclipse data for HD189733b, XO-1b & Corot-2b. HELIOS is being developed, together with the dynamical core THOR and the chemistry solver VULCAN, in the group of Kevin Heng at the University of Bern as part of the Exoclimes Simulation Platform (ESP) [4], an open-source project aimed at providing community tools to model exoplanetary atmospheres. [1] Grimm & Heng 2015, arXiv:1503.03806. [2] Heng, Lyons & Tsai, arXiv:1506.05501; Heng & Lyons, arXiv:1507.01944. [3] e.g. Heng, Mendonca & Lee, 2014, ApJS, 215, 4H. [4] exoclime.net

  7. Ada--Programming Language of the Future.

    ERIC Educational Resources Information Center

    Rudd, David

    1983-01-01

    Ada is a programming language developed for the Department of Defense, with a registered trademark. It was named for Ada Augusta, coworker of Charles Babbage and the world's first programmer. The Department of Defense hopes to prevent variations and to establish Ada as a consistent, standardized language. (MNS)

  8. Modelling RF sources using 2-D PIC codes

    SciTech Connect

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWTs, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field ("port approximation"). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.

  9. Modelling RF sources using 2-D PIC codes

    SciTech Connect

    Eppley, K.R.

    1993-03-01

    In recent years, many types of RF sources have been successfully modelled using 2-D PIC codes. Both cross field devices (magnetrons, cross field amplifiers, etc.) and pencil beam devices (klystrons, gyrotrons, TWTs, lasertrons, etc.) have been simulated. All these devices involve the interaction of an electron beam with an RF circuit. For many applications, the RF structure may be approximated by an equivalent circuit, which appears in the simulation as a boundary condition on the electric field ("port approximation"). The drive term for the circuit is calculated from the energy transfer between beam and field in the drift space. For some applications it may be necessary to model the actual geometry of the structure, although this is more expensive. One problem not entirely solved is how to accurately model in 2-D the coupling to an external waveguide. Frequently this is approximated by a radial transmission line, but this sometimes yields incorrect results. We also discuss issues in modelling the cathode and injecting the beam into the PIC simulation.

  10. An Open Source Embedding Code for the Condensed Phase

    NASA Astrophysics Data System (ADS)

    Genova, Alessandro; Ceresoli, Davide; Krishtal, Alisa; Andreussi, Oliviero; Distasio, Robert; Pavanello, Michele

    Work from our group as well as others has shown that for many systems, such as molecular aggregates, liquids, and complex layered materials, subsystem Density-Functional Theory (DFT) is capable of immensely reducing the computational cost while providing better and more intuitive insight into the underlying physics. We developed a massively parallel implementation of subsystem DFT for the condensed phase in the open-source Quantum ESPRESSO software package. In this talk, we will discuss how we: (1) implemented a flexible parallel framework aimed at optimal load balancing; (2) simplified the solution of the electronic structure problem by allowing a fragment-specific sampling of the first Brillouin zone; (3) achieved enormous speedups by solving the electronic structure of each fragment in a unit cell smaller than the supersystem simulation cell, effectively introducing a fragment-specific basis set, with no deterioration of the fully periodic simulation. As of March 14, 2016, the code has been released and is available to the public.

  11. Using National Drug Codes and Drug Knowledge Bases to Organize Prescription Records from Multiple Sources

    PubMed Central

    Simonaitis, Linas; McDonald, Clement J

    2009-01-01

    Purpose: Pharmacy systems contain electronic prescription information needed for clinical care, decision support, performance measurements and research. The master files of most pharmacy systems include National Drug Codes (NDCs) as well as the local codes they use within their systems to identify the products they dispense. We sought to assess how well one could map the products dispensed by many pharmacies to clinically oriented codes via the mapping tables provided by Drug Knowledge Base (DKB) producers. Methods: We obtained a large sample of prescription records from seven different sources. These records either carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in, or associated with, our sample of prescription records. Results: Considering the total prescription volume, DKBs covered 93.0% to 99.8% of the product codes (15 comparisons) from three outpatient sources, and 77.4% to 97.0% (20 comparisons) from four inpatient sources. Among the inpatient sources, invented codes explained much of the noncoverage: from 36% to 94% (3 of 4 sources). Outpatient pharmacy sources invented codes rarely, in 0.11% to 0.21% of their total prescription volume; inpatient sources did so more commonly, in 1.7% to 7.4% of their prescription volume. The distribution of prescribed products is highly skewed: from 1.4% to 4.4% of codes account for 50% of the message volume; from 10.7% to 34.5% of codes account for 90% of the volume. Conclusion: DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources. PMID:19767382
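
    The coverage measurement reduces to set membership: what fraction of prescription volume carries a product code present in a DKB mapping table. A minimal sketch, with fabricated codes standing in for real NDCs:

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Containers.Indefinite_Hashed_Sets;
      with Ada.Strings.Hash;

      procedure Coverage_Sketch is
         package Code_Sets is new Ada.Containers.Indefinite_Hashed_Sets
           (Element_Type        => String,
            Hash                => Ada.Strings.Hash,
            Equivalent_Elements => "=");

         DKB : Code_Sets.Set;   --  product codes known to the knowledge base
         Prescriptions : constant array (1 .. 4) of String (1 .. 11) :=
           ("00071015523", "00071015523", "99999000001", "00074323202");
         Covered : Natural := 0;
      begin
         DKB.Insert ("00071015523");
         DKB.Insert ("00074323202");
         --  Count prescriptions whose product code the DKB table covers.
         for P of Prescriptions loop
            if DKB.Contains (P) then
               Covered := Covered + 1;
            end if;
         end loop;
         Put_Line ("Coverage:" &
                   Natural'Image (Covered * 100 / Prescriptions'Length) & "%");
      end Coverage_Sketch;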

  12. Ada and the rapid development lifecycle

    NASA Technical Reports Server (NTRS)

    Deforrest, Lloyd; Gref, Lynn

    1991-01-01

    JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System utilizing an incremental delivery methodology called the Rapid Development Methodology with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the Rapid Development Approach, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economies improved over, projects developed under more traditional software-intensive system implementation methodologies.

  13. BOISE RIVER STUDY IN ADA COUNTY IDAHO, 1978

    EPA Science Inventory

    The purpose of this study was to assess the impact of present point sources on the river and to obtain background information to develop effluent limitations for the City of Boise wastewater treatment facilities. The study was conducted on the Boise River (Ada County, ID) from L...

  14. A first collision source method for ATTILA, an unstructured tetrahedral mesh discrete ordinates code

    SciTech Connect

    Wareing, T.A.; Morel, J.E.; Parsons, D.K.

    1998-12-01

    A semi-analytic first collision source method is developed for the transport code, ATTILA, a three-dimensional, unstructured tetrahedral mesh, discrete-ordinates code. This first collision source method is intended to mitigate ray effects due to point sources. The method is third-order accurate, which is the same order of accuracy as the linear-discontinuous spatial differencing scheme used in ATTILA. Numerical results are provided to demonstrate the accuracy and efficiency of the first collision source method.
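
    In outline, the semi-analytic step computes the uncollided flux from the point source analytically and converts its first collisions into a distributed source for the mesh sweep. A minimal sketch with illustrative cross sections and source strength, not ATTILA's actual interfaces:

      with Ada.Text_IO; use Ada.Text_IO;
      with Ada.Numerics; use Ada.Numerics;
      with Ada.Numerics.Elementary_Functions; use Ada.Numerics.Elementary_Functions;

      procedure First_Collision_Sketch is
         S       : constant Float := 1.0E8;  --  point-source strength (n/s)
         Sigma_T : constant Float := 0.5;    --  total cross section (1/cm)
         Sigma_S : constant Float := 0.3;    --  scattering cross section (1/cm)

         --  Uncollided flux at distance R from the point source.
         function Uncollided (R : Float) return Float is
           (S * Exp (-Sigma_T * R) / (4.0 * Pi * R * R));
      begin
         --  First-collision source density at a few cell centres.
         for I in 1 .. 3 loop
            declare
               R : constant Float := Float (I) * 10.0;  --  cm
            begin
               Put_Line ("r =" & Float'Image (R) &
                         "  q =" & Float'Image (Sigma_S * Uncollided (R)));
            end;
         end loop;
      end First_Collision_Sketch;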

  15. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  16. Presenting an Alternative Source Code Plagiarism Detection Framework for Improving the Teaching and Learning of Programming

    ERIC Educational Resources Information Center

    Hattingh, Frederik; Buitendag, Albertus A. K.; van der Walt, Jacobus S.

    2013-01-01

    The transfer and teaching of programming and programming related skills has become increasingly difficult at the undergraduate level over the past years. This is partially due to the number of programming languages available as well as access to readily available source code over the Web. Source code plagiarism is common practice amongst many…

  17. Judge says ADA covers content of insurance products.

    PubMed

    1998-05-01

    A Federal judge in Chicago ruled that the Americans with Disabilities Act (ADA) applies to the content of insurance policies, and says that Mutual of Omaha broke the law by artificially capping AIDS-related medical benefits. Federal courts have been divided on the issue. U.S. District Court Judge Suzanne Conlon ruled against the insurance company in a case with two defendants, John Doe and [name removed]. Both Doe and [name removed] have health insurance policies with Mutual; Doe's benefits are capped at $100,000 and [name removed]'s policy is capped at $25,000. Both are nearing the policy limit with the cost of combination antiviral drugs. Doe and [name removed] charge that the caps violate the ADA and the Illinois Insurance Code because they target a specific disability without regard to sound actuarial practices. Mutual of Omaha contends that capping benefits for AIDS is no different from providing lesser coverage for mental conditions, a situation approved by the courts. Mutual also argues that the ADA regulates access to services, but not the substance of the services. It seems unlikely that Mutual can financially justify having two different dollar limits for the caps. The case is scheduled for September. Other cases supporting the view that insurance products fall under the ADA are listed. PMID:11365315

  18. QUEST/Ada (Query Utility Environment for Software Testing) of Ada: The development of a program analysis environment for Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1988-01-01

    A history of the Query Utility Environment for Software Testing (QUEST)/Ada is presented. A fairly comprehensive literature review which is targeted toward issues of Ada testing is given. The definition of the system structure and the high level interfaces are then presented. The design of the three major components is described. The QUEST/Ada IORL System Specifications to this point in time are included in the Appendix. A paper is also included in the appendix which gives statistical evidence of the validity of the test case generation approach which is being integrated into QUEST/Ada.

  19. Joint-source-channel coding scheme for scalable video-coding-based digital video broadcasting, second generation satellite broadcasting system

    NASA Astrophysics Data System (ADS)

    Seo, Kwang-Deok; Chi, Won Sup; Lee, In Ki; Chang, Dae-Ig

    2010-10-01

    We propose a joint-source-channel coding (JSCC) scheme that can provide and sustain high-quality video service despite deteriorated transmission channel conditions in the second-generation digital video broadcasting (DVB-S2) satellite broadcasting service. In particular, by combining the layered characteristics of SVC (scalable video coding) video with the robust channel coding capability of the LDPC (low-density parity check) codes employed in DVB-S2, a new concept of JSCC for digital satellite broadcasting is developed. Rain attenuation in high-frequency bands such as the Ka band is a major factor lowering link capacity in satellite broadcasting. Therefore, it is necessary to manage rain attenuation dynamically by adopting a JSCC scheme that can apply variable code rates to both source and channel coding. For this purpose, we develop a JSCC scheme combining SVC and LDPC, and demonstrate its performance through extensive simulations in which SVC-coded video is transmitted over various error-prone channels with AWGN (additive white Gaussian noise) patterns in DVB-S2 broadcasting service.

  20. ART/Ada design project, phase 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An Ada-Based Expert System Building Tool Design Research Project was conducted. The goal was to investigate various issues in the context of the design of an Ada-based expert system building tool. An attempt was made to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future projects. The current status of the project is described by introducing an operational prototype, ART/Ada. How the project was conducted is explained. The performance of the prototype is analyzed and compared with other related works. Future research directions are suggested.

  1. Multicode comparison of selected source-term computer codes

    SciTech Connect

    Hermann, O.W.; Parks, C.V.; Renier, J.P.; Roddy, J.W.; Ashline, R.C.; Wilson, W.B.; LaBauve, R.J.

    1989-04-01

    This report summarizes the results of a study to assess the predictive capabilities of three radionuclide inventory/depletion computer codes, ORIGEN2, ORIGEN-S, and CINDER-2. The task was accomplished through a series of comparisons of their output for several light-water reactor (LWR) models (i.e., verification). Of the five cases chosen, two modeled typical boiling-water reactors (BWR) at burnups of 27.5 and 40 GWd/MTU and two represented typical pressurized-water reactors (PWR) at burnups of 33 and 50 GWd/MTU. In the fifth case, identical input data were used for each of the codes to examine the results of decay only and to show differences in nuclear decay constants and decay heat rates. Comparisons were made for several different characteristics (mass, radioactivity, and decay heat rate) for 52 radionuclides and for nine decay periods ranging from 30 d to 10,000 years. Only fission products and actinides were considered. The results are presented in comparative-ratio tables for each of the characteristics, decay periods, and cases. A brief summary description of each of the codes has been included. Of the more than 21,000 individual comparisons made for the three codes (taken two at a time), nearly half (45%) agreed to within 1%, and an additional 17% fell within the range of 1 to 5%. Approximately 8% of the comparison results disagreed by more than 30%. However, relatively good agreement was obtained for most of the radionuclides that are expected to contribute the greatest impact to waste disposal. Even though some defects have been noted, each of the codes in the comparison appears to produce respectable results. 12 figs., 12 tabs.

  2. Ada training evaluation and recommendations from the Gamma Ray Observatory Ada Development Team

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Ada training experiences of the Gamma Ray Observatory Ada development team are related, and recommendations are made concerning future Ada training for software developers. Training methods are evaluated, deficiencies in the training program are noted, and a recommended approach, including course outline, time allocation, and reference materials, is offered.

  3. Joint source-channel coding: secured and progressive transmission of compressed medical images on the Internet.

    PubMed

    Babel, Marie; Parrein, Benoît; Déforges, Olivier; Normand, Nicolas; Guédon, Jean-Pierre; Coat, Véronique

    2008-06-01

    The joint source-channel coding system proposed in this paper has two aims: lossless compression with a progressive mode and the integrity of medical data, which takes into account the priorities of the image and the properties of a network with no guaranteed quality of service. In this context, the use of scalable coding, locally adapted resolution (LAR) and a discrete and exact Radon transform, known as the Mojette transform, meets this twofold requirement. In this paper, details of this joint coding implementation are provided as well as a performance evaluation with respect to the reference CALIC coding and to unequal error protection using Reed-Solomon codes. PMID:18289830

  4. A Line Source Shielding Code for Personal Computers.

    1990-12-22

    Version 00 LINEDOSE computes the gamma-ray dose from a pipe source modeled as a line. The pipe is assumed to be iron and has a concrete shield of arbitrary thickness. The calculation is made for eight source energies between 0.1 and 3.5 MeV.
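
    The report itself is not reproduced here, but the quantity LINEDOSE computes is classically written with the Sievert integral; for a line source of strength S_L per unit length at perpendicular distance h behind a slab of thickness t (a textbook formulation offered for orientation, not necessarily the code's exact algorithm):

        \[
        \dot{D}(P) \propto \frac{S_L}{4\pi h}\,\bigl[ F(\theta_1,\mu t) + F(\theta_2,\mu t) \bigr],
        \qquad
        F(\theta,\mu t) = \int_0^{\theta} e^{-\mu t \sec\theta'}\, d\theta',
        \]

    where θ1 and θ2 subtend the two ends of the pipe as seen from the dose point and μ is the attenuation coefficient of the concrete shield at the source energy; in practice a buildup factor multiplies this result.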

  5. Learning a multi-dimensional companding function for lossy source coding.

    PubMed

    Maeda, Shin-ichi; Ishii, Shin

    2009-09-01

    Although the importance of lossy source coding has been growing, a general and practical methodology for its design has not been fully established. The well-known vector quantization (VQ) can represent any fixed-length lossy source coding, but requires excessive computational resources. Companding vector quantization (CVQ) can reduce the complexity of non-structured VQ by replacing vector quantization with a set of scalar quantizations and can represent a wide class of practically useful VQs. Although an analytical derivation of optimal CVQ is difficult except for very limited cases, optimization using data samples can be performed instead. Here we propose a CVQ optimization method, which includes bit allocation by a newly derived distortion formula as a generalization of Bennett's formula, and test its validity. We applied the method to transform coding and compared the performance of our CVQ with those of Karhunen-Loève transformation (KLT)-based coding and non-structured VQ. As a consequence, we found that our trained CVQ outperforms not only KLT-based coding but also non-structured VQ in the case of high bit-rate coding of linear mixtures of uniform sources. We also found that trained CVQ even outperformed KLT-based coding in the low bit-rate coding of a Gaussian source. To highlight the advantages of our approach, we also discuss the degradation of non-structured VQ and the limitations of theoretical analyses which are valid only for high bit-rate coding. PMID:19556103
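
    As context for the bit-allocation step, the classical one-dimensional Bennett integral (which the paper generalizes to the multi-dimensional companding case; the generalized formula itself is not reproduced in this record) gives the high-rate distortion of an N-level companding scalar quantizer with point density λ(x) applied to a source with density p(x):

        \[
        D \approx \frac{1}{12 N^2} \int \frac{p(x)}{\lambda(x)^2}\, dx,
        \qquad \int \lambda(x)\, dx = 1 .
        \]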

  6. Proceedings of the 2nd NASA Ada User's Symposium

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Several presentations, mostly in viewgraph form, on various topics relating to Ada applications are given. Topics covered include the use of Ada in NASA, Ada and the Space Station, the software support environment, Ada in the Software Engineering Laboratory, Ada at the Jet Propulsion Laboratory, the Flight Telerobotic Servicer, and lessons learned in prototyping the Space Station Remote Manipulator System control.

  7. Geophysical analysis for the Ada Tepe region (Bulgaria) - case study

    NASA Astrophysics Data System (ADS)

    Trifonova, Petya; Metodiev, Metodi; Solakov, Dimcho; Simeonova, Stela; Vatseva, Rumiana

    2013-04-01

    According to current archeological investigations, Ada Tepe is the oldest gold mine in Europe, dating to the Late Bronze and Early Iron Age. It is a typical low-sulfidation epithermal gold deposit and is hosted in Maastrichtian-Paleocene sedimentary rocks above a detachment fault contact with underlying Paleozoic metamorphic rocks. Ada Tepe (25°39'E, 41°25'N) is located in the Eastern Rhodope unit. The region is highly segmented despite the low altitude (470-750 m), due to widespread volcanic and sedimentary rocks susceptible to torrential erosion during the cold season. Besides the thorough geological exploration focused on identifying cost-effective stocks of mineral resources, a detailed geophysical analysis concerning different stages of the gold extraction project was accomplished. We present the main results from the geophysical investigation aimed at clarifying the complex seismotectonic setting of the Ada Tepe site region. The overall study methodology consists of collecting, reviewing and estimating geophysical and seismological information to constrain the model used for seismic hazard assessment of the area. Geophysical information used in the present work consists of gravity, geomagnetic and seismological data. Interpretation of gravity data is applied to outline the axes of steep gravity transitions, marked as potential axes of faults, flexures and other dislocation structures. Direct inverse techniques are also utilized to estimate the form and depth of anomalous sources. For the purposes of the seismological investigation of the Ada Tepe site region, an earthquake catalogue was compiled for the period 510 BC - 2011 AD. Statistical parameters of seismicity - the annual seismic rate parameter and the b-value of the Gutenberg-Richter exponential relation - are estimated for the Ada Tepe site region. All geophysical datasets and derived results are integrated using GIS techniques ensuring interoperability of data when combining, processing and visualizing obtained

  8. Source Authentication for Code Dissemination Supporting Dynamic Packet Size in Wireless Sensor Networks

    PubMed Central

    Kim, Daehee; Kim, Dongwan; An, Sunshin

    2016-01-01

    Code dissemination in wireless sensor networks (WSNs) is a procedure for distributing a new code image over the air in order to update programs. Because WSNs are mostly deployed in unattended and hostile environments, secure code dissemination ensuring authenticity and integrity is essential. Recent work on dynamic packet size control in WSNs enhances the energy efficiency of code dissemination by dynamically changing the packet size on the basis of link quality. However, the authentication tokens attached by the base station become useless in the next hop, where the packet size can vary according to the link quality of that hop. In this paper, we propose three source authentication schemes for code dissemination supporting dynamic packet size. Compared to traditional source authentication schemes such as μTESLA and digital signatures, our schemes provide secure source authentication in an environment where the packet size changes at each hop, with smaller energy consumption. PMID:27409616

  10. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    SciTech Connect

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 μm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.

  11. Soft and Joint Source-Channel Decoding of Quasi-Arithmetic Codes

    NASA Astrophysics Data System (ADS)

    Guionnet, Thomas; Guillemot, Christine

    2004-12-01

    The issue of robust and joint source-channel decoding of quasi-arithmetic codes is addressed. Quasi-arithmetic coding is a reduced-precision, reduced-complexity implementation of arithmetic coding. This amounts to approximating the distribution of the source. The approximation of the source distribution leads to the introduction of redundancy that can be exploited for robust decoding in the presence of transmission errors. Hence, this approximation controls both the trade-off between compression efficiency and complexity and, at the same time, the redundancy (excess rate) introduced by this suboptimality. This paper first provides a state model of a quasi-arithmetic coder and decoder for binary and m-ary sources. The design of an error-resilient soft decoding algorithm follows quite naturally. The compression efficiency of quasi-arithmetic codes allows extra redundancy to be added in the form of markers designed specifically to prevent desynchronization. The algorithm is directly amenable to iterative source-channel decoding in the spirit of serial turbo codes. The coding and decoding algorithms have been tested for a wide range of channel signal-to-noise ratios (SNRs). Experimental results reveal improved symbol error rate (SER) and SNR performance against Huffman and optimal arithmetic codes.
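
    For reference, exact binary arithmetic coding updates the current interval [l, h) for each symbol s of a source with P(0) = p as

        \[
        [l, h) \mapsto
        \begin{cases}
        [\,l,\; l + p\,(h - l)\,) & s = 0,\\
        [\,l + p\,(h - l),\; h\,) & s = 1.
        \end{cases}
        \]

    A quasi-arithmetic coder constrains l and h to integers within a small fixed range, so the product p(h - l) is rounded; this rounding is the source-distribution approximation the abstract refers to, and it is what creates the exploitable redundancy. (A generic sketch; the paper's precise state model is given in the full text.)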

  12. Building guide : how to build Xyce from source code.

    SciTech Connect

    Keiter, Eric Richard; Russo, Thomas V.; Schiek, Richard Louis; Sholander, Peter E.; Thornquist, Heidi K.; Mei, Ting; Verley, Jason C.

    2013-08-01

    While Xyce uses the Autoconf and Automake system to configure builds, it is often necessary to perform more than the customary “./configure” build that many open source users have come to expect. This document describes the steps needed to get Xyce built on a number of common platforms.

  13. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third

  14. Vector-matrix-quaternion, array and arithmetic packages: All HAL/S functions implemented in Ada

    NASA Technical Reports Server (NTRS)

    Klumpp, Allan R.; Kwong, David D.

    1986-01-01

    The HAL/S avionics programmers have enjoyed a variety of tools built into a language tailored to their special requirements. Ada is designed for a broader group of applications. Rather than providing built-in tools, Ada provides the elements with which users can build their own. Standard avionic packages remain to be developed. These must enable programmers to code in Ada as they have coded in HAL/S. The packages under development at JPL will provide all of the vector-matrix, array, and arithmetic functions described in the HAL/S manuals. In addition, the linear algebra package will provide all of the quaternion functions used in Shuttle steering and Galileo attitude control. Furthermore, using Ada's extensibility, many quaternion functions are being implemented as infix operations; equivalent capabilities were never implemented in HAL/S because doing so would entail modifying the compiler and expanding the language. With these packages, many HAL/S expressions will compile and execute in Ada, unchanged. Others can be converted simply by replacing the implicit HAL/S multiply operator with the Ada *. Errors will be trapped and identified. Input/output will be convenient and readable.
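
    As an illustration of the approach (a minimal hypothetical sketch, not the JPL package itself), a quaternion type with an infix "*" operator can be declared in ordinary Ada:

      --  Sketch only: a quaternion record with the Hamilton product
      --  exposed as an infix operator, the mechanism by which HAL/S-style
      --  built-ins become ordinary Ada overloads.
      package Quaternions is
         type Quaternion is record
            W, X, Y, Z : Float;
         end record;
         function "*" (Left, Right : Quaternion) return Quaternion;
      end Quaternions;

      package body Quaternions is
         function "*" (Left, Right : Quaternion) return Quaternion is
         begin
            --  Hamilton product of Left and Right.
            return (W => Left.W * Right.W - Left.X * Right.X
                         - Left.Y * Right.Y - Left.Z * Right.Z,
                    X => Left.W * Right.X + Left.X * Right.W
                         + Left.Y * Right.Z - Left.Z * Right.Y,
                    Y => Left.W * Right.Y - Left.X * Right.Z
                         + Left.Y * Right.W + Left.Z * Right.X,
                    Z => Left.W * Right.Z + Left.X * Right.Y
                         - Left.Y * Right.X + Left.Z * Right.W);
         end "*";
      end Quaternions;

    With such a declaration, an expression like Q3 := Q1 * Q2; reads exactly as the corresponding HAL/S multiplication would.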

  15. Evaluating Open-Source Full-Text Search Engines for Matching ICD-10 Codes.

    PubMed

    Jurcău, Daniel-Alexandru; Stoicu-Tivadar, Vasile

    2016-01-01

    This research presents the results of evaluating multiple free, open-source engines on matching ICD-10 diagnostic codes via full-text searches. The study investigates what it takes to get an accurate match when searching for a specific diagnostic code. For each code the evaluation starts by extracting the words that make up its text and continues with building full-text search queries from the combinations of these words. The queries are then run against all the ICD-10 codes until a query returns the code in question as the match with the highest relative score. This method identifies the minimum number of words that must be provided in order for the search engines to choose the desired entry. The engines analyzed include a popular Java-based full-text search engine, a lightweight engine written in JavaScript which can even execute in the user's browser, and two popular open-source relational database management systems. PMID:27350484

  16. Experiences with Ada in an embedded system

    NASA Technical Reports Server (NTRS)

    Labaugh, Robert J.

    1988-01-01

    Recent experiences with using Ada in a real time environment are described. The application was the control system for an experimental robotic arm. The objectives of the effort were to experiment with developing embedded applications in Ada, evaluating the suitability of the language for the application, and determining the performance of the system. Additional objectives were to develop a control system based on the NASA/NBS Standard Reference Model for Telerobot Control System Architecture (NASREM) in Ada, and to experiment with the control laws and how to incorporate them into the NASREM architecture.

  17. Development of an Ada package library

    NASA Technical Reports Server (NTRS)

    Burton, Bruce; Broido, Michael

    1986-01-01

    A usable prototype Ada package library was developed and is currently being evaluated for use in large software development efforts. The library system is comprised of an Ada-oriented design language used to facilitate the collection of reuse information, a relational data base to store reuse information, a set of reusable Ada components and tools, and a set of guidelines governing the system's use. The prototyping exercise is discussed and the lessons learned from it have led to the definition of a comprehensive tool set to facilitate software reuse.

  18. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    NASA Astrophysics Data System (ADS)

    Etienne, Zachariah B.; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L.

    2015-09-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores.

  19. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.

  20. Ada programming guidelines for deterministic storage management

    NASA Technical Reports Server (NTRS)

    Auty, David

    1988-01-01

    Previous reports have established that a program can be written in the Ada language such that the program's storage management requirements are determinable prior to its execution. Specific guidelines for ensuring such deterministic usage of Ada dynamic storage requirements are described. Because requirements may vary from one application to another, guidelines are presented in a most-restrictive to least-restrictive fashion to allow the reader to match appropriate restrictions to the particular application area under investigation.
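
    A minimal sketch of the most restrictive style (illustrative, not taken from the report): all storage is sized at elaboration and the "new" allocator never appears, so the program's storage requirements are fixed before execution begins.

      --  Hypothetical example: a statically sized buffer in place of a
      --  heap-allocated structure.  No access types, no run-time growth.
      package Telemetry_Buffers is
         Max_Samples : constant := 1_024;
         type Sample_Index is range 1 .. Max_Samples;
         type Sample_Buffer is array (Sample_Index) of Float;

         Buffer : Sample_Buffer := (others => 0.0);
      end Telemetry_Buffers;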

  1. Parallel Ada benchmarks for the SVMS

    NASA Technical Reports Server (NTRS)

    Collard, Philippe E.

    1990-01-01

    The use of the parallel processing paradigm to design and develop faster and more reliable computers appears to mark the future of information processing. NASA started the development of such an architecture: the Spaceborne VHSIC Multi-processor System (SVMS). Ada will be one of the languages used to program the SVMS. One of the unique characteristics of Ada is that it supports parallel processing at the language level through the tasking constructs. It is important for the SVMS project team to assess how efficiently the SVMS architecture will be implemented, as well as how efficiently the Ada environment will be ported to the SVMS. AUTOCLASS II, a Bayesian classifier written in Common Lisp, was selected as one of the benchmarks for SVMS configurations. The purpose of the R and D effort was to provide the SVMS project team with a version of AUTOCLASS II, written in Ada, that made use of Ada tasking constructs as much as possible so as to constitute a suitable benchmark. Additionally, a set of programs was developed to measure Ada tasking efficiency on parallel architectures and to determine the critical parameters influencing tasking efficiency. All this was designed to provide the SVMS project team with a set of suitable tools for the development of the SVMS architecture.
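
    A minimal hypothetical sketch (not the AUTOCLASS II port) of the tasking construct whose overhead such benchmarks measure: a pool of worker tasks receiving work through a rendezvous.

      procedure Tasking_Demo is
         task type Worker is
            entry Process (Item : in Integer);
         end Worker;

         task body Worker is
         begin
            loop
               select
                  accept Process (Item : in Integer) do
                     null;  --  stand-in for the real computation on Item
                  end Process;
               or
                  terminate;
               end select;
            end loop;
         end Worker;

         Workers : array (1 .. 4) of Worker;
      begin
         for I in Workers'Range loop
            Workers (I).Process (I);  --  one rendezvous per worker
         end loop;
      end Tasking_Demo;

    Timing many such rendezvous against plain procedure calls is the usual way to isolate the tasking overhead the abstract refers to.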

  2. Source coding with escort distributions and Rényi entropy bounds

    NASA Astrophysics Data System (ADS)

    Bercher, J.-F.

    2009-08-01

    We discuss the relevance of escort distributions and Rényi entropy in the context of source coding. We first recall a source coding theorem by Campbell relating a generalized measure of length to the Rényi-Tsallis entropy. We show that the associated optimal codes can be obtained from considerations on escort distributions. We propose a new family of length measures involving escort distributions and show that these generalized lengths are also bounded below by the Rényi entropy. Furthermore, we find that the standard Shannon code lengths are optimal for the new generalized length measures, whatever the entropic index. Finally, we show that there exists in this setting an interplay between standard and escort distributions.
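
    For orientation, the objects involved are usually defined as follows (standard definitions; Campbell's bound is quoted in its customary form):

        \[
        P_i = \frac{p_i^{\,q}}{\sum_j p_j^{\,q}},
        \qquad
        H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\,\alpha},
        \]
        \[
        L(t) = \frac{1}{t} \log_b \sum_i p_i\, b^{\,t\, l_i} \;\ge\; H_\alpha(p),
        \qquad \alpha = \frac{1}{1+t},
        \]

    where P is the escort distribution of order q, H_α is the Rényi entropy, and L(t) is Campbell's exponentially weighted length of a code with word lengths l_i.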

  3. Toward the efficient implementation of expert systems in Ada

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1990-01-01

    Here, the authors describe Ada language issues encountered during the development of ART-Ada, an expert system tool for Ada deployment. ART-Ada is being used to implement several expert system applications for the Space Station Freedom and the U.S. Air Force. Additional information is given on dynamic memory allocation.

  4. Joint source-channel coding for wireless object-based video communications utilizing data hiding.

    PubMed

    Wang, Haohong; Tsaftaris, Sotirios A; Katsaggelos, Aggelos K

    2006-08-01

    In recent years, joint source-channel coding for multimedia communications has gained increased popularity. However, very limited work has been conducted to address the problem of joint source-channel coding for object-based video. In this paper, we propose a data hiding scheme that improves the error resilience of object-based video by adaptively embedding the shape and motion information into the texture data. Within a rate-distortion theoretical framework, the source coding, channel coding, data embedding, and decoder error concealment are jointly optimized based on knowledge of the transmission channel conditions. Our goal is to achieve the best video quality as expressed by the minimum total expected distortion. The optimization problem is solved using Lagrangian relaxation and dynamic programming. The performance of the proposed scheme is tested using simulations of a Rayleigh-fading wireless channel, and the algorithm is implemented based on the MPEG-4 verification model. Experimental results indicate that the proposed hybrid source-channel coding scheme significantly outperforms methods without data hiding or unequal error protection. PMID:16900673
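
    The optimization the abstract describes has the generic Lagrangian-relaxation form (a sketch of the method, not the paper's exact objective): the constrained problem

        \[
        \min \; E[D] \quad \text{s.t.} \quad R_{\mathrm{src}} + R_{\mathrm{ch}} \le R_{\max}
        \]

    is replaced by the unconstrained one

        \[
        J_\lambda = E[D] + \lambda \,\bigl( R_{\mathrm{src}} + R_{\mathrm{ch}} \bigr),
        \]

    with λ swept until the rate budget is met; dynamic programming then evaluates J_λ over the discrete choices of source coding, channel coding, and embedding parameters.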

  5. SOURCES 4C : a code for calculating ([alpha],n), spontaneous fission, and delayed neutron sources and spectra.

    SciTech Connect

    Wilson, W. B.; Perry, R. T.; Shores, E. F.; Charlton, W. S.; Parish, Theodore A.; Estes, G. P.; Brown, T. H.; Arthur, Edward D. ,; Bozoian, Michael; England, T. R.; Madland, D. G.; Stewart, J. E.

    2002-01-01

    SOURCES 4C is a computer code that determines neutron production rates and spectra from ({alpha},n) reactions, spontaneous fission, and delayed neutron emission due to radionuclide decay. The code is capable of calculating ({alpha},n) source rates and spectra in four types of problems: homogeneous media (i.e., an intimate mixture of {alpha}-emitting source material and low-Z target material), two-region interface problems (i.e., a slab of {alpha}-emitting source material in contact with a slab of low-Z target material), three-region interface problems (i.e., a thin slab of low-Z target material sandwiched between {alpha}-emitting source material and low-Z target material), and ({alpha},n) reactions induced by a monoenergetic beam of {alpha}-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 44 actinides. The ({alpha},n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 107 nuclide decay {alpha}-particle spectra, 24 sets of measured and/or evaluated ({alpha},n) cross sections and product nuclide level branching fractions, and functional {alpha}-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code provides the magnitude and spectra, if desired, of the resultant neutron source in addition to an analysis of the contributions by each nuclide in the problem. LASTCALL, a graphical user interface, is included in the code package.

  6. The ADA Library Kit: Sample ADA-Related Documents to Help You Implement the Law.

    ERIC Educational Resources Information Center

    Mayo, Kathleen, Ed.; O'Donnell, Ruth, Ed.

    The Association of Specialized and Cooperative Library Agencies (ASCLA) formed an Americans with Disabilities Act (ADA) Assembly in 1992, and one of its first projects was to prepare this publication by collecting samples of library-produced ADA-related documents. Its aim is to help libraries increase levels of compliance and public awareness. The…

  7. Ada Run Time Support Environments and a common APSE Interface Set. [Ada Programming Support Environment

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The paper discusses the importance of linking Ada Run Time Support Environments to the Common Ada Programming Support Environment (APSE) Interface Set (CAIS). A non-stop network operating systems scenario is presented to serve as a forum for identifying the important issues. The network operating system exemplifies the issues involved in the NASA Space Station data management system.

  8. Non-Uniform Contrast and Noise Correction for Coded Source Neutron Imaging

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R

    2012-01-01

    Since the first application of neutron radiography in the 1930s, the field of neutron radiography has matured enough to develop several applications. However, advances in the technology are far from concluded. In general, the resolution of scintillator-based detection systems is limited to the 10 μm range, and the relatively low neutron count rate of neutron sources compared to other illumination sources restricts time-resolved measurement. One path toward improved resolution is the use of magnification; however, to date neutron optics are inefficient, expensive, and difficult to develop. There is a clear demand for cost-effective scintillator-based neutron imaging systems that achieve resolutions of 1 μm or less. Such an imaging system would dramatically extend the applications of neutron imaging. For this purpose a coded source imaging system is under development. The current challenge is to reduce artifacts in the reconstructed coded source images. Artifacts are generated by non-uniform illumination of the source, gamma rays, dark current at the imaging sensor, and system noise from the reconstruction kernel. In this paper, we describe how to pre-process the coded signal to reduce noise and non-uniform illumination, and how to reconstruct the coded signal with three reconstruction methods: correlation, maximum likelihood estimation, and the algebraic reconstruction technique. We illustrate our results with experimental examples.
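
    A standard way to remove dark current and non-uniform source illumination before reconstruction (a common pre-processing step, not necessarily the authors' exact procedure) is the flat-field correction

        \[
        I_{\mathrm{corr}} = \frac{I_{\mathrm{raw}} - I_{\mathrm{dark}}}{I_{\mathrm{open}} - I_{\mathrm{dark}}},
        \]

    where I_dark is an image taken with the beam off and I_open an open-beam image with no object in place.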

  9. Neutron imaging with coded sources: new challenges and the implementation of a simultaneous iterative reconstruction technique

    SciTech Connect

    Santos-Villalobos, Hector J; Bingham, Philip R; Gregor, Jens

    2013-01-01

    The limitations in neutron flux and resolution (L/D) of current neutron imaging systems can be addressed with a Coded Source Imaging system with magnification (xCSI). More precisely, the multiple sources in an xCSI system can exceed the flux of a single pinhole system by several orders of magnitude, while maintaining a higher L/D with the small sources. Moreover, designing for an xCSI system reduces noise from neutron scattering, because the object is placed away from the detector to achieve magnification. However, xCSI systems are adversely affected by correlated noise such as non-uniform illumination of the neutron source, incorrect sampling of the coded radiograph, misalignment of the coded masks, mask transparency, and imperfections in the system point spread function (PSF). We argue that a model-based reconstruction algorithm can overcome these problems and describe the implementation of a Simultaneous Iterative Reconstruction Technique algorithm for coded sources. Design pitfalls that preclude a satisfactory reconstruction are documented.

  10. Preliminary study of coded-source-based neutron imaging at the CPHS

    NASA Astrophysics Data System (ADS)

    Li, Yuanji; Huang, Zhifeng; Chen, Zhiqiang; Kang, Kejun; Xiao, Yongshun; Wang, Xuewu; Wei, Jie; Loong, C.-K.

    2011-09-01

    A cold neutron radiography/tomography instrument is under construction at the Compact Pulsed Hadron Source (CPHS) at Tsinghua University, China. The neutron flux is so low that an acceptable neutron radiographic image requires a long exposure time in the single-hole imaging mode. The coded-source-based imaging technique helps increase the utilization of the neutron flux, reducing the exposure time without loss of spatial resolution while providing high signal-to-noise ratio (SNR) images. Here we report a preliminary study on the feasibility of the coded-source-based technique applied to cold neutron imaging with a low-brilliance neutron source at the CPHS. A proper coded aperture is designed to be used in the beamline instead of the single-hole aperture. Two image retrieval algorithms, the Wiener filter algorithm and the Richardson-Lucy algorithm, are evaluated using analytical and Monte Carlo simulations. The simulation results reveal that the coded source imaging technique is suitable for the CPHS, partially solving the problem of low neutron flux.

  11. An Ada inference engine for expert systems

    NASA Technical Reports Server (NTRS)

    Lavallee, David B.

    1986-01-01

    The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production rules. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and the creation of a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.
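
    Such a knowledge base maps naturally onto an Ada discriminated record (a hypothetical sketch, not the system's actual data structure):

      package Rule_Graph is
         Max_Children : constant := 8;
         type Node_Kind is (Fact, And_Node, Or_Node, Not_Node);
         type Node_Id is range 0 .. 1_000;
         No_Node : constant Node_Id := 0;
         type Child_List is array (1 .. Max_Children) of Node_Id;

         --  One node of the directed acyclic graph: either a fact with a
         --  truth value, a logical combination of children, or a negation.
         type Node (Kind : Node_Kind := Fact) is record
            case Kind is
               when Fact =>
                  Value : Boolean := False;
               when And_Node | Or_Node =>
                  Children : Child_List := (others => No_Node);
               when Not_Node =>
                  Operand : Node_Id := No_Node;
            end case;
         end record;
      end Rule_Graph;

    Forward chaining then propagates truth values up the graph from changed facts, while backward chaining walks down from a goal node.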

  12. SOURCES 4A: A Code for Calculating (alpha,n), Spontaneous Fission, and Delayed Neutron Sources and Spectra

    SciTech Connect

    Madland, D.G.; Arthur, E.D.; Estes, G.P.; Stewart, J.E.; Bozoian, M.; Perry, R.T.; Parish, T.A.; Brown, T.H.; England, T.R.; Wilson, W.B.; Charlton, W.S.

    1999-09-01

    SOURCES 4A is a computer code that determines neutron production rates and spectra from ({alpha},n) reactions, spontaneous fission, and delayed neutron emission due to the decay of radionuclides. The code is capable of calculating ({alpha},n) source rates and spectra in four types of problems: homogeneous media (i.e., a mixture of {alpha}-emitting source material and low-Z target material), two-region interface problems (i.e., a slab of {alpha}-emitting source material in contact with a slab of low-Z target material), three-region interface problems (i.e., a thin slab of low-Z target material sandwiched between {alpha}-emitting source material and low-Z target material), and ({alpha},n) reactions induced by a monoenergetic beam of {alpha}-particles incident on a slab of target material. Spontaneous fission spectra are calculated with evaluated half-life, spontaneous fission branching, and Watt spectrum parameters for 43 actinides. The ({alpha},n) spectra are calculated using an assumed isotropic angular distribution in the center-of-mass system with a library of 89 nuclide decay {alpha}-particle spectra, 24 sets of measured and/or evaluated ({alpha},n) cross sections and product nuclide level branching fractions, and functional {alpha}-particle stopping cross sections for Z < 106. The delayed neutron spectra are taken from an evaluated library of 105 precursors. The code outputs the magnitude and spectra of the resultant neutron source. It also provides an analysis of the contributions to that source by each nuclide in the problem.

  13. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

  14. 3-D localization of gamma ray sources with coded apertures for medical applications

    NASA Astrophysics Data System (ADS)

    Kaissas, I.; Papadimitropoulos, C.; Karafasoulis, K.; Potiriadis, C.; Lambropoulos, C. P.

    2015-09-01

    Several small gamma cameras for radioguided surgery using CdTe or CdZnTe have parallel or pinhole collimators. Coded aperture imaging is a well-known method for gamma ray source directional identification, applied mainly in astrophysics. The increase in efficiency due to the substitution of the collimators by coded masks renders the method attractive for gamma probes used in radioguided surgery. We have constructed and operationally verified a setup consisting of two CdTe gamma cameras with Modified Uniform Redundant Array (MURA) coded aperture masks of rank 7 and 19 and a video camera. The 3-D position of point-like radioactive sources is estimated via triangulation using decoded images acquired by the gamma cameras. We have also developed code for both fast and detailed simulations, and we have verified the agreement between experimental results and simulations. In this paper we present a simulation study for the spatial localization of two point sources using coded aperture masks with rank 7 and 19.
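
    The triangulation step can be stated compactly (generic two-view triangulation; the paper's exact estimator is in the full text): with camera positions p_1, p_2 and decoded source directions d_1, d_2,

        \[
        (\hat t_1, \hat t_2) = \arg\min_{t_1, t_2} \bigl\lVert (\mathbf{p}_1 + t_1 \mathbf{d}_1) - (\mathbf{p}_2 + t_2 \mathbf{d}_2) \bigr\rVert^2,
        \qquad
        \hat{\mathbf{s}} = \tfrac{1}{2}\bigl[(\mathbf{p}_1 + \hat t_1 \mathbf{d}_1) + (\mathbf{p}_2 + \hat t_2 \mathbf{d}_2)\bigr],
        \]

    i.e., the estimated 3-D source position is the midpoint of the shortest segment joining the two viewing rays.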

  15. Proceedings of the First NASA Ada Users' Symposium

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Ada has the potential to be a part of the most significant change in software engineering technology within NASA in the last twenty years. Thus, it is particularly important that all NASA centers be aware of Ada experience and plans at other centers. Ada activities across NASA are covered, with presenters representing five of the nine major NASA centers and the Space Station Freedom Program Office. Projects discussed included - Space Station Freedom Program Office: the implications of Ada on training, reuse, management and the software support environment; Johnson Space Center (JSC): early experience with the use of Ada, software engineering and Ada training and the evaluation of Ada compilers; Marshall Space Flight Center (MSFC): university research with Ada and the application of Ada to Space Station Freedom, the Orbital Maneuvering Vehicle, the Aero-Assist Flight Experiment and the Secure Shuttle Data System; Lewis Research Center (LeRC): the evolution of Ada software to support the Space Station Power Management and Distribution System; Jet Propulsion Laboratory (JPL): the creation of a centralized Ada development laboratory and current applications of Ada including the Real-time Weather Processor for the FAA; and Goddard Space Flight Center (GSFC): experiences with Ada in the Flight Dynamics Division and the Extreme Ultraviolet Explorer (EUVE) project and the implications of GSFC experience for Ada use in NASA. Despite the diversity of the presentations, several common themes emerged from the program: Methodology - NASA experience in general indicates that the effective use of Ada requires modern software engineering methodologies; Training - It is the software engineering principles and methods that surround Ada, rather than Ada itself, that require the major training effort; Reuse - Due to training and transition costs, the use of Ada may initially actually decrease productivity, as was clearly found at GSFC; and real-time work at LeRC, JPL and GSFC shows

  16. 76 FR 79065 - Recordkeeping and Reporting Requirements Under Title VII, the ADA and GINA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-21

    ... Employment Opportunity Commission 29 CFR Part 1602 Recordkeeping and Reporting Requirements Under Title VII, the ADA and GINA CFR Correction In Title 29 of the Code of Federal Regulations, Parts 900 to 1899, revised as of July 1, 2011, in Part 1602, remove the words ``section 709(c) of title VII or section 107...

  17. Detection and Location of Gamma-Ray Sources with a Modulating Coded Mask

    SciTech Connect

    Anderson, Dale N.; Stromswold, David C.; Wunschel, Sharon C.; Peurrung, Anthony J.; Hansen, Randy R.

    2006-01-31

    This paper presents methods of detecting and locating a concealed nuclear gamma-ray source with a coded aperture mask. Energetic gamma rays readily penetrate moderate amounts of shielding material and can be detected at distances of many meters. The detection of high energy gamma-ray sources is vitally important to national security for several reasons, including nuclear materials smuggling interdiction, monitoring weapon components under treaties, and locating nuclear weapons and materials in the possession of terrorist organizations.

  18. Study of an External Neutron Source for an Accelerator-Driven System using the PHITS Code

    SciTech Connect

    Sugawara, Takanori; Iwasaki, Tomohiko; Chiba, Takashi

    2005-05-24

    A code system for the Accelerator Driven System (ADS) has been under development for analyzing the dynamic behavior of a subcritical core coupled with an accelerator. This code system, named DSE (Dynamics calculation code system for a Subcritical system with an External neutron source), consists of an accelerator part and a reactor part. The accelerator part employs a database, calculated with PHITS, for investigating accelerator-related effects such as changes in beam energy, beam diameter, void generation, and target level. This analysis method based on the database may introduce some errors into dynamics calculations, since the neutron source data derived from the database carry fitting and interpolation errors. In this study, the effects of various events are investigated to confirm that the database-based method is appropriate.

  19. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  20. VizieR Online Data Catalog: Transiting planets search Matlab/Octave source code (Ofir+, 2014)

    NASA Astrophysics Data System (ADS)

    Ofir, A.

    2014-01-01

    The Matlab/Octave source code for Optimal BLS is made available here. Detailed descriptions of all inputs and outputs are given by comment lines in the file. Note: Octave does not currently support parallel for loops ("parfor"). Octave users therefore need to change the "parfor" command (line 217 of OptimalBLS.m) to "for". (7 data files).

  1. A database management capability for Ada

    NASA Technical Reports Server (NTRS)

    Chan, Arvola; Danberg, SY; Fox, Stephen; Landers, Terry; Nori, Anil; Smith, John M.

    1986-01-01

    The data requirements of mission critical defense systems have been increasing dramatically. Command and control, intelligence, logistics, and even weapons systems are being required to integrate, process, and share ever increasing volumes of information. To meet this need, systems are now being specified that incorporate data base management subsystems for handling storage and retrieval of information. It is expected that a large number of the next generation of mission critical systems will contain embedded data base management systems. Since the use of Ada has been mandated for most of these systems, it is important to address the issues of providing data base management capabilities that can be closely coupled with Ada. A comprehensive distributed data base management project has been investigated. The key deliverables of this project are three closely related prototype systems implemented in Ada. These three systems are discussed.

  2. Knowledge representation into Ada parallel processing

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Babikyan, Carol; Harper, Richard

    1990-01-01

    The Knowledge Representation into Ada Parallel Processing project is a joint NASA and Air Force funded project to demonstrate the execution of intelligent systems in Ada on the Charles Stark Draper Laboratory fault-tolerant parallel processor (FTPP). Two applications were demonstrated - a portion of the adaptive tactical navigator and a real time controller. Both systems are implemented as Activation Framework Objects on the Activation Framework intelligent scheduling mechanism developed by Worcester Polytechnic Institute. The implementations, results of performance analyses showing speedup due to parallelism and initial efficiency improvements are detailed and further areas for performance improvements are suggested.

  3. Phase 1 Validation Testing and Simulation for the WEC-Sim Open Source Code

    NASA Astrophysics Data System (ADS)

    Ruehl, K.; Michelen, C.; Gunawan, B.; Bosma, B.; Simmons, A.; Lomonaco, P.

    2015-12-01

    WEC-Sim is an open source code to model wave energy converter performance in operational waves, developed by Sandia and NREL and funded by the US DOE. The code is a time-domain modeling tool developed in MATLAB/SIMULINK using the multibody dynamics solver SimMechanics, and solves the WEC's governing equations of motion using the Cummins time-domain impulse response formulation in 6 degrees of freedom. The WEC-Sim code has undergone verification through code-to-code comparisons; however, validation of the code has been limited to publicly available experimental data sets. While these data sets provide preliminary code validation, the experimental tests were not explicitly designed for code validation, and as a result are limited in their ability to validate the full functionality of the WEC-Sim code. Therefore, dedicated physical model tests for WEC-Sim validation have been performed. This presentation provides an overview of the WEC-Sim validation experimental wave tank tests performed at Oregon State University's Directional Wave Basin at the Hinsdale Wave Research Laboratory. Phase 1 of experimental testing was focused on device characterization and completed in Fall 2015. Phase 2 is focused on WEC performance and scheduled for Winter 2015/2016. These experimental tests were designed explicitly to validate the performance of the WEC-Sim code and its new feature additions. Upon completion, the WEC-Sim validation data set will be made publicly available to the wave energy community. For the physical model test, a controllable model of a floating wave energy converter has been designed and constructed. The instrumentation includes state-of-the-art devices to measure pressure fields, motions in 6 DOF, multi-axial load cells, torque transducers, position transducers, and encoders. The model also incorporates a fully programmable Power-Take-Off system which can be used to generate or absorb wave energy. Numerical simulations of the experiments using WEC-Sim will be

  4. Ada in AI or AI in Ada. On developing a rationale for integration

    NASA Technical Reports Server (NTRS)

    Collard, Philippe E.; Goforth, Andre

    1988-01-01

    The use of Ada as an Artificial Intelligence (AI) language is gaining interest in the NASA community, i.e., among parties who need to deploy Knowledge-Based Systems (KBS) compatible with the use of Ada as the software standard for the Space Station. A fair number of KBS and pseudo-KBS implementations in Ada exist today. Currently, no widely used guidelines exist to compare and evaluate these with one another. The lack of guidelines illustrates a fundamental problem inherent in trying to compare and evaluate implementations of any sort in languages that are procedural or imperative in style, such as Ada, with those in languages that are functional in style, such as Lisp. Discussed are the strengths and weaknesses of using Ada as an AI language, and a preliminary analysis is provided of factors needed for the development of criteria for the integration of these two families of languages and the environments in which they are implemented. The intent in developing such criteria is to have a logical rationale that may be used to guide the development of Ada tools and methodology to support KBS requirements, and to identify those AI technology components that may most readily and effectively be deployed in Ada.

  5. Translating an AI application from Lisp to Ada: A case study

    NASA Technical Reports Server (NTRS)

    Davis, Gloria J.

    1991-01-01

    A set of benchmarks was developed to test the performance of a newly designed computer executing both Lisp and Ada. Among these was AutoClassII -- a large Artificial Intelligence (AI) application written in Common Lisp. The extraction of a representative subset of this complex application was aided by a Lisp Code Analyzer (LCA). The LCA enabled rapid analysis of the code, putting it in a concise and functionally readable form. An equivalent benchmark was created in Ada through manual translation of the Lisp version. A comparison of the execution results of both programs across a variety of compiler-machine combinations indicates that line-by-line translation coupled with analysis of the initial code can produce relatively efficient and reusable target code.

  6. A novel Multi-Agent Ada-Boost algorithm for predicting protein structural class with the information of protein secondary structure.

    PubMed

    Fan, Ming; Zheng, Bin; Li, Lihua

    2015-10-01

    Knowledge of the structural class of a given protein is important for understanding its folding patterns. Although much effort has been made, predicting a protein's structural class solely from its sequence remains a challenging problem. Feature extraction and classification are the main problems in prediction. In this research, we extended our earlier work regarding these two aspects. For protein feature extraction, we proposed a scheme that calculates word frequency and word position from sequences of amino acids, reduced amino acids, and secondary structure. For accurate classification of protein structural class, we developed a novel Multi-Agent Ada-Boost (MA-Ada) method by integrating the features of a Multi-Agent system into the Ada-Boost algorithm. Extensive experiments were conducted to test and compare the proposed method using four low-homology benchmark datasets. The results showed classification accuracies of 88.5%, 96.0%, 88.4%, and 85.5%, respectively, which are much better than those of existing methods. The source code and dataset are available on request. PMID:26350693

  7. A coded-aperture technique allowing x-ray phase contrast imaging with conventional sources

    SciTech Connect

    Olivo, Alessandro; Speller, Robert

    2007-08-13

    Phase contrast imaging (PCI) solves the basic limitation of x-ray imaging, i.e., poor image contrast resulting from small absorption differences. Up to now, it has been mostly limited to synchrotron radiation facilities, due to the stringent requirements on the x-ray source and detectors, and only one technique was shown to provide PCI images with conventional sources, but with limits in practical implementation. The authors propose a different approach, based on coded apertures, which provides high PCI signals with conventional sources and detectors and imposes practically no applicability limits. They expect this method to lay the basis for a widespread diffusion of PCI.

  8. Gamma ray observatory dynamics simulator in Ada (GRODY)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This experiment involved the parallel development of dynamics simulators for the Gamma Ray Observatory in both FORTRAN and Ada for the purpose of evaluating the applicability of Ada to the NASA/Goddard Space Flight Center's flight dynamics environment. The experiment successfully demonstrated that Ada is a viable, valuable technology for use in this environment. In addition to building a simulator, the Ada team evaluated training approaches, developed an Ada methodology appropriate to the flight dynamics environment, and established a baseline for evaluating future Ada projects.

  9. The Courts, the ADA, and the Academy

    ERIC Educational Resources Information Center

    Cope, David D.

    2005-01-01

    Litigation influences what goes on in the classroom. The Americans with Disabilities Act (ADA), other statutes, and legal precedent have defined reasonable restrictions on what qualifies as a handicap. Still, universities tend to go overboard--out of ignorance, and influenced by a culture that seems to champion every conceivable victim--in…

  10. AdaNET prototype library administration manual

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    The functions of the AdaNET Prototype Library of Reusable Software Parts are described. Adopted from the Navy Research Laboratory's Reusability Guidebook (V.5.0), this is a working document customized for use by the AdaNET Project. Within this document, the term part is used to denote the smallest unit controlled by a library and retrievable from it. A part may have several constituents, which may not be individually tracked. Presented are the types of parts which may be stored in the library and the relationships among those parts; a concept of trust indicators which provide measures of confidence that a user of a previously developed part may reasonably apply to a part for a new application; search and retrieval, configuration management, and communications among those who interact with the AdaNET Prototype Library; and the AdaNET Prototype, described from the perspective of its three major users: the part reuser and retriever, the part submitter, and the librarian and/or administrator.

  11. Is Your Queuing System ADA-Compliant?

    ERIC Educational Resources Information Center

    Lawrence, David

    2002-01-01

    Discusses the Americans with Disabilities Act (ADA) and Uniform Federal Accessibility Standards (UFAS) regulations regarding public facilities' crowd control stanchions and queuing systems. The major elements are protruding objects and wheelchair accessibility. Describes how to maintain compliance with the regulations and offers a list of additional…

  12. Computer vision for detecting and quantifying gamma-ray sources in coded-aperture images

    SciTech Connect

    Schaich, P.C.; Clark, G.A.; Sengupta, S.K.; Ziock, K.P.

    1994-11-02

    The authors report the development of an automatic image analysis system that detects gamma-ray source regions in images obtained from a coded-aperture gamma-ray imager. The number of gamma sources in the image is not known prior to analysis. The system counts the number (K) of gamma sources detected in the image and estimates the lower bound for the probability that the number of sources in the image is K. The system consists of a two-stage pattern classification scheme in which the Probabilistic Neural Network is used in the supervised learning mode. The algorithms were developed and tested using real gamma-ray images from controlled experiments in which the number and location of depleted uranium source disks in the scene are known.
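
    The trained two-stage system is not reproduced here; below is a minimal sketch of the second-stage idea only -- a Probabilistic Neural Network, i.e., a Parzen-window classifier -- with invented region features and an arbitrary smoothing parameter.

      # Sketch: Probabilistic Neural Network (Parzen-window) classification.
      import numpy as np

      def pnn_classify(x, train_X, train_y, sigma=0.5):
          # For each class, average Gaussian kernels centered on its training
          # exemplars; return the class with the largest estimated density.
          scores = {}
          for c in np.unique(train_y):
              pts = train_X[train_y == c]
              d2 = np.sum((pts - x) ** 2, axis=1)
              scores[c] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
          return max(scores, key=scores.get)

      # Invented features per candidate region: (peak intensity, region size).
      train_X = np.array([[0.9, 3.0], [1.1, 2.8], [0.1, 0.5], [0.2, 0.4]])
      train_y = np.array([1, 1, 0, 0])  # 1 = gamma source region, 0 = background
      print(pnn_classify(np.array([1.0, 2.9]), train_X, train_y))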

  13. Instance transfer learning with multisource dynamic TrAdaBoost.

    PubMed

    Zhang, Qian; Li, Haigang; Zhang, Yong; Li, Ming

    2014-01-01

    Because transfer learning can employ knowledge from related domains to help the learning tasks in the current target domain, it offers advantages over traditional learning in reduced learning cost and improved learning efficiency. Focusing on the situation where sample data from the transfer source domain and the target domain have similar distributions, an instance transfer learning method based on multisource dynamic TrAdaBoost is proposed in this paper. In this method, knowledge from multiple source domains is used to avoid negative transfer; furthermore, the information that is conducive to target task learning is obtained to train candidate classifiers. The theoretical analysis suggests that the proposed algorithm improves the rate at which weight entropy drifts from source to target instances by means of the added dynamic factor, and that its classification effectiveness is better than single-source transfer. Finally, experimental results show that the proposed algorithm has higher classification accuracy. PMID:25152906
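
    As a rough illustration of the instance-weighting idea only (the paper's multisource aggregation and dynamic factor are not reproduced), one round of a classical TrAdaBoost-style update shrinks the weights of misclassified source instances while growing those of misclassified target instances:

      # Sketch: one round of a TrAdaBoost-style instance-weight update.
      import numpy as np

      def tradaboost_round(w_src, w_tgt, err_src, err_tgt, n_rounds=10):
          # err_*: 0/1 arrays, 1 where the weak learner misclassified.
          n = len(w_src)
          eps = np.sum(w_tgt * err_tgt) / np.sum(w_tgt)   # weighted target error
          eps = min(max(eps, 1e-10), 0.499)
          beta_t = eps / (1 - eps)                        # target update factor
          beta_s = 1 / (1 + np.sqrt(2 * np.log(n) / n_rounds))
          w_src = w_src * beta_s ** err_src    # downweight bad source instances
          w_tgt = w_tgt * beta_t ** -err_tgt   # upweight hard target instances
          return w_src, w_tgt

      w_src, w_tgt = np.ones(5) / 10, np.ones(5) / 10
      err_src = np.array([1, 0, 0, 1, 0])
      err_tgt = np.array([0, 1, 0, 0, 0])
      print(tradaboost_round(w_src, w_tgt, err_src, err_tgt))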

  14. Programming fault-tolerant distributed systems in Ada

    NASA Technical Reports Server (NTRS)

    Voigt, Susan J.

    1985-01-01

    Viewgraphs on the topic of programming fault-tolerant distributed systems in the Ada programming language are presented. Topics covered include project goals, Ada difficulties and solutions, testbed requirements, and virtual processors.

  15. HELIOS-R: An Ultrafast, Open-Source Retrieval Code For Exoplanetary Atmosphere Characterization

    NASA Astrophysics Data System (ADS)

    LAVIE, Baptiste

    2015-12-01

    Atmospheric retrieval is a growing, new approach in the theory of exoplanet atmosphere characterization. Unlike self-consistent modeling, it allows us to fully explore the parameter space, as well as the degeneracies between the parameters, using a Bayesian framework. We present HELIOS-R, a very fast retrieval code written in Python and optimized for GPU computation. Once it is ready, HELIOS-R will be the first open-source atmospheric retrieval code accessible to the exoplanet community. As the new generation of direct imaging instruments (SPHERE, GPI) have started to gather data, the first version of HELIOS-R focuses on emission spectra. We use a 1D two-stream forward model for computing fluxes and couple it to an analytical temperature-pressure profile that is constructed to be in radiative equilibrium. We use our ultra-fast opacity calculator HELIOS-K (also open-source) to compute the opacities of CO2, H2O, CO and CH4 from the HITEMP database. We test both opacity sampling (which is typically used by other workers) and the method of k-distributions. Using this setup, we compute a grid of synthetic spectra and temperature-pressure profiles, which is then explored using a nested sampling algorithm. By focusing on model selection (Occam’s razor) through the explicit computation of the Bayesian evidence, nested sampling allows us to deal with current sparse data as well as upcoming high-resolution observations. Once the best model is selected, HELIOS-R provides posterior distributions of the parameters. As a test for our code, we studied the HR 8799 system and compared our results with the previous analysis of Lee, Heng & Irwin (2013), which used the proprietary NEMESIS retrieval code. HELIOS-R and HELIOS-K are part of the set of open-source community codes we named the Exoclimes Simulation Platform (www.exoclime.org).
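
    HELIOS-R couples a full radiative-transfer forward model to the sampler; purely as a sketch of the nested-sampling step, here is a toy one-parameter "retrieval" using the dynesty package (an assumed choice -- the abstract does not name a sampler implementation), with an invented forward model, data, and prior:

      # Sketch: nested sampling for a toy one-parameter retrieval (dynesty).
      import numpy as np
      import dynesty

      obs = np.array([1.0, 0.9, 1.1])        # invented "observed spectrum"

      def model(ab):                         # trivial stand-in forward model
          return ab * np.ones(3)

      def loglike(theta):
          resid = obs - model(theta[0])
          return -0.5 * np.sum(resid ** 2 / 0.1 ** 2)

      def prior_transform(u):
          return 2.0 * u                     # uniform prior on [0, 2]

      sampler = dynesty.NestedSampler(loglike, prior_transform, ndim=1)
      sampler.run_nested(print_progress=False)
      print("log-evidence:", sampler.results.logz[-1])  # drives model selection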

  16. An Adaptive Source-Channel Coding with Feedback for Progressive Transmission of Medical Images

    PubMed Central

    Lo, Jen-Lung; Sanei, Saeid; Nazarpour, Kianoush

    2009-01-01

    A novel adaptive source-channel coding with feedback for progressive transmission of medical images is proposed here. In the source coding part, the transmission starts from the region of interest (RoI). The parity length in the channel code varies with respect to both the proximity of the image subblock to the RoI and the channel noise, which is iteratively estimated in the receiver. The overall transmitted data can be controlled by the user (clinician). In the case of medical data transmission, it is vital to keep the distortion level under control, as in most cases certain clinically important regions have to be transmitted without any visible error. The proposed system significantly reduces the transmission time and error. Moreover, the system is very user friendly, since the selection of the RoI, its size, the overall code rate, and a number of test features such as noise level can be set by the users at both ends. A MATLAB-based TCP/IP connection has been established to demonstrate the proposed interactive and adaptive progressive transmission system. The proposed system is simulated for both the binary symmetric channel (BSC) and the Rayleigh channel. The experimental results verify the effectiveness of the design. PMID:19190770
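
    The abstract does not give the exact rate-allocation rule, so the sketch below merely illustrates the stated behavior -- parity length grows with the estimated channel noise and shrinks with distance from the RoI -- using an invented linear rule and invented constants:

      # Sketch: parity bytes per subblock vs. RoI distance and channel noise
      # (the linear rule and all constants are assumptions for illustration).
      def parity_length(dist_to_roi, noise_est,
                        base=4, max_parity=32, noise_gain=64.0, decay=0.5):
          p = base + noise_gain * noise_est - decay * dist_to_roi
          return int(min(max(p, 0), max_parity))

      for d in (0, 4, 8):               # subblock distance from the RoI
          for ber in (0.001, 0.05):     # receiver's bit-error-rate estimate
              print(d, ber, parity_length(d, ber))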

  18. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    SciTech Connect

    Carbajo, J.J.

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  19. The FORTRAN static source code analyzer program (SAP) user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Decker, W.; Taylor, W.; Eslinger, S.

    1982-01-01

    The FORTRAN Static Source Code Analyzer Program (SAP) User's Guide (Revision 1) is presented. SAP is a software tool designed to assist Software Engineering Laboratory (SEL) personnel in conducting studies of FORTRAN programs. SAP scans FORTRAN source code and produces reports that present statistics and measures of statements and structures that make up a module. This document is a revision of the previous SAP user's guide, Computer Sciences Corporation document CSC/TM-78/6045. SAP Revision 1 is the result of program modifications to provide several new reports, additional complexity analysis, and recognition of all statements described in the FORTRAN 77 standard. This document provides instructions for operating SAP and contains information useful in interpreting SAP output.

  20. Joint source/channel coding for image transmission with JPEG2000 over memoryless channels.

    PubMed

    Wu, Zhenyu; Bilgin, Ali; Marcellin, Michael W

    2005-08-01

    The high compression efficiency and various features provided by JPEG2000 make it attractive for image transmission purposes. A novel joint source/channel coding scheme tailored for JPEG2000 is proposed in this paper to minimize the end-to-end image distortion within a given total transmission rate through memoryless channels. It provides unequal error protection by combining the forward error correction capability from channel codes and the error detection/localization functionality from JPEG2000 in an effective way. The proposed scheme generates quality scalable and error-resilient codestreams. It gives competitive performance with other existing schemes for JPEG2000 in the matched channel condition case and provides more graceful quality degradation for mismatched cases. Furthermore, both fixed-length source packets and fixed-length channel packets can be efficiently formed with the same algorithm. PMID:16121451

  1. Source-Search Sensitivity of a Large-Area, Coded-Aperture, Gamma-Ray Imager

    SciTech Connect

    Ziock, K P; Collins, J W; Craig, W W; Fabris, L; Lanza, R C; Gallagher, S; Horn, B P; Madden, N W; Smith, E; Woodring, M L

    2004-10-27

    We have recently completed a large-area, coded-aperture, gamma-ray imager for use in searching for radiation sources. The instrument was constructed to verify that weak point sources can be detected at considerable distances if one uses imaging to overcome fluctuations in the natural background. The instrument uses a rank-19, one-dimensional coded aperture to cast shadow patterns onto a 0.57 m² NaI(Tl) detector composed of 57 individual cubes each 10 cm on a side. These are arranged in a 19 x 3 array. The mask is composed of four-centimeter thick, one-meter high, 10-cm wide lead blocks. The instrument is mounted in the back of a small truck from which images are obtained as one drives through a region. Results of first measurements obtained with the system are presented.
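
    The instrument's rank-19 aperture is a one-dimensional uniformly redundant array; for prime length p the textbook construction uses quadratic residues, which yields the flat-sidelobe correlation that makes the shadowgram decodable. A minimal sketch follows (the standard quadratic-residue pattern, not necessarily the instrument's exact mask):

      # Sketch: length-19 quadratic-residue coded aperture and its two-valued
      # periodic autocorrelation (sharp peak at zero shift, flat sidelobes).
      import numpy as np

      p = 19
      residues = {(i * i) % p for i in range(1, p)}
      mask = np.array([1 if i in residues else 0 for i in range(p)])
      mask[0] = 1                       # common convention for element 0

      s = 2 * mask - 1                  # +/-1 decoding sequence
      corr = [int(np.sum(s * np.roll(s, k))) for k in range(p)]
      print(mask)                       # open (1) / opaque (0) mask elements
      print(corr)                       # [19, -1, -1, ..., -1]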

  2. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    PubMed

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery. PMID:22815713

  3. The computerization of programming: Ada (R) lessons learned

    NASA Technical Reports Server (NTRS)

    Struble, Dennis D.

    1986-01-01

    One of the largest systems yet written in Ada has been constructed. This system is the Intermetrics Ada compiler. Many lessons have been learned during the implementation of this Ada compiler. Some of these lessons are described, concentrating on those relevant to large system implementations. The characteristics of the Ada compiler implementation project at Intermetrics are also described, and some specific experiences during the implementation are pointed out.

  4. The NASA Langley Research Center 0.3-meter transonic cryogenic tunnel microcomputer controller source code

    NASA Technical Reports Server (NTRS)

    Kilgore, W. Allen; Balakrishna, S.

    1991-01-01

    The 0.3 m Transonic Cryogenic Tunnel (TCT) microcomputer-based controller has been operating for several thousand hours in a safe and efficient manner. A complete listing of the source codes for the tunnel controller and tunnel simulator is provided, along with a listing of all the variables used in these programs. Several changes made to the controller are described; these changes improve the controller's ease of use and safety.

  5. A source-channel coding approach to digital image protection and self-recovery.

    PubMed

    Sarreshtedari, Saeed; Akhaee, Mohammad Ali

    2015-07-01

    Watermarking algorithms have been widely applied to the field of image forensics recently. One such forensic application is the protection of images against tampering. For this purpose, we need to design a watermarking algorithm fulfilling two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper is aimed at showing that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is dedicated to three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, erasure locations detected by check bits help the channel erasure decoder to retrieve the original source-encoded image. Experimental results show that our proposed scheme significantly outperforms recent techniques in terms of image quality for both watermarked and recovered images. The watermarked image quality gain is achieved by spending less bit-budget on the watermark, while image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes. PMID:25807568
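
    The paper's channel code is matched to the watermark bit-budget; purely as a toy stand-in, the sketch below shows the erasure-channel idea with a single XOR parity block: once the check bits have located the tampered block, its reference bits are recoverable exactly. Block sizes and layout are invented, and a real design would use a stronger code (e.g., Reed-Solomon).

      # Sketch: recover one erased block with XOR parity, assuming the
      # tampering-detection step has already located the erased block.
      import numpy as np

      blocks = [np.random.randint(0, 2, 8) for _ in range(4)]  # reference bits
      parity = np.bitwise_xor.reduce(blocks)                   # parity block

      erased = 2                        # block index reported by the check bits
      survivors = [b for i, b in enumerate(blocks) if i != erased]
      recovered = np.bitwise_xor.reduce(survivors + [parity])
      assert np.array_equal(recovered, blocks[erased])
      print("recovered block", erased, ":", recovered)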

  6. Knowledge, programming, and programming cultures: LISP, C, and Ada

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    The results of research 'Ada as an implementation language for knowledge based systems' are presented. The purpose of the research was to compare Ada to other programming languages. The report focuses on the programming languages Ada, C, and Lisp, the programming cultures that surround them, and the programming paradigms they support.

  7. Code System for Calculating Alpha, N; Spontaneous Fission; and Delayed Neutron Sources and Spectra.

    2002-07-18

    Version: 04 SOURCES4C is a code system that determines neutron production rates and spectra from (alpha,n) reactions, spontaneous fission, and delayed neutron emission due to radionuclide decay. In this release the three-region problem was modified to correct several bugs, and new documentation was added to the package. Details are available in the included LA-UR-02-1617 (2002) report. The code is capable of calculating (alpha,n) source rates and spectra in four types of problems: homogeneous media (i.e., an intimate mixture of alpha-emitting source material and low-Z target material), two-region interface problems (i.e., a slab of alpha-emitting source material in contact with a slab of low-Z target material), three-region interface problems (i.e., a thin slab of low-Z target material sandwiched between alpha-emitting source material and low-Z target material), and (alpha,n) reactions induced by a monoenergetic beam of alpha-particles incident on a slab of target material. The process of creating a SOURCES input file (tape1) is streamlined with the Los Alamos SOURCES Tape1 Creator and Library Link (LASTCALL) Version 1. Intended to supplement the SOURCES manual, LASTCALL is a simple graphical user interface designed to minimize common errors made during input. The optional application, LASTCALL, consists of a single dialog window launched from an executable (lastcall.exe) on Windows-based personal computers.

  8. OpenAD/F : a modular, open-source tool for automatic differentiation of Fortran codes.

    SciTech Connect

    Utke, J.; Naumann, U.; Fagan, M.; Tallent, N.; Strout, M.; Heimbach, P.; Hill, C.; Wunsch, C.; Mathematics and Computer Science; Rheinisch Westfalische Technische Hochschule Aachen; Rice Univ.; Colorado State Univ.; MIT

    2008-01-01

    The OpenAD/F tool allows the evaluation of derivatives of functions defined by a Fortran program. The derivative evaluation is performed by a Fortran code resulting from the analysis and transformation of the original program that defines the function of interest. OpenAD/F has been designed with a particular emphasis on modularity, flexibility, and the use of open source components. While the code transformation follows the basic principles of automatic differentiation, the tool implements new algorithmic approaches at various levels, for example, for basic block preaccumulation and call graph reversal. Unlike most other automatic differentiation tools, OpenAD/F uses components provided by the OpenAD framework, which supports a comparatively easy extension of the code transformations in a language-independent fashion. It uses code analysis results implemented in the OpenAnalysis component. The interface to the language-independent transformation engine is an XML-based format, specified through an XML schema. The implemented transformation algorithms allow efficient derivative computations utilizing locally optimized cross-country sequences of vertex, edge, and face elimination steps. Specifically, for the generation of adjoint codes, OpenAD/F supports various code reversal schemes with hierarchical checkpointing at the subroutine level. As an example from geophysical fluid dynamics, a nonlinear, time-dependent, scalable yet simple barotropic ocean model is considered. OpenAD/F's reverse mode is applied to compute sensitivities of some of the model's transport properties with respect to gridded fields such as bottom topography as independent (control) variables.
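
    OpenAD/F's transformations operate on Fortran source; purely to illustrate what a generated adjoint computes, here is a tiny tape-based reverse-mode sketch in Python for y = sin(a * b), unrelated to OpenAD/F's actual machinery:

      # Sketch: minimal tape-based reverse-mode AD for y = sin(a * b).
      import math

      tape = []    # records (output, [(input, local partial), ...])

      class Var:
          def __init__(self, val):
              self.val, self.grad = val, 0.0
          def __mul__(self, other):
              out = Var(self.val * other.val)
              tape.append((out, [(self, other.val), (other, self.val)]))
              return out

      def vsin(x):
          out = Var(math.sin(x.val))
          tape.append((out, [(x, math.cos(x.val))]))
          return out

      a, b = Var(2.0), Var(3.0)
      y = vsin(a * b)
      y.grad = 1.0                            # seed the adjoint
      for out, partials in reversed(tape):    # reverse sweep over the tape
          for inp, d in partials:
              inp.grad += d * out.grad
      print(a.grad, b.grad)                   # b*cos(ab), a*cos(ab)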

  9. Comparing host and target environments for distributed Ada programs

    NASA Technical Reports Server (NTRS)

    Paulk, Mark C.

    1986-01-01

    The Ada programming language provides a means of specifying logical concurrency by using multitasking. Extending the Ada multitasking concurrency mechanism into a physically concurrent distributed environment which imposes its own requirements can lead to incompatibilities. These problems are discussed. Using distributed Ada for a target system may be appropriate, but when using the Ada language in a host environment, a multiprocessing model may be more suitable than retargeting an Ada compiler for the distributed environment. The tradeoffs between multitasking on distributed targets and multiprocessing on distributed hosts are discussed. Comparisons of the multitasking and multiprocessing models indicate different areas of application.

  10. Ada and software management in NASA: Assessment and recommendations

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Recent NASA missions have required software systems that are larger, more complex, and more critical than NASA software systems of the past. The Ada programming language and the software methods and support environments associated with it are seen as potential breakthroughs in meeting NASA's software requirements. The findings of a study by the Ada and Software Management Assessment Working Group (ASMAWG) are presented. The study was chartered to perform three tasks: (1) assess the agency's ongoing and planned Ada activities; (2) assess the infrastructure (standards, policies, and internal organizations) supporting software management and the Ada activities; and (3) present an Ada implementation and use strategy appropriate for NASA over the next 5 years.

  11. Interpreting observations of molecular outflow sources: the MHD shock code mhd_vode

    NASA Astrophysics Data System (ADS)

    Flower, D. R.; Pineau des Forêts, G.

    2015-06-01

    The planar MHD shock code mhd_vode has been developed in order to simulate both continuous (C) type shock waves and jump (J) type shock waves in the interstellar medium. The physical and chemical state of the gas in steady-state may also be computed and used as input to a shock wave model. The code is written principally in FORTRAN 90, although some routines remain in FORTRAN 77. The documented program and its input data are described and provided as supplementary material, and the results of exemplary test runs are presented. Our intention is to enable the interested user to run the code for any sensible parameter set and to comprehend the results. With applications to molecular outflow sources in mind, we have computed, and are making available as supplementary material, integrated atomic and molecular line intensities for grids of C- and J-type models; these computations are summarized in the Appendices. Appendix tables, a copy of the current version of the code, and the two model grids are available only at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/578/A63

  12. SEL Ada reuse analysis and representations

    NASA Technical Reports Server (NTRS)

    Kester, Rush

    1990-01-01

    Overall, it was revealed that the pattern of Ada reuse has evolved from initial reuse of utility components into reuse of generalized application architectures. Utility components were both domain-independent utilities, such as queues and stacks, and domain-specific utilities, such as those that implement spacecraft orbit and attitude mathematical functions and physics or astronomical models. The level of reuse was significantly increased with the development of a generalized telemetry simulator architecture. The use of Ada generics significantly increased the level of verbatim reuse, owing to the ability, using Ada generics, to parameterize the aspects of design that are configurable during reuse. A key factor in implementing generalized architectures was the ability to use generic subprogram parameters to tailor parts of the algorithm embedded within the architecture. The use of object oriented design (in which objects model real world entities) significantly improved the modularity for reuse. Encapsulating into packages the data and operations associated with common real world entities creates natural building blocks for reuse.

  13. Sources of financial pressure and up coding behavior in French public hospitals.

    PubMed

    Georgescu, Irène; Hartmann, Frank G H

    2013-05-01

    Drawing upon role theory and the literature concerning unintended consequences of financial pressure, this study investigates the effects of health care decision pressure from the hospital's administration and from the professional peer group on physicians' inclination to engage in up coding. We explore two kinds of up coding, information-related and action-related, and develop hypotheses that connect these kinds of data manipulation to the sources of pressure via the intermediate effect of role conflict. Qualitative data from initial interviews with physicians and subsequent questionnaire evidence from 578 physicians in 14 French hospitals suggest that the source of pressure is a relevant predictor of physicians' inclination to engage in data manipulation. We further find that this effect is partly explained by the extent to which these pressures create role conflict. Given the concern about up coding in treatment-based reimbursement systems worldwide, our analysis adds to understanding how the design of the hospital's management control system may enhance this undesired type of behavior. PMID:23477807

  14. Homozygosity for a novel adenosine deaminase (ADA) nonsense mutation (Q3>X) in a child with severe combined immunodeficiency (SCID)

    SciTech Connect

    Santisteban, I.; Arrendondo-Vega, F.X.; Kelly, S. |

    1994-09-01

    A Somali girl was diagnosed with ADA-deficient SCID at 7 mo; she responded well to PEG-ADA replacement and is now 3.3 yr old. ADA mRNA was undetectable (Northern) in her cultured T cells, but was present in T cells of her parents and two sibs. All PCR-amplified exon 1 genomic clones from the patient had a C>T transition at bp 7 relative to the start of translation, replacing Gln at codon 3 (AGA) with a termination codon (TGA, Q3>X). Patient cDNA (prepared by RT-PCR with a 5′ primer that covered codons 1-7) had a previously described polymorphism, K80>R, but was otherwise normal, indicating that no other coding mutations were present. A predicted new genomic BfaI restriction site was used to establish her homozygosity for Q3>X and to analyze genotypes of family members. We also analyzed the segregation of a variable Alu polyA-associated TAAA repeat (AluVpA) situated 5′ of the ADA gene. Three different AluVpA alleles were found, one of which was only present in the father and was not associated with his Q3>X allele. Because the father's RBCs had only ~15% of normal ADA activity, we analyzed his ADA cDNA. We found a G>A transition at bp 425 that substitutes Gln for Arg142, a solvent-accessible residue, and eliminates a BsmAI site in exon 5. ADA activity of the R142>Q in vitro translation product was 20-25% of the wild type ADA translation product, suggesting that R142>Q is a new 'partial' ADA deficiency mutation. As expected, Q3>X mRNA did not yield a detectable in vitro translation product. We conclude that the patient's father is a compound heterozygote carrying the ADA Q3>X/R142>Q genotype. 'Partial' ADA deficiency unassociated with immunodeficiency is relatively common in individuals of African descent. The present findings and previous observations suggest that 'partial' ADA deficiency may have had an evolutionary advantage.

  15. Evolving impact of Ada on a production software environment

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  16. User's Manual for the SOURCE1 and SOURCE2 Computer Codes: Models for Evaluating Low-Level Radioactive Waste Disposal Facility Source Terms (Version 2.0)

    SciTech Connect

    Icenhour, A.S.; Tharp, M.L.

    1996-08-01

    The SOURCE1 and SOURCE2 computer codes calculate source terms (i.e. radionuclide release rates) for performance assessments of low-level radioactive waste (LLW) disposal facilities. SOURCE1 is used to simulate radionuclide releases from tumulus-type facilities. SOURCE2 is used to simulate releases from silo-, well-, well-in-silo-, and trench-type disposal facilities. The SOURCE codes (a) simulate the degradation of engineered barriers and (b) provide an estimate of the source term for LLW disposal facilities. This manual summarizes the major changes that have been effected since the codes were originally developed.

  17. Ada education in a software life-cycle context

    NASA Technical Reports Server (NTRS)

    Clough, Anne J.

    1986-01-01

    Some of the experience gained from a comprehensive educational program undertaken at The Charles Stark Draper Lab. to introduce the Ada language and to transition modern software engineering technology into the development of Ada and non-Ada applications is described. Initially, a core group, which included managers, engineers, and programmers, received training in Ada. An Ada Office was established to assume the major responsibility for training, evaluation, acquisition and benchmarking of tools, and consultation on Ada projects. As a first step in this process, an in-house educational program was undertaken to introduce Ada to the Laboratory. Later, a software engineering course was added to the educational program as the need to address issues spanning the entire software life cycle became evident. Educational efforts to date are summarized, with an emphasis on the educational approach adopted. Finally, lessons learned in administering this program are addressed.

  18. Ada (trademark) projects at NASA. Runtime environment issues and recommendations

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.; Wilke, Randall W.

    1988-01-01

    Ada practitioners should use this document to discuss and establish common short term requirements for Ada runtime environments. The major current Ada runtime environment issues are identified through the analysis of some of the Ada efforts at NASA and other research centers. The runtime environment characteristics of major compilers are compared while alternate runtime implementations are reviewed. Modifications and extensions to the Ada Language Reference Manual to address some of these runtime issues are proposed. Three classes of projects focusing on the most critical runtime features of Ada are recommended, including a range of immediately feasible full scale Ada development projects. Also, a list of runtime features and procurement issues is proposed for consideration by the vendors, contractors and the government.

  19. Probabilistic positional association of catalogs of astrophysical sources: the Aspects code

    NASA Astrophysics Data System (ADS)

    Fioc, Michel

    2014-06-01

    We describe a probabilistic method of cross-identifying astrophysical sources in two catalogs from their positions and positional uncertainties. The probability that an object is associated with a source from the other catalog, or that it has no counterpart, is derived under two exclusive assumptions: first, the classical case of several-to-one associations, and then the more realistic but more difficult problem of one-to-one associations. In either case, the likelihood of observing the objects in the two catalogs at their effective positions is computed and a maximum likelihood estimator of the fraction of sources with a counterpart - a quantity needed to compute the probabilities of association - is built. When the positional uncertainty in one or both catalogs is unknown, this method may be used to estimate its typical value and even to study its dependence on the size of objects. It may also be applied when the true centers of a source and of its counterpart at another wavelength do not coincide. To compute the likelihood and association probabilities under the different assumptions, we developed a Fortran 95 code called Aspects ([aspɛ], "Association positionnelle/probabiliste de catalogues de sources" in French); its source files are made freely available. To test Aspects, all-sky mock catalogs containing up to 10^5 objects were created, forcing either several-to-one or one-to-one associations. The analysis of these simulations confirms that, in both cases, the assumption with the highest likelihood is the right one and that estimators of unknown parameters built for the appropriate association model are reliable. Available at http://www2.iap.fr/users/fioc/Aspects/. The Aspects code is also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/566/A8
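
    As a worked illustration of the positional part of the computation only (the code's full one-to-one treatment is more involved), the probability that an object is associated with a given candidate rather than having no counterpart combines a Gaussian term for the positional offset with the counterpart fraction f; the numbers and the uniform "no counterpart" density below are assumptions:

      # Sketch: association probability from positions and uncertainties.
      import numpy as np

      def p_association(dx, dy, sig1, sig2, f, area):
          # Gaussian likelihood of the offset if truly associated; the
          # positional variances of the two catalogs add.
          s2 = sig1 ** 2 + sig2 ** 2
          like_assoc = np.exp(-(dx ** 2 + dy ** 2) / (2 * s2)) / (2 * np.pi * s2)
          like_none = 1.0 / area        # uniform density if no counterpart
          num = f * like_assoc
          return num / (num + (1 - f) * like_none)

      # 0.5" offset, 0.3"/0.4" errors, counterpart fraction 0.8, one
      # candidate per 100 square arcsec -- all invented numbers.
      print(p_association(0.3, 0.4, 0.3, 0.4, f=0.8, area=100.0))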

  20. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    NASA Astrophysics Data System (ADS)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e., the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered: one that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
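
    As a minimal sketch of the Particle Swarm Optimization step alone (the distortion model, DS-CDMA interference terms, and the discrete source/channel rates are omitted), a basic PSO loop over continuous per-node transmit powers might look as follows, with an invented stand-in objective:

      # Sketch: basic PSO over per-node transmit powers (toy objective).
      import numpy as np

      rng = np.random.default_rng(0)
      n_particles, n_nodes, iters = 20, 4, 100
      lo, hi = 0.0, 1.0                      # power bounds (invented)

      def distortion(p):                     # stand-in for received distortion
          return np.sum((p - 0.3) ** 2) + 0.1 * np.sum(p)

      x = rng.uniform(lo, hi, (n_particles, n_nodes))
      v = np.zeros_like(x)
      pbest = x.copy()
      pbest_f = np.array([distortion(p) for p in x])
      gbest = pbest[np.argmin(pbest_f)].copy()

      for _ in range(iters):
          r1, r2 = rng.random(x.shape), rng.random(x.shape)
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
          x = np.clip(x + v, lo, hi)
          f = np.array([distortion(p) for p in x])
          better = f < pbest_f
          pbest[better], pbest_f[better] = x[better], f[better]
          gbest = pbest[np.argmin(pbest_f)].copy()

      print("best powers:", gbest)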

  1. Object-oriented programming with mixins in Ada

    NASA Technical Reports Server (NTRS)

    Seidewitz, ED

    1992-01-01

    Recently, I wrote a paper discussing the lack of 'true' object-oriented programming language features in Ada 83, why one might desire them in Ada, and how they might be added in Ada 9X. The approach I took in this paper was to build the new object-oriented features of Ada 9X as much as possible on the basic constructs and philosophy of Ada 83. The object-oriented features proposed for Ada 9X, while different in detail, are based on the same kind of approach. Further consideration of this approach led me on a long reflection on the nature of object-oriented programming and its application to Ada. The results of this reflection, presented in this paper, show how a fairly natural object-oriented style can indeed be developed even in Ada 83. The exercise of developing this style is useful for at least three reasons: (1) it provides a useful style for programming object-oriented applications in Ada 83 until new features become available with Ada 9X; (2) it demystifies many of the mechanisms that seem to be 'magic' in most object-oriented programming languages by making them explicit; and (3) it points out areas that are and are not in need of change in Ada 83 to make object-oriented programming more natural in Ada 9X. In the next four sections I will address in turn the issues of object-oriented classes, mixins, self-reference and supertyping. The presentation is through a sequence of examples. This results in some overlap with that paper, but all the examples in the present paper are written entirely in Ada 83. I will return to considerations for Ada 9X in the last section of the paper.

  2. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
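
    The OVERFLOW implementation itself is not reproduced in this record; the sketch below shows only the generic idea of a source-term model -- injecting jet mass/momentum into the discretized equations at chosen cells -- on a toy 1-D momentum update with invented constants:

      # Sketch: adding a micro-jet source term to a toy 1-D momentum update.
      import numpy as np

      nx, dx, dt = 50, 0.1, 0.01
      rho = np.ones(nx)                      # density (held fixed in this toy)
      u = np.zeros(nx)                       # velocity
      jet_cell, mdot, ujet = 25, 0.2, 1.0    # jet location/strength (invented)

      for _ in range(200):
          mom = rho * u                      # momentum
          flux = mom * u
          mom[1:] -= dt / dx * (flux[1:] - flux[:-1])  # upwind advection (u >= 0)
          # Source term: jet momentum (mass flow rate * jet velocity) added
          # to the jet's cell, divided by cell size as a finite-volume source.
          mom[jet_cell] += dt * mdot * ujet / dx
          u = mom / rho

      print("peak velocity near the jet:", u.max())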

  3. Documentation generator for VHDL and MatLab source codes for photonic and electronic systems

    NASA Astrophysics Data System (ADS)

    Niton, B.; Pozniak, K. T.; Romaniuk, R. S.

    2011-06-01

    The UML, which is a complex system modeling and description technology, has recently been expanding its uses in the field of formalization and algorithmic approaches to systems such as multiprocessor photonic, optoelectronic and advanced electronics carriers; distributed, multichannel measurement systems; optical networks; industrial electronics; and novel R&D solutions. The paper describes a new concept of software dedicated to documenting source codes written in VHDL and MatLab. The work starts with an analysis of available documentation generators for both programming languages, with an emphasis on open source solutions. The authors present their own solutions, which are based on the Doxygen program, available under a free license with its source code. Supporting tools for parser building, such as Bison and Flex, were used. The documentation generator application is used for the design of large optoelectronic and electronic measurement and control systems. The paper consists of three parts, which describe the following components of the documentation generator for photonic and electronic systems: concept, MatLab application, and VHDL application. This is part one, which describes the system concept. Part two describes the MatLab application; MatLab is used for description of the measured phenomena. Part three describes the VHDL application; VHDL is used for behavioral description of the optoelectronic system. The proposed approach and applications document big, complex software configurations for large systems.

  4. LENSED: a code for the forward reconstruction of lenses and sources from strong lensing observations

    NASA Astrophysics Data System (ADS)

    Tessore, Nicolas; Bellagamba, Fabio; Metcalf, R. Benton

    2016-09-01

    Robust modelling of strong lensing systems is fundamental to exploit the information they contain about the distribution of matter in galaxies and clusters. In this work, we present LENSED, a new code which performs forward parametric modelling of strong lenses. LENSED takes advantage of a massively parallel ray-tracing kernel to perform the necessary calculations on a modern graphics processing unit (GPU). This makes the precise rendering of the background lensed sources much faster, and allows the simultaneous optimisation of tens of parameters for the selected model. With a single run, the code is able to obtain the full posterior probability distribution for the lens light, the mass distribution and the background source at the same time. LENSED is first tested on mock images which reproduce realistic space-based observations of lensing systems. In this way, we show that it is able to recover unbiased estimates of the lens parameters, even when the sources do not follow exactly the assumed model. Then, we apply it to a subsample of the SLACS lenses, in order to demonstrate its use on real data. The results generally agree with the literature, and highlight the flexibility and robustness of the algorithm.

  5. Source Listings for Computer Code SPIRALI Incompressible, Turbulent Spiral Grooved Cylindrical and Face Seals

    NASA Technical Reports Server (NTRS)

    Walowit, Jed A.; Shapiro, Wibur

    2005-01-01

    This is the source listing of the computer code SPIRALI which predicts the performance characteristics of incompressible cylindrical and face seals with or without the inclusion of spiral grooves. Performance characteristics include load capacity (for face seals), leakage flow, power requirements and dynamic characteristics in the form of stiffness, damping and apparent mass coefficients in 4 degrees of freedom for cylindrical seals and 3 degrees of freedom for face seals. These performance characteristics are computed as functions of seal and groove geometry, load or film thickness, running and disturbance speeds, fluid viscosity, and boundary pressures.

  6. Languages for artificial intelligence: Implementing a scheduler in LISP and in Ada

    NASA Technical Reports Server (NTRS)

    Hays, Dan

    1988-01-01

    A prototype scheduler for space experiments, originally programmed in a dialect of LISP using some of the more traditional techniques of that language, was recast using an object-oriented LISP, Common LISP with Flavors, on the Symbolics. This object-structured version was in turn partially implemented in Ada. The Flavors version showed a decided improvement in both speed of execution and readability of code. The recasting into Ada involved various practical problems of implementation as well as certain challenges of reconceptualization in going from one language to the other. Advantages were realized, however, in greater clarity of the code, especially where more standard flow of control was used. This exercise raised issues about the influence of programming language on the design of flexible and sensitive programs such as schedule planners, and called attention to the importance of factors external to the languages themselves, such as system embeddedness, hardware context, and programmer practice.

  7. QUEST/Ada: Query utility environment for software testing of Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.

    1989-01-01

    Results of research and development efforts are presented for Task 1, Phase 2 of a general project entitled, The Development of a Program Analysis Environment for Ada. A prototype of the QUEST/Ada system was developed to collect data to determine the effectiveness of the rule-based testing paradigm. The prototype consists of five parts: the test data generator, the parser/scanner, the test coverage analyzer, a symbolic evaluator, and a data management facility, known as the Librarian. These components are discussed at length. Also presented is an experimental design for the evaluations, an overview of the project, and a schedule for its completion.

  8. Compressed X-ray phase-contrast imaging using a coded source

    NASA Astrophysics Data System (ADS)

    Sung, Yongjin; Xu, Ling; Nagarkar, Vivek; Gupta, Rajiv

    2014-12-01

    X-ray phase-contrast imaging (XPCI) holds great promise for medical X-ray imaging with high soft-tissue contrast. Obviating optical elements in the imaging chain, propagation-based XPCI (PB-XPCI) has definite advantages over other XPCI techniques in terms of cost, alignment and scalability. However, it imposes strict requirements on the spatial coherence of the source and the resolution of the detector. In this study, we demonstrate that using a coded X-ray source and sparsity-based reconstruction, we can significantly relax these requirements. Using numerical simulation, we assess the feasibility of our approach and study the effect of system parameters on the reconstructed image. The results are demonstrated with images obtained using a bench-top micro-focus XPCI system.

  9. GRODY - GAMMA RAY OBSERVATORY DYNAMICS SIMULATOR IN ADA

    NASA Technical Reports Server (NTRS)

    Stark, M.

    1994-01-01

    The analyst can send results, in graphical or tabular form, to a terminal, disk, or hardcopy device, and can choose to have any or all items plotted against time or against each other. Goddard researchers developed GRODY on a VAX 8600 running VMS version 4.0. For near real time performance, GRODY requires a VAX at least as powerful as a model 8600 running VMS 4.0 or a later version. To use GRODY, the VAX needs an Ada Compilation System (ACS), Code Management System (CMS), and 1200K memory. GRODY is written in Ada and FORTRAN.

  10. Digital data sets that describe aquifer characteristics of the Vamoosa-Ada aquifer in east-central Oklahoma

    USGS Publications Warehouse

    Abbott, Marvin M.; Runkle, D.L.; Rea, Alan

    1997-01-01

    This diskette contains digitized aquifer boundaries and maps of hydraulic conductivity, recharge, and ground-water level elevation contours for the Vamoosa-Ada aquifer in east-central Oklahoma. The Vamoosa-Ada aquifer is an important source of water that underlies about 2,320-square miles of parts of Osage, Pawnee, Payne, Creek, Lincoln, Okfuskee, and Seminole Counties. Approximately 75 percent of the water withdrawn from the Vamoosa-Ada aquifer is for municipal use. Rural domestic use and water for stock animals account for most of the remaining water withdrawn. The Vamoosa-Ada aquifer is defined in a ground-water report as consisting principally of the rocks of the Late Pennsylvanian-age Vamoosa Formation and overlying Ada Group. The Vamoosa-Ada aquifer consists of a complex sequence of fine- to very fine-grained sandstone, siltstone, shale, and conglomerate interbedded with very thin limestones. The water-yielding capabilities of the aquifer are generally controlled by lateral and vertical distribution of the sandstone beds and their physical characteristics. The Vamoosa-Ada aquifer is unconfined where it outcrops in about an 1,700-square-mile area. Most of the lines in the aquifer boundary, hydraulic conductivity, and recharge data sets were extracted from published digital surficial geology data sets based on a scale of 1:250,000, and represent geologic contacts. Some of lines in the data sets were interpolated in areas where the Vamoosa-Ada aquifer is overlain by alluvial and terrace deposits near streams and rivers. These data sets include only the outcrop area of the Vamoosa-Ada aquifer and where the aquifer is overlain by alluvial and terrace deposits. The hydraulic conductivity value and recharge rate are from a ground-water report about the Vamoosa-Ada aquifer. The water-level elevation contours were digitized from a mylar map, at a scale of 1:250,000, used to publish a plate in a ground-water report about the Vamoosa-Ada

  11. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  12. Open Source Physics: Code and Curriculum Material for Teachers, Authors, and Developers

    NASA Astrophysics Data System (ADS)

    Christian, Wolfgang

    2004-03-01

    The continued use of procedural languages in education is due in part to the lack of up-to-date curricular materials that combine science topics with an object-oriented programming framework. Although there are many resources for teaching computational physics, few are object-oriented. What is needed by the broader science education community is not another computational physics, numerical analysis, or Java programming book (although such books are essential for discipline-specific practitioners), but a synthesis of curriculum development, computational physics, computer science, and physics education that will be useful for scientists and students wishing to write their own simulations and develop their own curricular material. The Open Source Physics (OSP) project was established to meet this need. OSP is an NSF-funded curriculum development project that is developing and distributing a code library, programs, and examples of computer-based interactive curricular material. In this talk, we will describe this library, demonstrate its use, and report on its adoption by curriculum authors. The Open Source Physics code library, documentation, and sample curricular material can be downloaded from http://www.opensourcephysics.org/. Partial funding for this work was obtained through NSF grant DUE-0126439.

  13. PRIMUS: a computer code for the preparation of radionuclide ingrowth matrices from user-specified sources

    SciTech Connect

    Hermann, O.W.; Baes, C.F. III; Miller, C.W.; Begovich, C.L.; Sjoreen, A.L.

    1984-10-01

    The computer program, PRIMUS, reads a library of radionuclide branching fractions and half-lives and constructs a decay-chain data library and a problem-specific decay-chain data file. PRIMUS reads the decay data compiled for 496 nuclides from the Evaluated Nuclear Structure Data File (ENSDF). The ease of adding radionuclides to the input library allows the CRRIS system to further expand its comprehensive data base. The decay-chain library produced is input to the ANEMOS code. Also, PRIMUS produces a data set reduced to only the decay chains required in a particular problem, for input to the SUMIT, TERRA, MLSOIL, and ANDROS codes. Air concentrations and deposition rates are computed from the PRIMUS decay-chain data file. Source term data may be entered directly to PRIMUS to be read by MLSOIL, TERRA, and ANDROS. The decay-chain data prepared by PRIMUS are needed for a matrix-operator method that computes time-dependent decay products either from an initial concentration or from a constant input source. This document describes the input requirements and the output obtained. Also, sections are included on methods, applications, subroutines, and sample cases. A short appendix indicates a method of utilizing PRIMUS and the associated decay subroutines from TERRA or ANDROS for applications to other decay problems. 18 references.
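
    The matrix-operator method referred to here can be sketched generically (a minimal Bateman-equation illustration with made-up decay constants, not PRIMUS itself): writing the chain as dN/dt = A N + S, where A carries decay constants and branching fractions, the solution follows from the matrix exponential.

        import numpy as np
        from scipy.linalg import expm

        # Illustrative three-nuclide chain 0 -> 1 -> 2 (branching fraction 1).
        lam = np.array([1e-2, 5e-3, 1e-4])   # decay constants (1/s), made up
        A = np.diag(-lam)
        A[1, 0] = lam[0]                     # nuclide 0 feeds nuclide 1
        A[2, 1] = lam[1]                     # nuclide 1 feeds nuclide 2

        def concentrations(N0, t, S=None):
            """N(t) for dN/dt = A N + S, from initial N0 and constant source S."""
            E = expm(A * t)
            N = E @ N0
            if S is not None:
                # particular solution: A^-1 (e^{At} - I) S
                N = N + np.linalg.solve(A, (E - np.eye(len(N0))) @ S)
            return N

        print(concentrations(np.array([1.0, 0.0, 0.0]), t=600.0))

    Either an initial inventory or a constant input source (or both) can be propagated with the same operator, which matches the two uses named in the abstract.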

  14. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    PubMed

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy. PMID:24710398
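
    The idea can be sketched in a few lines (a toy example assuming an orthonormal DCT as the sparsifying transform; the paper's actual coding scheme and basis may differ): transform the dense matrix, discard small coefficients, then apply matrix-vector products through fast transforms plus one sparse product.

        import numpy as np
        from scipy.fft import dct, idct
        from scipy.sparse import csr_matrix

        rng = np.random.default_rng(0)
        n = 256
        # Stand-in for a dense, smoothly space-varying convolution matrix.
        i = np.arange(n)
        K = np.exp(-np.abs(i[:, None] - i[None, :]) / (10.0 + 0.05 * i[None, :]))

        # "Source-code" the matrix: 2-D orthonormal DCT, then lossy thresholding.
        Kt = dct(dct(K, axis=0, norm='ortho'), axis=1, norm='ortho')
        Kt[np.abs(Kt) < 1e-3 * np.abs(Kt).max()] = 0.0
        Ks = csr_matrix(Kt)
        print(f"kept {Ks.nnz / n**2:.1%} of the matrix entries")

        def apply_K(x):
            # y = C^T Kt (C x): two fast transforms plus one sparse product.
            return idct(Ks @ dct(x, norm='ortho'), norm='ortho')

        x = rng.standard_normal(n)
        print(np.linalg.norm(apply_K(x) - K @ x) / np.linalg.norm(K @ x))

    Because the smooth kernel compresses well in the transform domain, the sparse core retains only a small fraction of the entries, which is the source of both the memory and the computation savings described above.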

  15. Association of G22A and A4223C ADA1 gene polymorphisms and ADA activity with PCOS.

    PubMed

    Salehabadi, Mahshid; Farimani, Marzieh; Tavilani, Heidar; Ghorbani, Marzieh; Poormonsefi, Faranak; Poorolajal, Jalal; Shafiei, Gholamreza; Ghasemkhani, Neda; Khodadadi, Iraj

    2016-06-01

    Adenosine deaminase-1 (ADA1) regulates the concentration of adenosine as the main modulator of oocyte maturation. There is compelling evidence for the association of ADA1 gene polymorphisms with many diseases but the importance of ADA1 polymorphisms in polycystic ovary syndrome (PCOS) has not been studied before. This study investigates serum total ADA activity (tADA), ADA1 and ADA2 isoenzyme activities, and genotype and allele frequencies of G22A and A4223C polymorphisms in healthy and PCOS women. In this case-control study 200 PCOS patients and 200 healthy women were enrolled. Genomic DNA was extracted from whole blood and the PCR-RFLP technique was used to determine the G22A and A4223C variants. The genotype frequencies were calculated and the associations between polymorphic genotypes and enzyme activities were determined. tADA activity was significantly lower in the PCOS group compared with the control group (27.76±6.0 vs. 39.63±7.48, respectively). PCOS patients also showed reduced activity of ADA1 and ADA2. PCOS was not associated with the G22A polymorphism, whereas the AA, AC, and CC genotypes of the A4223C polymorphism were distributed differently between the control and the PCOS women, where the C allele showed a strong protective role for PCOS (odds ratio=1.876, p=0.033). The present study for the first time showed that lower ADA activity may be involved in the pathogenesis of PCOS by maintaining a higher concentration of adenosine affecting follicular growth. As a novel finding, we also showed great differences in genotype distribution and allele frequencies of the A4223C polymorphism between groups, indicating a protective role for the C allele against PCOS. Abbreviations: ADA, adenosine deaminase; PCOS, polycystic ovary syndrome; PCR-RFLP, polymerase chain reaction-restriction fragment length polymorphism; tADA, total adenosine deaminase. PMID:26980102

  16. Effectiveness Evaluation of Skin Covers against Intravascular Brachytherapy Sources Using VARSKIN3 Code

    PubMed Central

    Baghani, H R; Nazempour, A R; Aghamiri, S M R; Hosseini Daghigh, S M; Mowlavi, A A

    2013-01-01

    Background and Objective: The most common intravascular brachytherapy sources include 32P, 188Re, 106Rh and 90Sr/90Y. In this research, skin absorbed dose for different covering materials in dealing with these sources was evaluated, and the best covering material for skin protection and reduction of the dose absorbed by radiation staff was identified and recommended. Method: Four materials, including polyethylene, cotton, and two different kinds of plastic, were proposed as skin covers, and the skin absorbed dose at different depths for each material was calculated separately using the VARSKIN3 code. Results: The results suggested that for all sources, skin absorbed dose was minimized when using polyethylene. Considering this material as skin cover, maximum and minimum doses at the skin surface were related to 90Sr/90Y and 106Rh, respectively. Conclusion: Polyethylene was found to be the most effective cover in reducing skin dose and protecting the skin. Furthermore, the good agreement between the results of VARSKIN3 and other experimental measurements indicated that VARSKIN3 is a powerful tool for skin dose calculations when working with beta emitter sources. Therefore, it can be utilized in dealing with the issue of radiation protection. PMID:25505758

  17. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  18. Towards a formal semantics for Ada 9X

    NASA Technical Reports Server (NTRS)

    Guaspari, David; Mchugh, John; Wolfgang, Polak; Saaltink, Mark

    1995-01-01

    The Ada 9X language precision team was formed during the revisions of Ada 83, with the goal of analyzing the proposed design, identifying problems, and suggesting improvements, through the use of mathematical models. This report defines a framework for formally describing Ada 9X, based on Kahn's 'natural semantics', and applies the framework to portions of the language. The proposals for exceptions and optimization freedoms are also analyzed, using a different technique.
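
    For orientation, a Kahn-style natural semantics defines judgments of the form <statement, state> ⇓ state' by inference rules; a generic sequencing rule (illustrative only, not one of the report's actual rules for Ada 9X) would be written in LaTeX as:

        \frac{\langle s_1, \sigma \rangle \Downarrow \sigma' \qquad
              \langle s_2, \sigma' \rangle \Downarrow \sigma''}
             {\langle s_1;\, s_2, \sigma \rangle \Downarrow \sigma''}

    Each language construct gets rules of this shape, so the semantics of a whole program is a derivation tree built from them.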

  19. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

    The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed, the goals and principles of software engineering are applied, and an introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crisis; the mandate of the Space Station Program; the software life cycle model; software engineering; and Ada under the software engineering umbrella.

  20. ADA (adenosine deaminase) gene therapy enters the competition

    SciTech Connect

    Culliton, B.J.

    1990-08-31

    Around the world, some 70 children are members of a select and deadly club. Born with an immune deficiency so severe that they will die of infection unless their immune systems can be repaired, they have captured the attention of would-be gene therapists who believe that a handful of these kids--the 15 or 20 who lack functioning levels of the enzyme adenosine deaminase (ADA)--could be saved by a healthy ADA gene. A team of gene therapists is ready to put the theory to the test. In April 1987, a team of NIH researchers headed by R. Michael Blaese and W. French Anderson came up with the first formal protocol to introduce a healthy ADA gene into an unhealthy human. After 3 years of line-by-line scrutiny by five review committees, they have permission to go ahead. Two or three children will be treated in the next year, and will be infused with T lymphocytes carrying the gene for ADA. If the experiment works, the ADA gene will begin producing normal amounts of ADA. An interesting feature of ADA deficiency, that makes it ideal for initial gene studies, is that the amount of ADA one needs for a healthy immune system is quite variable. Hence, once inside a patient's T cells, the new ADA gene needs only to express the enzyme in moderate amounts. No precise gene regulation is necessary.

  1. Implementation of a production Ada project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Godfrey, Sara; Brophy, Carolyn Elizabeth

    1989-01-01

    The use of the Ada language and design methodologies that encourage full use of its capabilities have a strong impact on all phases of the software development project life cycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The differences observed during the implementation, unit testing, and integration phases of the two projects are described and the lessons learned during the implementation phase of the Ada development are outlined. Included are recommendations for future Ada development projects.

  2. Multiple-source models for electron beams of a medical linear accelerator using BEAMDP computer code

    PubMed Central

    Jabbari, Nasrollah; Barati, Amir Hoshang; Rahmatnezhad, Leili

    2012-01-01

    Aim: The aim of this work was to develop multiple-source models for electron beams of the NEPTUN 10PC medical linear accelerator using the BEAMDP computer code. Background: One of the most accurate techniques of radiotherapy dose calculation is the Monte Carlo (MC) simulation of radiation transport, which requires detailed information of the beam in the form of a phase-space file. The computing time required to simulate the beam data and obtain phase-space files from a clinical accelerator is significant. Calculation of dose distributions using multiple-source models is an alternative method to phase-space data as direct input to the dose calculation system. Materials and methods: Monte Carlo simulation of the accelerator head was done, in which a record was kept of the particle phase-space regarding the details of the particle history. Multiple-source models were built from the phase-space files of the Monte Carlo simulations. These simplified beam models were used to generate Monte Carlo dose calculations and to compare those calculations with phase-space data for electron beams. Results: Comparison of the measured and calculated dose distributions using the phase-space files and multiple-source models for three electron beam energies showed that the measured and calculated values match each other well throughout the curves. Conclusion: It was found that dose distributions calculated using both the multiple-source models and the phase-space data agree within 1.3%, demonstrating that the models can be used for dosimetry research purposes and dose calculations in radiotherapy. PMID:24377026

  3. Coded apertures allow high-energy x-ray phase contrast imaging with laboratory sources

    NASA Astrophysics Data System (ADS)

    Ignatyev, K.; Munro, P. R. T.; Chana, D.; Speller, R. D.; Olivo, A.

    2011-07-01

    This work analyzes the performance of the coded-aperture based x-ray phase contrast imaging approach, showing that it can be used at high x-ray energies with acceptable exposure times. Due to limitations in the source used, we show images acquired at tube voltages of up to 100 kVp; however, there is no intrinsic reason why the method could not be extended to even higher energies. In particular, we show quantitative agreement between the contrast extracted from the experimental x-ray images and the theoretical one, determined by the behavior of the material's refractive index as a function of energy. This proves that all energies in the used spectrum contribute to the image formation, and also that there are no additional factors affecting image contrast as the x-ray energy is increased. We also discuss the method's flexibility by displaying and analyzing the first set of images obtained while varying the relative displacement between coded-aperture sets, which leads to image variations to some extent similar to those observed when changing the crystal angle in analyzer-based imaging. Finally, we discuss the method's possible advantages in terms of simplification of the set-up, scalability, reduced exposure times, and complete achromaticity. We believe this would be helpful in applications requiring the imaging of highly absorbing samples, e.g., materials science and security inspection, and, by way of example, we demonstrate a possible application in the latter.

  4. Living Up to the Code's Exhortations? Social Workers' Political Knowledge Sources, Expectations, and Behaviors.

    PubMed

    Felderhoff, Brandi Jean; Hoefer, Richard; Watson, Larry Dan

    2016-01-01

    The National Association of Social Workers' (NASW's) Code of Ethics urges social workers to engage in political action. However, little recent research has been conducted to examine whether social workers support this admonition and the extent to which they actually engage in politics. The authors gathered data from a survey of social workers in Austin, Texas, to address three questions. First, because keeping informed about government and political news is an important basis for action, the authors asked what sources of knowledge social workers use. Second, they asked what the respondents believe are appropriate political behaviors for other social workers and NASW. Third, they asked for self-reports regarding respondents' own political behaviors. Results indicate that social workers use the Internet and traditional media services to stay informed; expect other social workers and NASW to be active; and are, overall, more active than the general public in many types of political activities. The comparisons between respondents' expectations for others and their own behaviors yield interestingly complex results. Social workers should strive for higher levels of adherence to the code's urgings on political activity. Implications for future work are discussed. PMID:26897996

  5. An Open-Source, Pseudo-Spectral Convection Code for O(10^5) Cores

    NASA Astrophysics Data System (ADS)

    Featherstone, N. A.

    2014-12-01

    Spectral algorithms are a popular choice for modeling systems of turbulent, incompressible flow, due in part to their inherent numerical accuracy and also, as in the case of the sphere, geometrical considerations. These advantages must be weighed against the high cost of communication, however, as any time step taken by a spectral method will typically require multiple, global reorganizations (i.e., transposes) of the distributed flow fields and thermal variables. As more processors are employed in the solution of a particular problem, the total computation time decreases, but the number of inter-processor messages initiated increases. It is this property of spectral algorithms that ultimately limits their parallel scalability because, for any given problem size, there exists a sufficiently large process count such that the message initiation time overwhelms any gains in computation time. I will discuss the parallelization of a community-sourced spectral code that has been designed to mitigate this problem by minimizing the number of messages initiated within a single time step. The resulting algorithm possesses efficient strong scalability for problems both small (512^3 grid points, 16,000 cores) and large (2048^3 grid points, 130,000 cores). This code, named Rayleigh, has been designed with the study of planetary and stellar dynamos in mind, and can efficiently simulate anelastic MHD convection within both spherical and Cartesian geometries. Rayleigh is being developed through the Computational Infrastructure for Geodynamics (UC Davis), and will be made publicly available in winter of 2015.
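
    The global reorganizations described above can be mimicked serially in a few lines (a single-process stand-in; in an MPI pseudo-spectral code each transpose below becomes a global all-to-all exchange): to apply FFTs along a dimension that is split across processes, the data must first be transposed so that dimension becomes local.

        import numpy as np

        # 2-D pseudo-spectral y-derivative of a field stored "row-distributed":
        # x is local to each process, y is split across processes.
        nx = ny = 64
        x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
        y = np.linspace(0, 2 * np.pi, ny, endpoint=False)
        f = np.sin(x)[:, None] * np.cos(y)[None, :]

        ik = 1j * np.fft.fftfreq(ny, d=1.0 / ny)  # i * integer wavenumbers

        g = f.T.copy()                            # transpose 1: all-to-all in MPI
        g = np.fft.ifft(ik[:, None] * np.fft.fft(g, axis=0), axis=0)
        dfdy = g.T.real                           # transpose 2: all-to-all in MPI

        assert np.allclose(dfdy, -np.sin(x)[:, None] * np.sin(y)[None, :])

    Two such exchanges per transformed direction per time step are exactly the messages that the abstract's message-minimizing algorithm is designed to amortize.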

  6. Using the EGS4 computer code to determine radiation sources along beam lines at electron accelerators

    SciTech Connect

    Mao, S.; Liu, J.; Nelson, W.R.

    1992-01-01

    The EGS computer code, developed for the Monte Carlo simulation of the transport of electrons and photons, has been used since 1970 in the design of accelerators and detectors for high-energy physics. In this paper we present three examples demonstrating how the current version, EGS4, is used to determine energy-loss patterns and source terms along beam pipes (i.e., including flanges, collimators, etc.). This information is useful for further shielding and dosimetry studies. The calculated results from the analysis are in close agreement with the measured values. To facilitate this review, a new add-on package called SHOWGRAF is used to display shower trajectories for the three examples.

  7. RIES - Rijnland Internet Election System: A Cursory Study of Published Source Code

    NASA Astrophysics Data System (ADS)

    Gonggrijp, Rop; Hengeveld, Willem-Jan; Hotting, Eelco; Schmidt, Sebastian; Weidemann, Frederik

    The Rijnland Internet Election System (RIES) is a system designed for voting in public elections over the internet. A rather cursory scan of the source code to RIES showed a significant lack of security-awareness among the programmers which - among other things - appears to have left RIES vulnerable to near-trivial attacks. If it had not been for independent studies finding problems, RIES would have been used in the 2008 Water Board elections, possibly handling a million votes or more. While RIES was more extensively studied to find cryptographic shortcomings, our work shows that more down-to-earth secure design practices can be at least as important, and that these aspects need to be examined much sooner than right before an election.

  8. A design for a reusable Ada library

    NASA Technical Reports Server (NTRS)

    Litke, John D.

    1986-01-01

    A goal of the Ada language standardization effort is to promote reuse of software, implying the existence of substantial software libraries and the storage/retrieval mechanisms to support them. A searching/cataloging mechanism is proposed that permits full or partial distribution of the database, adapts to a variety of searching mechanisms, permits a changing taxonomy with minimal disruption, and minimizes the requirement for specialized cataloger/indexer skills. The important observation is that key words serve not only as an indexing mechanism, but also as an identification mechanism, especially via concatenation, and as support for a searching mechanism. By deliberately separating these multiple uses, the modifiability and ease of growth that current libraries require are achieved.

  9. The ADA and IDEA Basics: Inclusion of Children with Disabilities

    ERIC Educational Resources Information Center

    Motwani, Mona

    2007-01-01

    This article discusses the Americans with Disabilities Act (ADA) and the Individuals with Disabilities Education Act (IDEA). The ADA is a federal civil rights law that was passed in 1990 with the aim of securing equal rights for persons with disabilities in the employment, housing, government, transportation, and public accommodation contexts. It…

  10. Common ADA Errors and Omissions in New Construction and Alterations.

    ERIC Educational Resources Information Center

    Department of Justice, Washington, DC. Civil Rights Div.

    The Americans with Disabilities Act (ADA) 1990 includes a provision requiring that new construction and alterations to existing facilities comply with the ADA Standards for Accessible Design. This report explains 23 common accessibility errors or omissions that the Department of Justice has identified during the course of its enforcement efforts.…

  11. Artificial Intelligence in ADA: Pattern-Directed Processing. Final Report.

    ERIC Educational Resources Information Center

    Reeker, Larry H.; And Others

    To demonstrate to computer programmers that the programming language Ada provides superior facilities for use in artificial intelligence applications, the three papers included in this report investigate the capabilities that exist within Ada for "pattern-directed" programming. The first paper (Larry H. Reeker, Tulane University) is designed to…

  12. Alma Flor Ada and the Quest for Change

    ERIC Educational Resources Information Center

    Manna, Anthony L.; Hill, Janet; Kellogg, Kathy

    2004-01-01

    Alma Flor Ada, a folklorist, novelist, scholar, teacher, and children's book author, has a passionate dedication to education for social justice, equality, and peace. As a faculty member at the University of San Francisco, Ada has developed programs that help students and others transform their lives and has written several bilingual legends and…

  13. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    49 CFR § 37.123, ADA paratransit eligibility: Standards. Title 49 (Transportation), Office of the Secretary of Transportation, Transportation Services for Individuals with Disabilities (ADA), Paratransit as a Complement to Fixed Route Service, § 37.123...

  14. Translation and execution of distributed Ada programs - Is it still Ada?

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Mudge, Trevor N.; Buzzard, Gregory D.; Krishnan, Padmanabhan

    1987-01-01

    Some of the fundamental issues and tradeoffs for distributed execution systems for the Ada language are examined. Steps that need to be taken to deal with heterogeneity of addressing program objects, of processing resources, and of the individual processor environment are considered. The ways in which program elements can be assigned are examined in the context of four issues: implied remote object access, object visibility and recursive execution, task termination problems, and distributed types.

  15. 76 FR 38124 - Applications for New Awards; Americans With Disabilities Act (ADA) National Network Regional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    Applications for New Awards; Americans With Disabilities Act (ADA) National Network Regional Centers and ADA National Network Collaborative Research Projects. AGENCY: Office of Special Education and Rehabilitative Services. Program: Disability Rehabilitation Research Projects (DRRP), ADA National Network Regional...

  16. First International Conference on Ada (R) Programming Language Applications for the NASA Space Station, volume 1

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L. (Editor)

    1986-01-01

    Topics discussed include: test and verification; environment issues; distributed Ada issues; life cycle issues; Ada in Europe; management/training issues; common Ada interface set; and run time issues.

  17. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-01-01

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and the other codes at distances of 6 cm from the source for 103Pd and 10 cm for 125I, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes. PMID:27074460
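
    For context, the radial dose function being compared is the standard TG-43 quantity (general formalism, not specific to this paper), with reference point r_0 = 1 cm and theta_0 = pi/2:

        g_L(r) = \frac{\dot{D}(r, \theta_0)}{\dot{D}(r_0, \theta_0)} \cdot
                 \frac{G_L(r_0, \theta_0)}{G_L(r, \theta_0)}

    Here \dot{D} is the dose rate and G_L the line-source geometry function; since G_L is purely analytic, code-to-code discrepancies in g_L(r) trace back to the cross-section libraries and transport physics, which is the point of the comparison above.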

  18. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also support the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  19. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
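
    The proof rule underlying this decomposition is the classical assume-guarantee rule from the model-checking literature (shown here for orientation; the work described above is about generating the assumption A automatically and reusing it at the code level):

        \frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
              \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
             {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}

    Read: if component M_1 guarantees property P under assumption A, and component M_2 discharges A, then the composed system M_1 || M_2 satisfies P, so neither component ever has to be checked against the full state space of the other.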

  20. PyVCI: A flexible open-source code for calculating accurate molecular infrared spectra

    NASA Astrophysics Data System (ADS)

    Sibaev, Marat; Crittenden, Deborah L.

    2016-06-01

    The PyVCI program package is a general purpose open-source code for simulating accurate molecular spectra, based upon force field expansions of the potential energy surface in normal mode coordinates. It includes harmonic normal coordinate analysis and vibrational configuration interaction (VCI) algorithms, implemented primarily in Python for accessibility but with time-consuming routines written in C. Coriolis coupling terms may be optionally included in the vibrational Hamiltonian. Non-negligible VCI matrix elements are stored in sparse matrix format to alleviate the diagonalization problem. CPU and memory requirements may be further controlled by algorithmic choices and/or numerical screening procedures, and recommended values are established by benchmarking using a test set of 44 molecules for which accurate analytical potential energy surfaces are available. Force fields in normal mode coordinates are obtained from the PyPES library of high quality analytical potential energy surfaces (to 6th order) or by numerical differentiation of analytic second derivatives generated using the GAMESS quantum chemical program package (to 4th order).
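
    As a point of reference for the harmonic normal coordinate analysis step (generic textbook math, not PyVCI's actual API), harmonic wavenumbers follow from diagonalizing the mass-weighted Cartesian Hessian:

        import numpy as np

        def harmonic_frequencies(hessian, masses_amu):
            """Harmonic wavenumbers (cm^-1) from a Cartesian Hessian.

            hessian:    (3N, 3N) second derivatives in hartree/bohr^2
            masses_amu: (N,) atomic masses in amu
            """
            amu_to_me = 1822.888486          # electron masses per amu
            m = np.repeat(masses_amu, 3) * amu_to_me
            mw = hessian / np.sqrt(np.outer(m, m))  # mass-weighted Hessian
            eigvals = np.linalg.eigvalsh(mw)        # omega^2 in atomic units
            hartree_to_cm1 = 219474.63
            # Negative eigenvalues (imaginary modes) are returned as negative.
            return np.sign(eigvals) * np.sqrt(np.abs(eigvals)) * hartree_to_cm1

    The VCI step then builds and diagonalizes the vibrational Hamiltonian in a basis of products of such normal-mode functions, which is where the sparse-matrix storage and screening described above become essential.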

  1. Source size and temporal coherence requirements of coded aperture type x-ray phase contrast imaging systems.

    PubMed

    Munro, Peter R T; Ignatyev, Konstantin; Speller, Robert D; Olivo, Alessandro

    2010-09-13

    There is currently much interest in developing X-ray Phase Contrast Imaging (XPCI) systems which employ laboratory sources in order to deploy the technique in real world applications. The challenge faced by nearly all XPCI techniques is that of efficiently utilising the x-ray flux emitted by an x-ray tube which is polychromatic and possesses only partial spatial coherence. Techniques have, however, been developed which overcome these limitations. Such a technique, known as coded aperture XPCI, has been under development in our laboratories in recent years for application principally in medical imaging and security screening. In this paper we derive limitations imposed upon source polychromaticity and spatial extent by the coded aperture system. We also show that although other grating XPCI techniques employ a different physical principle, they satisfy design constraints similar to those of the coded aperture XPCI. PMID:20940863

  2. Source coherence impairments in a direct detection direct sequence optical code-division multiple-access system.

    PubMed

    Fsaifes, Ihsan; Lepers, Catherine; Lourdiane, Mounia; Gallion, Philippe; Beugin, Vincent; Guignard, Philippe

    2007-02-01

    We demonstrate that direct sequence optical code-division multiple-access (DS-OCDMA) encoders and decoders using sampled fiber Bragg gratings (S-FBGs) behave as multipath interferometers. In that case, chip pulses of the prime sequence codes, generated by spreading coherent data pulses in time, can result from multiple reflections in the interferometers that can superimpose within a chip time duration. We show that the autocorrelation function has to be considered as the sum of complex amplitudes of the combined chips when the laser source coherence time is much greater than the integration time of the photodetector. To reduce the sensitivity of the DS-OCDMA system to the coherence time of the laser source, we analyze the use of sparse and nonperiodic quadratic congruence and extended quadratic congruence codes. PMID:17230236
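
    A toy numpy sketch of the effect described (illustrative numbers only): when overlapping chip contributions fall within the source coherence time, their complex amplitudes add and the autocorrelation peak becomes phase dependent; outside it, their intensities add and the peak is stable.

        import numpy as np

        rng = np.random.default_rng(1)
        amps = np.ones(5)                       # five overlapping chip contributions
        phases = rng.uniform(0, 2 * np.pi, 5)   # relative optical phases

        coherent = np.abs(np.sum(amps * np.exp(1j * phases))) ** 2  # amplitudes add
        incoherent = np.sum(amps ** 2)                              # intensities add

        print(f"coherent superposition:   {coherent:.2f} (phase dependent)")
        print(f"incoherent superposition: {incoherent:.2f} (stable)")

    Sparse, nonperiodic codes reduce the number of chip contributions that can overlap, which is why they lower the system's sensitivity to the source coherence time.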

  3. Applications of an architecture design and assessment system (ADAS)

    NASA Technical Reports Server (NTRS)

    Gray, F. Gail; Debrunner, Linda S.; White, Tennis S.

    1988-01-01

    A new Architecture Design and Assessment System (ADAS) tool package is introduced, and a range of possible applications is illustrated. ADAS was used to evaluate the performance of an advanced fault-tolerant computer architecture in a modern flight control application. Bottlenecks were identified and possible solutions suggested. The tool was also used to inject faults into the architecture and evaluate the synchronization algorithm, and improvements are suggested. Finally, ADAS was used as a front end research tool to aid in the design of reconfiguration algorithms in a distributed array architecture.

  4. Software Engineering Laboratory (SEL) Ada performance study report

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1991-01-01

    The goals, scope, and methods of the Ada Performance Study are described, along with the background of Ada development in the Flight Dynamics Division (FDD). The organization and overall purpose of each test are discussed, as are the purpose, methods, and results of each test and the analyses of those results. The approach used on the performance tests is explained, and guidelines for future Ada development efforts, based on the analysis of results from this study, are provided.

  5. Ada and software management in NASA: Symposium/forum

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The promises of Ada to improve software productivity and quality, and the claims that a transition to Ada would require significant changes in NASA's training programs and ways of doing business were investigated. The study assesses the agency's ongoing and planned Ada activities. A series of industry representatives (Computer Sciences Corporation, General Electric Aerospace, McDonnell Douglas Space Systems Company, TRW, Lockheed, and Boeing) reviewed the recommendations and assessed their impact from the Company's perspective. The potential effects on NASA programs were then discussed.

  6. Use of WIMS-E lattice code for prediction of the transuranic source term for spent fuel dose estimation

    SciTech Connect

    Schwinkendorf, K.N.

    1996-04-15

    A recent source term analysis has shown a discrepancy between ORIGEN2 transuranic isotopic production estimates and those produced with the WIMS-E lattice physics code. Excellent agreement between relevant experimental measurements and WIMS-E was shown, thus exposing an error in the cross section library used by ORIGEN2.

  7. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 2; Scattering Plots

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This second volume of Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code provides the scattering plots referenced by Volume 1. There are 648 plots. Half are for the 8750 rpm "high speed" operating condition and the other half are for the 7031 rpm "mid speed" operating condition.

  8. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

    Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed autonomously or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still a lot of applications which require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects which have come out of the analysis of Ada characteristics, together with the experience of requirements for embedded on-board computers in space applications, are examined.

  9. A report on NASA software engineering and Ada training requirements

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn B.; Svabek, L.

    1987-01-01

    NASA's software engineering and Ada skill base are assessed and information that may result in new models for software engineering, Ada training plans, and curricula are provided. A quantitative assessment which reflects the requirements for software engineering and Ada training across NASA is provided. A recommended implementation plan including a suggested curriculum with associated duration per course and suggested means of delivery is also provided. The distinction between education and training is made. Although it was directed to focus on NASA's need for the latter, the key relationships to software engineering education are also identified. A rationale and strategy for implementing a life cycle education and training program are detailed in support of improved software engineering practices and the transition to Ada.

  10. The development of a program analysis environment for Ada

    NASA Technical Reports Server (NTRS)

    Brown, David B.; Carlisle, Homer W.; Chang, Kai-Hsiung; Cross, James H.; Deason, William H.; Haga, Kevin D.; Huggins, John R.; Keleher, William R. A.; Starke, Benjamin B.; Weyrich, Orville R.

    1989-01-01

    A unit level, Ada software module testing system, called Query Utility Environment for Software Testing of Ada (QUEST/Ada), is described. The project calls for the design and development of a prototype system. QUEST/Ada design began with a definition of the overall system structure and a description of component dependencies. The project team was divided into three groups to resolve the preliminary designs of the parser/scanner, the test data generator, and the test coverage analyzer. The Phase 1 report is a working document from which the system documentation will evolve. It provides history, a guide to report sections, a literature review, the definition of the system structure and high level interfaces, descriptions of the prototype scope, the three major components, and the plan for the remainder of the project. The appendices include specifications, statistics, two papers derived from the current research, a preliminary users' manual, and the proposal and work plan for Phase 2.

  11. The Adam language: Ada extended with support for multiway activities

    NASA Technical Reports Server (NTRS)

    Charlesworth, Arthur

    1993-01-01

    The Adam language is an extension of Ada that supports multiway activities, which are cooperative activities involving two or more processes. This support is provided by three new constructs: diva procedures, meet statements, and multiway accept statements. Diva procedures are recursive generic procedures having a particular restrictive syntax that facilitates translation for parallel computers. Meet statements and multiway accept statements provide two ways to express a multiway rendezvous, which is an n-way rendezvous generalizing Ada's 2-way rendezvous. While meet statements tend to have simpler rules than multiway accept statements, the latter approach is a more straightforward extension of Ada. The only nonnull statements permitted within meet statements and multiway accept statements are calls on instantiated diva procedures. A call on an instantiated diva procedure is also permitted outside a multiway rendezvous; thus sequential Adam programs using diva procedures can be written. Adam programs are translated into Ada programs appropriate for use on parallel computers.
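
    Adam's syntax aside, the underlying idea of an n-way rendezvous, all participants blocking until everyone arrives and then performing a joint activity, can be sketched with a barrier in any concurrent language (a Python illustration of the concept, not Adam or Ada code):

        import threading

        N = 3
        barrier = threading.Barrier(N)   # the n-way meeting point

        def participant(pid):
            local = f"data-{pid}"
            barrier.wait()               # all N tasks rendezvous here...
            print(f"task {pid} in joint activity with {local}")
            barrier.wait()               # ...and leave the rendezvous together

        threads = [threading.Thread(target=participant, args=(i,)) for i in range(N)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

    Adam's diva procedures play roughly the role of the restricted joint activity between the two synchronization points, which is what keeps the construct translatable for parallel machines.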

  12. Designing with Ada for satellite simulation: A case study

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Church, V. E.; Card, D. N.; Lo, P. L.

    1986-01-01

    A FORTRAN-oriented and an Ada-oriented design for the same system are compared to learn whether an essentially different design was produced using Ada. The designs were produced by an experiment that involves the parallel development of software for a spacecraft dynamics simulator. Design differences are identified in the use of abstractions, system structure, and simulator operations. Although the designs were vastly different, this result may be influenced by some special characteristics discussed.

  13. Designing with Ada for satellite simulation: A case study

    NASA Technical Reports Server (NTRS)

    Agresti, William W.; Church, Victor E.; Card, David N.; Lo, P. L.

    1986-01-01

    A FORTRAN-oriented and an Ada-oriented design for the same system are compared to learn whether an essentially different design was produced using Ada. The designs were produced by an experiment that involves the parallel development of software for a spacecraft dynamics simulator. Design differences are identified in the use of abstractions, system structure, and simulator operations. Although the designs were significantly different, this result may be influenced by some special characteristics discussed.

  14. Considerations for the design of Ada reusable packages

    NASA Technical Reports Server (NTRS)

    Nise, Norman S.; Giffin, Chuck

    1986-01-01

    Two important considerations that precede the design of Ada reusable packages (commonality and programming standards) are discussed. First, the importance of designing packages to yield widespread commonality is expressed. A means of measuring the degree of applicability of packages both within and across application areas is presented. Design considerations that will improve commonality are also discussed. Second, considerations for the development of programming standards are set forth. These considerations will lead to standards that will improve the reusability of Ada packages.

  15. Benchmarking Ada tasking on tightly coupled multiprocessor architectures

    NASA Technical Reports Server (NTRS)

    Collard, Philippe; Goforth, Andre; Marquardt, Matthew

    1989-01-01

    The development of benchmarks and performance measures for parallel Ada tasking is reported with emphasis on the macroscopic behavior of the benchmark across a set of load parameters. The application chosen for the study was the NASREM model for telerobot control, relevant to many NASA missions. The results of the study demonstrate the potential of parallel Ada in accomplishing the task of developing a control system for a system such as the Flight Telerobotic Servicer using the NASREM framework.

  16. Compiling knowledge-based systems specified in KEE to Ada

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Feldman, Roy D.

    1991-01-01

    The first year of the PrKAda project is recounted. The primary goal was to develop a system for delivering Artificial Intelligence applications developed in the ProKappa system in a pure-Ada environment. The following areas are discussed: the ProKappa core and ProTalk programming language; the current status of the implementation; the limitations and restrictions of the current system; and the development of Ada-language message handlers in the ProKappa environment.

  17. ANEMOS: A computer code to estimate air concentrations and ground deposition rates for atmospheric nuclides emitted from multiple operating sources

    SciTech Connect

    Miller, C.W.; Sjoreen, A.L.; Begovich, C.L.; Hermann, O.W.

    1986-11-01

    This code estimates concentrations in air and ground deposition rates for Atmospheric Nuclides Emitted from Multiple Operating Sources. ANEMOS is one component of an integrated Computerized Radiological Risk Investigation System (CRRIS) developed for the US Environmental Protection Agency (EPA) for use in performing radiological assessments and in developing radiation standards. The concentrations and deposition rates calculated by ANEMOS are used in subsequent portions of the CRRIS for estimating doses and risks to man. The calculations made in ANEMOS are based on the use of a straight-line Gaussian plume atmospheric dispersion model with both dry and wet deposition parameter options. The code will accommodate ground-level or elevated point and area sources, as well as windblown sources. Adjustments may be made during the calculations for surface roughness, building wake effects, terrain height, wind speed at the height of release, the variation in plume rise as a function of downwind distance, and the in-growth and decay of daughter products in the plume as it travels downwind. ANEMOS can also accommodate multiple particle sizes and clearance classes, and it may be used to calculate the dose from a finite plume of gamma-ray-emitting radionuclides passing overhead. The output of this code is presented for 16 sectors of a circular grid. ANEMOS can calculate both the sector-average concentrations and deposition rates at a given set of downwind distances in each sector and the average of these quantities over an area within each sector bounded by two successive downwind distances. ANEMOS is designed to be used primarily for continuous, long-term radionuclide releases. This report describes the models used in the code, their computer implementation, the uncertainty associated with their use, and the use of ANEMOS in conjunction with other codes in the CRRIS. A listing of the code is included in Appendix C.
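
    The straight-line Gaussian plume model at the core of such codes has a standard closed form (shown generically; ANEMOS adds the deposition, depletion, and decay adjustments described above, and the dispersion lengths sigma_y and sigma_z would come from stability-class correlations):

        import numpy as np

        def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
            """Ground-reflected Gaussian plume concentration.

            Q: source strength (e.g., Bq/s); u: wind speed at release height
            (m/s); y: crosswind offset (m); z: receptor height (m); H:
            effective release height (m); sigma_y, sigma_z: dispersion (m).
            """
            lateral = np.exp(-y**2 / (2 * sigma_y**2))
            vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                        + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
            return Q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

        # Example: unit release at 30 m, receptor at 1.5 m on the plume axis,
        # with illustrative (not stability-class-derived) sigma values.
        print(gaussian_plume(Q=1.0, u=4.0, y=0.0, z=1.5, H=30.0,
                             sigma_y=36.0, sigma_z=18.5))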

  18. Fan Noise Prediction System Development: Source/Radiation Field Coupling and Workstation Conversion for the Acoustic Radiation Code

    NASA Technical Reports Server (NTRS)

    Meyer, H. D.

    1993-01-01

    The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models, one that recognizes waves crossing the interface in both directions, has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.

  19. RT_BUILD: An expert programmer for implementing and simulating Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Lehman, Larry L.; Houtchens, Steve; Navab, Massoud; Shah, Sunil C.

    1986-01-01

    RT_BUILD is an expert control system programmer that creates real-time Ada code from block-diagram descriptions of control systems. Since RT_BUILD embodies substantial knowledge about the implementation of real-time control systems, it can perform many, if not most, of the functions normally performed by human real-time programmers. Although much basic research has been done in automatic programming, RT_BUILD appears to be the first application of this research to an important problem in flight control system development. In particular, RT_BUILD was designed to directly increase productivity and reliability for control implementations of large complex systems.

  20. Hydraulic Capacity of an ADA Compliant Street Drain Grate

    SciTech Connect

    Lottes, Steven A.; Bojanowski, Cezary

    2015-09-01

    Resurfacing of urban roads with concurrent repairs and replacement of sections of curb and sidewalk may require pedestrian ramps that are compliant with the American Disabilities Act (ADA), and when street drains are in close proximity to the walkway, ADA compliant street grates may also be required. The Minnesota Department of Transportation ADA Operations Unit identified a foundry with an available grate that meets ADA requirements. Argonne National Laboratory’s Transportation Research and Analysis Computing Center used full scale three dimensional computational fluid dynamics to determine the performance of the ADA compliant grate and compared it to that of a standard vane grate. Analysis of a parametric set of cases was carried out, including variation in longitudinal, gutter, and cross street slopes and the water spread from the curb. The performance of the grates was characterized by the fraction of the total volume flow approaching the grate from the upstream that was captured by the grate and diverted into the catch basin. The fraction of the total flow entering over the grate from the side and the fraction of flow directly over a grate diverted into the catch basin were also quantities of interest that aid in understanding the differences in performance of the grates. The ADA compliant grate performance lagged that of the vane grate, increasingly so as upstream Reynolds number increased. The major factor leading to the performance difference between the two grates was the fraction of flow directly over the grates that is captured by the grates.

  1. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1991-01-01

    The studies of the proposed Ada 9X constructs for distribution, now referred to as AdaPT, are reported. The goals for this time period were to revise the chosen example scenario and to begin studying how the proposed constructs might be implemented. The example scenario chosen is the Submarine Combat Information Center (CIC) developed by IBM for the Navy. The specification provided by IBM was preliminary and had several deficiencies. To address these problems, some changes to the scenario specification were made. Some of the more important changes include: (1) addition of a system database management function; (2) addition of a fourth processing unit to the standard resources; (3) addition of an operator console interface function; and (4) removal of the time synchronization function. To implement the CIC scenario in AdaPT, the strategy decided upon was to use publics, partitions, and nodes. The principal purpose of implementing the CIC scenario was to demonstrate how the AdaPT constructs interact with the program structure. While considering ways that the AdaPT constructs might be translated to Ada 83, it was observed that the partition construct could reasonably be modeled as an abstract data type. Although this gives a useful method of modeling partitions, it does not address the configuration aspects of the node construct.

  2. Examining the reliability of ADAS-Cog change scores.

    PubMed

    Grochowalski, Joseph H; Liu, Ying; Siedlecki, Karen L

    2016-09-01

    The purpose of this study was to estimate and examine ways to improve the reliability of change scores on the Alzheimer's Disease Assessment Scale, Cognitive Subtest (ADAS-Cog). The sample, provided by the Alzheimer's Disease Neuroimaging Initiative, included individuals with Alzheimer's disease (AD) (n = 153) and individuals with mild cognitive impairment (MCI) (n = 352). All participants were administered the ADAS-Cog at baseline and 1 year, and change scores were calculated as the difference in scores over the 1-year period. Three types of change score reliabilities were estimated using multivariate generalizability. Two methods to increase change score reliability were evaluated: reweighting the subtests of the scale and adding more subtests. Reliability of ADAS-Cog change scores over 1 year was low for both the AD sample (ranging from .53 to .64) and the MCI sample (.39 to .61). Reweighting the change scores from the AD sample improved reliability (.68 to .76), but lengthening provided no useful improvement for either sample. The MCI change scores had low reliability, even with reweighting and adding additional subtests. The ADAS-Cog scores had low reliability for measuring change. Researchers using the ADAS-Cog should estimate and report reliability for their use of the change scores. The ADAS-Cog change scores are not recommended for assessment of meaningful clinical change. PMID:26708116
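
    For orientation, classical test theory gives a simpler expression for the reliability of a difference score D = X - Y than the multivariate generalizability analysis actually used in the study:

        \rho_{DD'} = \frac{\sigma_X^2 \rho_{XX'} + \sigma_Y^2 \rho_{YY'}
                           - 2 \sigma_X \sigma_Y \rho_{XY}}
                          {\sigma_X^2 + \sigma_Y^2 - 2 \sigma_X \sigma_Y \rho_{XY}}

    Change scores become unreliable when the two occasions correlate strongly relative to their separate reliabilities, which is consistent with the low values reported here and with the finding that reweighting the subtests helps more than lengthening the scale.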

  3. Dosimetry characterization of 32P intravascular brachytherapy source wires using Monte Carlo codes PENELOPE and GEANT4.

    PubMed

    Torres, Javier; Buades, Manuel J; Almansa, Julio F; Guerrero, Rafael; Lallena, Antonio M

    2004-02-01

    Monte Carlo calculations using the codes PENELOPE and GEANT4 have been performed to characterize the dosimetric parameters of the new 20 mm long catheter-based 32P beta source manufactured by the Guidant Corporation. The dose distribution along the transverse axis and the two-dimensional dose rate table have been calculated. Also, the dose rate at the reference point, the radial dose function, and the anisotropy function were evaluated according to the adapted TG-60 formalism for cylindrical sources. The PENELOPE and GEANT4 codes were first verified against previous results corresponding to the old 27 mm Guidant 32P beta source. The dose rate at the reference point for the unsheathed 27 mm source in water was calculated to be 0.215 +/- 0.001 cGy s(-1) mCi(-1) for PENELOPE, and 0.2312 +/- 0.0008 cGy s(-1) mCi(-1) for GEANT4. For the unsheathed 20 mm source, these values were 0.2908 +/- 0.0009 cGy s(-1) mCi(-1) and 0.311 +/- 0.001 cGy s(-1) mCi(-1), respectively. Also, a comparison with the limited data available on this new source is shown. We found non-negligible differences between the results obtained with PENELOPE and GEANT4. PMID:15000615

  4. Description of a MIL-STD-1553B Data Bus Ada Driver for the LeRC EPS Testbed

    NASA Technical Reports Server (NTRS)

    Mackin, Michael A.

    1995-01-01

    This document describes the software designed to provide communication between control computers in the NASA Lewis Research Center Electrical Power System Testbed using MIL-STD-1553B. The software drivers are coded in the Ada programming language and were developed on an MSDOS-based computer workstation. The Electrical Power System (EPS) Testbed is a reduced-scale prototype space station electrical power system. The power system manages and distributes electrical power from the sources (batteries or photovoltaic arrays) to the end-user loads. The primary electrical system operates at 120 volts DC, and the secondary system operates at 28 volts DC. The devices which direct the flow of electrical power are controlled by a network of six control computers. Data and control messages are passed between the computers using the MIL-STD-1553B network. One of the computers, the Power Management Controller (PMC), controls the primary power distribution and another, the Load Management Controller (LMC), controls the secondary power distribution. Each of these computers communicates with two other computers which act as subsidiary controllers. These subsidiary controllers are, in turn, connected to the devices which directly control the flow of electrical power.

  5. Production version of the extended NASA-Langley vortex lattice FORTRAN computer program. Volume 2: Source code

    NASA Technical Reports Server (NTRS)

    Herbert, H. E.; Lamar, J. E.

    1982-01-01

    The source code for the latest production version, MARK IV, of the NASA-Langley Vortex Lattice Computer Program is presented. All viable subcritical aerodynamic features of previous versions were retained. This version extends the previously documented program capabilities to four planforms, 400 panels, and enables the user to obtain vortex-flow aerodynamics on cambered planforms, flowfield properties off the configuration in attached flow, and planform longitudinal load distributions.

  6. Ada and knowledge-based systems: A prototype combining the best of both worlds

    NASA Technical Reports Server (NTRS)

    Brauer, David C.

    1986-01-01

    A software architecture is described which facilitates the construction of distributed expert systems using Ada and selected knowledge based systems. This architecture was utilized in the development of a Knowledge-based Maintenance Expert System (KNOMES) prototype for the Space Station Mobile Service Center (MSC). The KNOMES prototype monitors a simulated data stream from MSC sensors and built-in test equipment. It detects anomalies in the data and performs diagnosis to determine the cause. The software architecture which supports the KNOMES prototype allows for the monitoring and diagnosis tasks to be performed concurrently. The basic concept of this software architecture is named ACTOR (Ada Cognitive Task ORganization Scheme). An individual ACTOR is a modular software unit which contains both standard data processing and artificial intelligence components. A generic ACTOR module contains Ada packages for communicating with other ACTORs and accessing various data sources. The knowledge based component of an ACTOR determines the role it will play in a system. In this prototype, an ACTOR will monitor the MSC data stream.

  7. General Purpose Kernel Integration Shielding Code System-Point and Extended Gamma-Ray Sources.

    1981-06-11

    PELSHIE3 calculates dose rates from gamma-emitting sources with different source geometries and shielding configurations. Eight source geometries are provided and are called by means of geometry index numbers. Gamma-emission characteristics for 134 isotopes, attenuation coefficients for 57 elements or shielding materials and Berger build-up parameters for 17 shielding materials can be obtained from a direct access data library by specifying only the appropriate library numbers. A different option allows these data to be read from cards. For extended sources, constant source strengths as well as exponential and Bessel function source strength distributions are allowed in most cases.

  8. Investigation of Coded Source Neutron Imaging at the North Carolina State University PULSTAR Reactor

    SciTech Connect

    Xiao, Ziyu; Mishra, Kaushal; Hawari, Ayman; Bingham, Philip R; Bilheux, Hassina Z; Tobin Jr, Kenneth William

    2010-10-01

    A neutron imaging facility is located on beam-tube #5 of the 1-MWth PULSTAR reactor at the North Carolina State University. An investigation has been initiated to explore the application of coded imaging techniques at the facility. Coded imaging uses a mosaic of pinholes to encode an aperture, thus generating an encoded image of the object at the detector. To reconstruct the image recorded by the detector, corresponding decoding patterns are used. The optimized design of coded masks is critical for the performance of this technique and will depend on the characteristics of the imaging beam. In this work, Monte Carlo (MCNP) simulations were utilized to explore the needed modifications to the PULSTAR thermal neutron beam to support coded imaging techniques. In addition, an assessment of coded mask design has been performed. The simulations indicated that a 12 inch single crystal sapphire filter is suited for such an application at the PULSTAR beam in terms of maximizing flux with good neutron-to-gamma ratio. Computational simulations demonstrate the feasibility of correlation reconstruction methods on neutron transmission imaging. A gadolinium aperture with a thickness of 500 μm was used to construct the mask using a 38 × 34 URA pattern. A test experiment using such a URA design has been conducted and the point spread function of the system has been measured.

  9. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of a major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
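
    To make the byte-level n-gram approach concrete, the sketch below builds a frequency-ranked n-gram profile per author and attributes a disputed sample to the author whose profile shares the most n-grams with it. The similarity measure is a stand-in in the spirit of the paper's simplified profile intersection, and the toy corpus is invented.

```python
from collections import Counter

def byte_ngram_profile(data: bytes, n: int = 3, top: int = 2000) -> set:
    """The `top` most frequent byte-level n-grams of a program text."""
    grams = Counter(data[i:i + n] for i in range(len(data) - n + 1))
    return {g for g, _ in grams.most_common(top)}

def similarity(p1: set, p2: set) -> float:
    """Simplified profile intersection, normalized to [0, 1]."""
    return len(p1 & p2) / min(len(p1), len(p2))

# Toy corpus: two 'authors' with distinct layout and naming habits.
training = {
    "alice": b"for (int i = 0; i < n; ++i) { total += data[i]; }",
    "bob":   b"int idx;\nfor(idx=0;idx<n;idx++)\n{\ntotal=total+data[idx];\n}",
}
disputed = b"for (int j = 0; j < m; ++j) { sum += vals[j]; }"

profiles = {a: byte_ngram_profile(src) for a, src in training.items()}
query = byte_ngram_profile(disputed)
print(max(profiles, key=lambda a: similarity(profiles[a], query)))  # alice
```

    Because profiles are built from raw bytes, the same pipeline applies unchanged to C++, Java, or any other language, which is the language-independence the authors emphasize.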

  10. Transparent ICD and DRG Coding Using Information Technology: Linking and Associating Information Sources with the eXtensible Markup Language

    PubMed Central

    Hoelzer, Simon; Schweiger, Ralf K.; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or “semantically associated” parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813
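
    To make the document-oriented approach concrete, the sketch below shows one way a hierarchical XML rendering of an ICD-10 fragment might look and how it can be traversed with standard tools; the tag names and attributes are hypothetical illustrations, not the authors' schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical ICD-10 fragment; tag and attribute names are illustrative.
icd_xml = """
<chapter code="IX" title="Diseases of the circulatory system">
  <block code="I20-I25" title="Ischaemic heart diseases">
    <category code="I21" title="Acute myocardial infarction">
      <subcategory code="I21.0"
                   title="Acute transmural MI of anterior wall"/>
    </category>
  </block>
</chapter>
"""

root = ET.fromstring(icd_xml)
path = []

def find(node, code):
    """Depth-first search that records the full hierarchical context."""
    path.append((node.get("code"), node.get("title")))
    if node.get("code") == code:
        return True
    for child in node:
        if find(child, code):
            return True
    path.pop()
    return False

find(root, "I21.0")
for code, title in path:
    print(code, "-", title)   # chapter -> block -> category -> subcategory
```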

  11. Transparent ICD and DRG coding using information technology: linking and associating information sources with the eXtensible Markup Language.

    PubMed

    Hoelzer, Simon; Schweiger, Ralf K; Dudeck, Joachim

    2003-01-01

    With the introduction of ICD-10 as the standard for diagnostics, it becomes necessary to develop an electronic representation of its complete content, inherent semantics, and coding rules. The authors' design relates to the current efforts by the CEN/TC 251 to establish a European standard for hierarchical classification systems in health care. The authors have developed an electronic representation of ICD-10 with the eXtensible Markup Language (XML) that facilitates integration into current information systems and coding software, taking different languages and versions into account. In this context, XML provides a complete processing framework of related technologies and standard tools that helps develop interoperable applications. XML provides semantic markup. It allows domain-specific definition of tags and hierarchical document structure. The idea of linking and thus combining information from different sources is a valuable feature of XML. In addition, XML topic maps are used to describe relationships between different sources, or "semantically associated" parts of these sources. The issue of achieving a standardized medical vocabulary becomes more and more important with the stepwise implementation of diagnostically related groups, for example. The aim of the authors' work is to provide a transparent and open infrastructure that can be used to support clinical coding and to develop further software applications. The authors are assuming that a comprehensive representation of the content, structure, inherent semantics, and layout of medical classification systems can be achieved through a document-oriented approach. PMID:12807813

  12. Active Fault Near-Source Zones Within and Bordering the State of California for the 1997 Uniform Building Code

    USGS Publications Warehouse

    Petersen, M.D.; Toppozada, Tousson R.; Cao, T.; Cramer, C.H.; Reichle, M.S.; Bryant, W.A.

    2000-01-01

    The fault sources in the Project 97 probabilistic seismic hazard maps for the state of California were used to construct maps for defining near-source seismic coefficients, Na and Nv, incorporated in the 1997 Uniform Building Code (ICBO 1997). The near-source factors are based on the distance from a known active fault that is classified as either Type A or Type B. To determine the near-source factor, four pieces of geologic information are required: (1) recognizing a fault and determining whether or not the fault has been active during the Holocene, (2) identifying the location of the fault at or beneath the ground surface, (3) estimating the slip rate of the fault, and (4) estimating the maximum earthquake magnitude for each fault segment. This paper describes the information used to produce the fault classifications and distances.

  13. The Impact of Causality on Information-Theoretic Source and Channel Coding Problems

    ERIC Educational Resources Information Center

    Palaiyanur, Harikrishna R.

    2011-01-01

    This thesis studies several problems in information theory where the notion of causality comes into play. Causality in information theory refers to the timing of when information is available to parties in a coding system. The first part of the thesis studies the error exponent (or reliability function) for several communication problems over…

  14. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1997-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
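
    A minimal numeric sketch of the first variant described above (an adjacent-delta calculation performed on a cross-delta data set), together with the inverse post-decoding step; ordering and edge handling here are illustrative rather than taken from the patent.

```python
import numpy as np

def precode(band1, band2):
    """Cross-delta between two correlated data sets, then an
    adjacent-delta along the result (the double-difference)."""
    cross = band2 - band1                 # cross-delta
    return np.diff(cross, prepend=0)      # adjacent-delta, invertible

def postdecode(band1, dd):
    """Inverse post-decoding: rebuild the second data set."""
    return band1 + np.cumsum(dd)

b1 = np.array([10, 12, 15, 19, 24])
b2 = np.array([11, 14, 18, 23, 29])       # correlated with b1
dd = precode(b1, b2)
assert np.array_equal(postdecode(b1, dd), b2)
print(dd)  # small residuals: friendlier input for an entropy coder
```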

  15. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.

  16. An automated quality assessor for Ada object-oriented designs

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.

    1988-01-01

    A tool for evaluating object-oriented designs (OODs) for Ada software is described. The tool assumes a design expressed as a hierarchy of object diagrams. A design of this type identifies the objects of a system, an interface to each object, and the usage relationships between objects. When such a design is implemented in Ada, objects become packages, interfaces become package specifications, and usage relationships become Ada 'with' clauses and package references. An automated quality assessor has been developed that is based on flagging undesirable design constructs. For convenience, distinctions are made among three levels of severity: questionable, undesirable, and hazardous. A questionable construct is one that may well be appropriate. An undesirable construct is one that should be changed because it is potentially harmful to the reliability, maintainability, or reusability of the software. A hazardous construct is one that is undesirable and that introduces a high level of risk.
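
    The severity-ranked flagging idea can be illustrated with a toy rule table over Ada source text; the rules below are invented for illustration and are far cruder than the tool's analysis of object-diagram structure.

```python
import re

# Invented severity-ranked rules: (severity, description, pattern).
RULES = [
    ("hazardous",    "use of Unchecked_Conversion",
     re.compile(r"\bUnchecked_Conversion\b", re.I)),
    ("undesirable",  "'use' clause widens namespace coupling",
     re.compile(r"^\s*use\s+\w", re.I | re.M)),
    ("questionable", "'with' of many units may signal high coupling",
     re.compile(r"^\s*with\s+\w+(\s*,\s*\w+){4,}", re.I | re.M)),
]

def assess(ada_source: str):
    """Return (severity, description) for every rule the source trips."""
    return [(sev, desc) for sev, desc, rx in RULES if rx.search(ada_source)]

sample = "with A, B, C, D, E, F;\nuse A;\npackage P is end P;"
for sev, desc in assess(sample):
    print(f"{sev:12s} {desc}")
```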

  17. Using Ada for a distributed, fault tolerant system

    NASA Technical Reports Server (NTRS)

    Dewolf, J. B.; Sodano, N. M.; Whittredge, R. S.

    1984-01-01

    It is pointed out that advanced avionics applications increasingly require underlying machine architectures which are damage and fault tolerant, and which provide access to distributed sensors, effectors and high-throughput computational resources. The Advanced Information Processing System (AIPS), sponsored by NASA, is to provide an architecture which can meet the considered requirements. Ada was selected for implementing the AIPS system software. Advantages of Ada are related to its provisions for real-time programming, error detection, modularity and separate compilation, and standardization and portability. Chief drawbacks of this language are currently limited availability and maturity of language implementations, and limited experience in applying the language to real-time applications. The present investigation is concerned with current plans for employing Ada in the design of the software for AIPS. Attention is given to an overview of AIPS, AIPS software services, and representative design issues in each of four major software categories.

  18. Toward real-time performance benchmarks for Ada

    NASA Technical Reports Server (NTRS)

    Clapp, Russell M.; Duchesneau, Louis; Volz, Richard A.; Mudge, Trevor N.; Schultze, Timothy

    1986-01-01

    The issue of real-time performance measurement for the Ada programming language through the use of benchmarks is addressed. First, the Ada notion of time is examined and a set of basic measurement techniques is developed. Then a set of Ada language features believed to be important for real-time performance is presented and specific measurement methods are discussed. In addition, other important time-related features which are not explicitly part of the language but are part of the run-time system are also identified and measurement techniques for them developed. The measurement techniques are applied to the language and run-time system features, and the results are presented.
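
    The classic measurement technique for language-feature benchmarks of this kind is the dual-loop method: time a loop containing the feature, time an empty control loop, and report the difference per iteration. A sketch follows; benchmarks of this era applied the idea in Ada against the CALENDAR clock, whereas this sketch uses Python's timer purely to show the structure.

```python
import time

def dual_loop(feature, iterations=1_000_000):
    """Per-call cost of `feature`, with loop overhead subtracted out."""
    t0 = time.perf_counter()
    for _ in range(iterations):
        feature()                    # timed loop: overhead + feature
    t1 = time.perf_counter()
    for _ in range(iterations):
        pass                         # control loop: overhead only
    t2 = time.perf_counter()
    return ((t1 - t0) - (t2 - t1)) / iterations

print(f"{dual_loop(lambda: None) * 1e9:.1f} ns per call")
```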

  19. ADA plaintiff must show AIDS limits major life activities.

    PubMed

    1998-05-15

    In a rare case, a Federal court ruled that AIDS does not automatically qualify a plaintiff for legal protection under the Americans with Disabilities Act (ADA). [Name removed], an Illinois Wal-Mart stock clerk, was fired weeks after telling the store's general manager of his HIV status. [Name removed] alleges that the firing was due solely to his disease. Wal-Mart contends that [name removed] was fired for sexually harassing a co-worker, and says that since [name removed] was asymptomatic and asked for no accommodations, he does not qualify for ADA protection. Magistrate Morton Denlow agreed, saying that [name removed] raised no genuine issues about whether the ADA should protect him. A trial is scheduled for May. PMID:11365337

  20. Impact of Ada in the Flight Dynamics Division: Excitement and frustration

    NASA Technical Reports Server (NTRS)

    Bailey, John; Waligora, Sharon; Stark, Mike

    1993-01-01

    In 1985, NASA Goddard's Flight Dynamics Division (FDD) began investigating how the Ada language might apply to their software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, they would fully transition their entire development organization from FORTRAN to Ada within 10 years. However, nearly 9 years later, the FDD still produces 80 percent of its software in FORTRAN, despite positive results on Ada projects. This paper reports preliminary results of an ongoing study, commissioned by the FDD, to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. This paper is the first public report on the Ada assessment study, which will conclude with a comprehensive final report in mid 1994.

  1. Bacteria-induced natural product formation in the fungus Aspergillus nidulans requires Saga/Ada-mediated histone acetylation.

    PubMed

    Nützmann, Hans-Wilhelm; Reyes-Dominguez, Yazmid; Scherlach, Kirstin; Schroeckh, Volker; Horn, Fabian; Gacek, Agnieszka; Schümann, Julia; Hertweck, Christian; Strauss, Joseph; Brakhage, Axel A

    2011-08-23

    Sequence analyses of fungal genomes have revealed that the potential of fungi to produce secondary metabolites is greatly underestimated. In fact, most gene clusters coding for the biosynthesis of antibiotics, toxins, or pigments are silent under standard laboratory conditions. Hence, it is one of the major challenges in microbiology to uncover the mechanisms required for pathway activation. Recently, we discovered that intimate physical interaction of the important model fungus Aspergillus nidulans with the soil-dwelling bacterium Streptomyces rapamycinicus specifically activated silent fungal secondary metabolism genes, resulting in the production of the archetypal polyketide orsellinic acid and its derivatives. Here, we report that the streptomycete triggers modification of fungal histones. Deletion analysis of 36 of 40 acetyltransferases, including histone acetyltransferases (HATs) of A. nidulans, demonstrated that the Saga/Ada complex containing the HAT GcnE and the AdaB protein is required for induction of the orsellinic acid gene cluster by the bacterium. We also showed that Saga/Ada plays a major role for specific induction of other biosynthesis gene clusters, such as sterigmatocystin, terrequinone, and penicillin. Chromatin immunoprecipitation showed that the Saga/Ada-dependent increase of histone 3 acetylation at lysine 9 and 14 occurs during interaction of fungus and bacterium. Furthermore, the production of secondary metabolites in A. nidulans is accompanied by a global increase in H3K14 acetylation. Increased H3K9 acetylation, however, was only found within gene clusters. This report provides previously undescribed evidence of Saga/Ada dependent histone acetylation triggered by prokaryotes. PMID:21825172

  2. A modernized PDL approach for Ada software development

    NASA Technical Reports Server (NTRS)

    Usavage, Paul, Jr.

    1988-01-01

    The desire to integrate newly available, graphically-oriented Computer-Aided Software Engineering (CASE) tools with existing software design approaches is changing the way Program Design Language (PDL) or Process Description Language is used for large system development. In the approach documented here, Software Engineers use graphics tools to model the problem and to describe high level software design in diagrams. An Ada-based PDL is used to document low level design. Some results are provided along with an analysis for each of three smaller General Electric (GE) Ada development projects that utilized variations on this approach. Finally some considerations are identified for larger scale implementation.

  3. Formal methods in the design of Ada 1995

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1995-01-01

    Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a

  4. V Olimpíada Brasileira de Astronomia

    NASA Astrophysics Data System (ADS)

    Villas da Rocha, J. F.; Canalle, J. B. G.; Wuensche, C. A.; de Medeiros, J. R.; Silva, A. V. R.; Lavouras, D. F.; Dottori, H. A.; Maia, M. A. G.; Vieira Martins, R.; Poppe, P. C. R.

    2003-08-01

    In this work we present the results of the V Olimpíada Brasileira de Astronomia (5th Brazilian Astronomy Olympiad), held on May 11, 2002 at all previously registered primary and secondary schools. 60,338 students from 1,469 schools across every Brazilian state took part in the event. A team of 5 students was selected to represent Brazil at the VII International Astronomy Olympiad, held in Russia in 2002, where two of our students won the bronze medal.

  5. Supreme Court to tackle ADA/social security conflict.

    PubMed

    1998-10-30

    The Supreme Court is scheduled to hear arguments involving [name removed], who claims that he lost his job because he has AIDS. Originally, the Third U.S. Circuit Court of Appeals ruled that he was barred from suing because he had accepted disability payments while waiting for the Equal Employment Opportunity Commission to issue a determination regarding his claim. The Circuit Court's ruling was not unanimous. Also, Federal courts have increasingly ruled that receipt of benefits is not an automatic bar to an ADA claim. The different definitions of disability by Social Security and by the ADA need clarification. PMID:11366014

  6. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    NASA Astrophysics Data System (ADS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization that would be specific to any particular application field, its use could also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements.
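
    The core operation of any line-by-line code of this kind is summing broadened line profiles from a transition database onto a fine spectral grid. A toy sketch with Lorentzian profiles and an invented two-line list follows; real codes use Voigt profiles and HITRAN-style pressure- and temperature-dependent parameters.

```python
import numpy as np

# Toy line-by-line synthesis: accumulate Lorentzian profiles from a tiny,
# invented line list onto a high-resolution wavenumber grid.
nu = np.linspace(2000.0, 2010.0, 100_001)           # grid, cm^-1
lines = [                                           # (center, strength, HWHM)
    (2002.5, 1.0, 0.07),
    (2006.1, 0.4, 0.05),
]

k = np.zeros_like(nu)
for nu0, S, gamma in lines:
    # Area-normalized Lorentzian scaled by the line strength S.
    k += S * (gamma / np.pi) / ((nu - nu0) ** 2 + gamma ** 2)

print(k.max())   # peak absorption coefficient of the synthetic spectrum
```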

  7. The development of an Ada programming support environment database: SEAD (Software Engineering and Ada Database), user's manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.

  8. An ion-source model for first-order beam dynamic codes

    SciTech Connect

    Fink, C.L.; Curry, B.P.

    1993-08-01

    A model of a plasma ion source has been developed that approximates the system of Poisson and Boltzmann-Vlasov equations normally used to describe ion sources by an external electric field, a collective electric field due to the charge column, and the starting boundary conditions. The equations of this model can be used directly in the Lorentz force equation to calculate trajectories without iteration.
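
    The practical payoff of the model is that trajectories can be integrated directly from the Lorentz force equation without iterating a Poisson solve. Below is a minimal sketch under assumed field forms; both the uniform extraction field and the linear radial space-charge field are illustrative stand-ins, not the paper's model.

```python
import numpy as np

Q_M = 9.58e7                           # proton charge-to-mass ratio, C/kg
E_EXT = np.array([0.0, 0.0, 1.0e5])    # assumed uniform extraction field, V/m

def space_charge_field(pos, k=1.0e4):
    """Toy linear radial defocusing field of a uniform charge column, V/m."""
    return k * np.array([pos[0], pos[1], 0.0])

pos = np.array([1.0e-3, 0.0, 0.0])     # start 1 mm off axis
vel = np.zeros(3)
dt = 1.0e-10                           # s
for _ in range(10_000):                # explicit Euler integration
    acc = Q_M * (E_EXT + space_charge_field(pos))
    vel += acc * dt
    pos += vel * dt

print(pos)   # accelerates along z while space charge pushes it outward
```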

  9. BIOTC: An open-source CFD code for simulating biomass fast pyrolysis

    NASA Astrophysics Data System (ADS)

    Xiong, Qingang; Aramideh, Soroush; Passalacqua, Alberto; Kong, Song-Charng

    2014-06-01

    The BIOTC code is a computer program that combines a multi-fluid model for multiphase hydrodynamics and global chemical kinetics for chemical reactions to simulate fast pyrolysis of biomass at reactor scale. The object-oriented characteristic of BIOTC makes it easy for researchers to insert their own sub-models, while the user-friendly interface provides users a friendly environment as in commercial software. A laboratory-scale bubbling fluidized bed reactor for biomass fast pyrolysis was simulated using BIOTC to demonstrate its capability.

  10. Induction of resistance to alkylating agents in E. coli: the ada+ gene product serves both as a regulatory protein and as an enzyme for repair of mutagenic damage.

    PubMed Central

    Teo, I; Sedgwick, B; Demple, B; Li, B; Lindahl, T

    1984-01-01

    The expression of several inducible enzymes for repair of alkylated DNA in Escherichia coli is controlled by the ada+ gene. This regulatory gene has been cloned into a multicopy plasmid and shown to code for a 37-kd protein. Antibodies raised against homogeneous O6-methylguanine-DNA methyltransferase (the main repair activity for mutagenic damage in alkylated DNA) were found to cross-react with this 37-kd protein. Cell extracts from several independently derived ada mutants contain variable amounts of an altered 37-kd protein after an inducing alkylation treatment. In addition, an 18-kd protein identical with the previously isolated O6-methyl-guanine-DNA methyltransferase has been identified as a product of the ada+ gene. The smaller polypeptide is derived from the 37-kd protein by proteolytic processing. Images Fig. 1. Fig. 2. Fig. 4. Fig. 5. Fig. 6. Fig. 7. Fig. 8. PMID:6092060

  11. The Impact of Business Size on Employer ADA Response

    ERIC Educational Resources Information Center

    Bruyere, Susanne M.; Erickson, William A.; VanLooy, Sara A.

    2006-01-01

    More than 10 years have passed since the employment provisions of the Americans with Disabilities Act of 1990 (ADA) came into effect for employers of 15 or more employees. Americans with disabilities continue to be more unemployed and underemployed than their nondisabled peers. Small businesses, with fewer than 500 employees, continue to be the…

  12. Predicting protein structural class with AdaBoost Learner.

    PubMed

    Niu, Bing; Cai, Yu-Dong; Lu, Wen-Cong; Li, Guo-Zheng; Chou, Kuo-Chen

    2006-01-01

    The structural class is an important feature in characterizing the overall topological folding type of a protein or the domains therein. Prediction of protein structural classification has attracted the attention and efforts from many investigators. In this paper a novel predictor, the AdaBoost Learner, was introduced to deal with this problem. The essence of the AdaBoost Learner is that a combination of many 'weak' learning algorithms, each performing just slightly better than a random guessing algorithm, will generate a 'strong' learning algorithm. Demonstration through jackknife cross-validation on two working datasets constructed by previous investigators indicated that AdaBoost outperformed other predictors such as SVM (support vector machine), a powerful algorithm widely used in the biological literature. It has not escaped our notice that AdaBoost may hold a high potential for improving the quality in predicting the other protein features as well, such as subcellular location and receptor type, among many others. Or at the very least, it will play a complementary role to many of the existing algorithms in this regard. PMID:16800803
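
    A sketch of the evaluation protocol using scikit-learn: AdaBoost over decision stumps scored by jackknife (leave-one-out) cross-validation. The dataset is a stand-in (iris), not the protein structural-class sets used in the paper.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Jackknife (leave-one-out) scoring of AdaBoost on a stand-in dataset.
X, y = load_iris(return_X_y=True)
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"jackknife accuracy: {scores.mean():.3f}")
```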

  13. Software Engineering Laboratory Ada performance study: Results and implications

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1992-01-01

    The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.

  14. 49 CFR 37.125 - ADA paratransit eligibility: Process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... administrative appeal process through which individuals who are denied eligibility can obtain review of the... the appeal is issued. (h) The entity may establish an administrative process to suspend, for a...

  15. 49 CFR 37.123 - ADA paratransit eligibility: Standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... undue financial burden waiver under §§ 37.151-37.155 of this part. (e) The following individuals are ADA... with a personal care attendant, the entity shall provide service to one other individual in addition to... as a person accompanying the eligible individual, and not as a personal care attendant, unless...

  16. Section 504/ADA: Guidelines for Educators in Kansas. Revised.

    ERIC Educational Resources Information Center

    Miller, Joan; Bieker, Rod; Copenhaver, John

    This document presents the Kansas State Department of Education's guidelines to Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA). The guidelines specifically address Subparts A, B, C, and D of the regulations for Section 504 which deal with general provisions, employment practices, accessibility and education. An…

  17. Ada H. H. Lewis Middle School Curriculum Guide.

    ERIC Educational Resources Information Center

    Philadelphia School District, PA. Office of Curriculum and Instruction.

    This curriculum guide describes the instructional program at the Ada H. H. Lewis Middle School in Philadelphia, Pennsylvania. In brief, the goals of the program are to provide the schools' fifth-grade through eighth-grade students with educational opportunities based on an eclectic team-teaching approach. Four separate "houses" accommodate…

  18. Ada as an implementation language for knowledge based systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1990-01-01

    Debates about the selection of programming languages often produce cultural collisions that are not easily resolved. This is especially true in the case of Ada and knowledge based programming. The construction of programming tools provides a desirable alternative for resolving the conflict.

  19. Supreme Court to hear ADA suit involving arbitration clause.

    PubMed

    1998-03-20

    The U.S. Supreme Court agreed to review a third case under the Americans with Disabilities Act (ADA) this year. The Supreme Court previously agreed to hear [name removed] v. [Name removed], involving a dentist who refused to treat an HIV-positive patient in his office. The second case is Pennsylvania Department of Corrections v. [Name removed], in which the State asserts that the ADA does not apply to prisons. The third involves whether an arbitration clause in a labor union's collective bargaining agreement prevents a court from hearing a union member's discrimination claim. [Name removed] longshoreman [name removed] alleges that the [name removed] and several employers violated the ADA when they refused to accept him for employment referral. [Name removed] previously settled a workers' compensation disability claim with [name removed] and Terminal Co., his employer. Three years later he applied for work at the International Longshoreman's Association hiring hall and was referred to four different employers. The employers discovered he had received a worker's compensation settlement and would no longer accept [name removed] for employment referral. The case is important because union members can continue to file ADA charges with the EEOC and the outcomes will vary depending on the circuit where the union member happens to file the claim. PMID:11365192

  20. [Section] 504/ADA Student Issues: The Latest and the Greatest.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    Recent case law within and outside the school context has revised or refined various concepts concerning eligibility and other K-12 issues under Section 504 and the Americans with Disabilities Act (ADA). Ten case lessons are described in this paper, seven of which are: (1) The frame of reference for determining "substantially limits" in the…

  1. Learn about the ADA in Your Local Library.

    ERIC Educational Resources Information Center

    Department of Justice, Washington, DC. Civil Rights Div.

    This bibliography lists 90 documents contained within the Americans with Disabilities Act (ADA) Information File, which has been sent to 15,000 libraries across the country. The listing is organized into the following categories: laws and regulations (eight documents), technical assistance manuals and highlights (nine documents), question and…

  2. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates, as reported by a cache simulation tool and confirmed by hardware counters, only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.
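
    The locality rule of thumb quoted above is easy to demonstrate even from a high-level language: traversing a C-ordered matrix along rows touches memory at unit stride, while traversing down columns touches one element per cache line. A small timing sketch follows; the absolute numbers are machine-dependent, which is precisely the paper's point.

```python
import time
import numpy as np

n = 2000
a = np.zeros((n, n))    # C (row-major) layout: rows are contiguous

t0 = time.perf_counter()
for i in range(n):
    a[i, :] += 1.0      # unit stride: walks contiguous memory
t1 = time.perf_counter()
for j in range(n):
    a[:, j] += 1.0      # stride n: touches one element per cache line
t2 = time.perf_counter()

print(f"row-wise {t1 - t0:.3f} s   column-wise {t2 - t1:.3f} s")
```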

  3. On the Efficacy of Source Code Optimizations for Cache-Based Systems

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Saphir, William C.

    1998-01-01

    Obtaining high performance without machine-specific tuning is an important goal of scientific application programmers. Since most scientific processing is done on commodity microprocessors with hierarchical memory systems, this goal of "portable performance" can be achieved if a common set of optimization principles is effective for all such systems. It is widely believed, or at least hoped, that portable performance can be realized. The rule of thumb for optimization on hierarchical memory systems is to maximize temporal and spatial locality of memory references by reusing data and minimizing memory access stride. We investigate the effects of a number of optimizations on the performance of three related kernels taken from a computational fluid dynamics application. Timing the kernels on a range of processors, we observe an inconsistent and often counterintuitive impact of the optimizations on performance. In particular, code variations that have a positive impact on one architecture can have a negative impact on another, and variations expected to be unimportant can produce large effects. Moreover, we find that cache miss rates - as reported by a cache simulation tool, and confirmed by hardware counters - only partially explain the results. By contrast, the compiler-generated assembly code provides more insight by revealing the importance of processor-specific instructions and of compiler maturity, both of which strongly, and sometimes unexpectedly, influence performance. We conclude that it is difficult to obtain performance portability on modern cache-based computers, and comment on the implications of this result.

  4. NONCODE 2016: an informative and valuable data source of long non-coding RNAs

    PubMed Central

    Zhao, Yi; Li, Hui; Fang, Shuangsang; Kang, Yue; Wu, Wei; Hao, Yajing; Li, Ziyang; Bu, Dechao; Sun, Ninghui; Zhang, Michael Q.; Chen, Runsheng

    2016-01-01

    NONCODE (http://www.bioinfo.org/noncode/) is an interactive database that aims to present the most complete collection and annotation of non-coding RNAs, especially long non-coding RNAs (lncRNAs). The recently reduced cost of RNA sequencing has produced an explosion of newly identified data. Revolutionary third-generation sequencing methods have also contributed to more accurate annotations. Accumulative experimental data also provides more comprehensive knowledge of lncRNA functions. In this update, NONCODE has added six new species, bringing the total to 16 species altogether. The lncRNAs in NONCODE have increased from 210,831 to 527,336. For human and mouse, the lncRNA numbers are 167,150 and 130,558, respectively. NONCODE 2016 has also introduced three important new features: (i) conservation annotation; (ii) the relationships between lncRNAs and diseases; and (iii) an interface to choose high-quality datasets through predicted scores, literature support and long-read sequencing method support. NONCODE is also accessible through http://www.noncode.org/. PMID:26586799

  5. Knowledge and potential impact of the WHO Global code of practice on the international recruitment of health personnel: Does it matter for source and destination country stakeholders?

    PubMed

    Bourgeault, Ivy Lynn; Labonté, Ronald; Packer, Corinne; Runnels, Vivien; Tomblin Murphy, Gail

    2016-01-01

    The WHO Global Code of Practice on the International Recruitment of Health Personnel was implemented in May 2010. The present commentary offers some insights into what is known about the Code five years on, as well as its potential impact, drawing from interviews with health care and policy stakeholders from a number of 'source' and 'destination' countries. PMID:27381004

  6. Photoplus: auxiliary information for printed images based on distributed source coding

    NASA Astrophysics Data System (ADS)

    Samadani, Ramin; Mukherjee, Debargha

    2008-01-01

    A printed photograph is difficult to reuse because the digital information that generated the print may no longer be available. This paper describes a mechanism for approximating the original digital image by combining a scan of the printed photograph with small amounts of digital auxiliary information kept together with the print. The auxiliary information consists of a small amount of digital data to enable accurate registration and color-reproduction, followed by a larger amount of digital data to recover residual errors and lost frequencies by distributed Wyner-Ziv coding techniques. Approximating the original digital image enables many uses, including making good quality reprints from the original print, even when they are faded many years later. In essence, the print itself becomes the currency for archiving and repurposing digital images, without requiring computer infrastructure.

  7. ACT-ARA: Code System for the Calculation of Changes in Radiological Source Terms with Time

    1988-02-01

    The program calculates the source term activity as a function of time for parent isotopes as well as daughters. Also, at each time, the "probable release" is produced. Finally, the program determines the time integrated probable release for each isotope over the time period of interest.
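
    The parent/daughter bookkeeping such a program performs is governed by the Bateman equations. A minimal sketch for a single parent-daughter pair follows, ignoring branching ratios and any release-fraction weighting; the isotope pair is chosen only for illustration.

```python
import numpy as np

def parent_daughter_activity(A0, lam_p, lam_d, t):
    """Bateman activities for parent -> daughter, daughter absent at t=0.
    A0: initial parent activity; lam_p, lam_d: decay constants, 1/s."""
    Ap = A0 * np.exp(-lam_p * t)
    Ad = A0 * lam_d / (lam_d - lam_p) * (np.exp(-lam_p * t)
                                         - np.exp(-lam_d * t))
    return Ap, Ad

# Example pair: Mo-99 (T1/2 = 66 h) -> Tc-99m (T1/2 = 6 h), no branching.
lam_p = np.log(2) / (66 * 3600.0)
lam_d = np.log(2) / (6 * 3600.0)
t = np.linspace(0.0, 72 * 3600.0, 5)     # sample times over three days
print(parent_daughter_activity(1.0, lam_p, lam_d, t))
```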

  8. Optimization of a photoneutron source based on 10 MeV electron beam using Geant4 Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Askri, Boubaker

    2015-10-01

    Geant4 Monte Carlo code has been used to conceive and optimize a simple and compact neutron source based on a 10 MeV electron beam impinging on a tungsten target adjoined to a beryllium target. For this purpose, a precise photonuclear reaction cross-section model issued from the International Atomic Energy Agency (IAEA) database was linked to Geant4 to accurately simulate the interaction of low energy bremsstrahlung photons with beryllium material. A benchmark test showed that a good agreement was achieved when comparing the emitted neutron flux spectra predicted by Geant4 and Fluka codes for a beryllium cylinder bombarded with a 5 MeV photon beam. The source optimization was achieved through a two stage Monte Carlo simulation. In the first stage, the distributions of the seven phase space coordinates of the bremsstrahlung photons at the boundaries of the tungsten target were determined. In the second stage events corresponding to photons emitted according to these distributions were tracked. A neutron yield of 4.8 × 1010 neutrons/mA/s was obtained at 20 cm from the beryllium target. A thermal neutron yield of 1.5 × 109 neutrons/mA/s was obtained after introducing a spherical shell of polyethylene as a neutron moderator.

  9. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked against the Advanced Neutron Source dynamic model (ANSDM) and the PRSDYN model. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations are included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  10. Measuring Ada as a software development technology in the Software Engineering Laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.

    1985-01-01

    An experiment is in progress to measure the effectiveness of Ada in the National Aeronautics and Space Administration/Goddard Space Flight Center flight dynamics software development environment. The experiment features the parallel development of software in FORTRAN and Ada. The experiment organization, objectives, and status are discussed. Experiences with an Ada training program and data from the development of a 5700-line Ada training exercise are reported.

  11. ELAPSE - NASA AMES LISP AND ADA BENCHMARK SUITE: EFFICIENCY OF LISP AND ADA PROCESSING - A SYSTEM EVALUATION

    NASA Technical Reports Server (NTRS)

    Davis, G. J.

    1994-01-01

    One area of research of the Information Sciences Division at NASA Ames Research Center is devoted to the analysis and enhancement of processors and advanced computer architectures, specifically in support of automation and robotic systems. To compare systems' abilities to efficiently process Lisp and Ada, scientists at Ames Research Center have developed a suite of non-parallel benchmarks called ELAPSE. The benchmark suite was designed to test a single computer's efficiency as well as to compare alternate machines on the Lisp and/or Ada languages. ELAPSE tests the efficiency with which a machine can execute the various routines in each environment. The sample routines are based on numeric and symbolic manipulations and include two-dimensional fast Fourier transformations, Cholesky decomposition and substitution, Gaussian elimination, high-level data processing, and symbol-list references. Also included is a routine based on a Bayesian classification program sorting data into optimized groups. The ELAPSE benchmarks are available for any computer with a validated Ada compiler and/or Common Lisp system. Of the 18 routines that comprise ELAPSE, 14 were developed or translated at Ames and are provided within this package; the others are readily available in the literature. The benchmark that requires the most memory is CHOLESKY.ADA. Under VAX/VMS, CHOLESKY.ADA requires 760K of main memory. ELAPSE is available on either two 5.25 inch 360K MS-DOS format diskettes (standard distribution) or a 9-track 1600 BPI ASCII CARD IMAGE format magnetic tape. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The ELAPSE benchmarks were written in 1990. VAX and VMS are trademarks of Digital Equipment Corporation. MS-DOS is a registered trademark of Microsoft Corporation.

  12. Development and Demonstration of an Ada Test Generation System

    NASA Technical Reports Server (NTRS)

    1996-01-01

    In this project we have built a prototype system that performs Feasible Path Analysis on Ada programs: given a description of a set of control flow paths through a procedure, and a predicate at a program point, feasible path analysis determines if there is input data which causes execution to flow down some path in the collection reaching the point so that the predicate is true. Feasible path analysis can be applied to program testing, program slicing, array bounds checking, and other forms of anomaly checking. FPA is central to most applications of program analysis. But, because this problem is formally unsolvable, syntactic-based approximations are used in its place. For example, in dead-code analysis the problem is to determine if there are any input values which cause execution to reach a specified program point. Instead an approximation to this problem is computed: determine whether there is a control flow path from the start of the program to the point. This syntactic approximation is efficiently computable and conservative: if there is no such path the program point is clearly unreachable, but if there is such a path, the analysis is inconclusive, and the code is assumed to be live. Such conservative analysis too often yields unsatisfactory results because the approximation is too weak. As another example, consider data flow analysis. A du-pair is a pair of program points such that the first point is a definition of a variable and the second point a use and for which there exists a definition-free path from the definition to the use. The sharper, semantic definition of a du-pair requires that there be a feasible definition-free path from the definition to the use. A compiler using du-pairs for detecting dead variables may miss optimizations by not considering feasibility. Similarly, a program analyzer computing program slices to merge parallel versions may report conflicts where none exist. In the context of software testing, feasibility analysis plays an

  13. Anode optimization for miniature electronic brachytherapy X-ray sources using Monte Carlo and computational fluid dynamic codes.

    PubMed

    Khajeh, Masoud; Safigholi, Habib

    2016-03-01

    A miniature X-ray source has been optimized for electronic brachytherapy. The cooling fluid for this device is water. Unlike the radionuclide brachytherapy sources, this source is able to operate at variable voltages and currents to match the dose with the tumor depth. First, Monte Carlo (MC) optimization was performed on the tungsten target-buffer thickness layers versus energy such that the minimum X-ray attenuation occurred. Second optimization was done on the selection of the anode shape based on the Monte Carlo in water TG-43U1 anisotropy function. This optimization was carried out to get the dose anisotropy functions closer to unity at any angle from 0° to 170°. Three anode shapes including cylindrical, spherical, and conical were considered. Moreover, by Computational Fluid Dynamic (CFD) code the optimal target-buffer shape and different nozzle shapes for electronic brachytherapy were evaluated. The characterization criteria of the CFD were the minimum temperature on the anode shape, cooling water, and pressure loss from inlet to outlet. The optimal anode was conical in shape with a conical nozzle. Finally, the TG-43U1 parameters of the optimal source were compared with the literature. PMID:26966563

  14. Anode optimization for miniature electronic brachytherapy X-ray sources using Monte Carlo and computational fluid dynamic codes

    PubMed Central

    Khajeh, Masoud; Safigholi, Habib

    2015-01-01

    A miniature X-ray source has been optimized for electronic brachytherapy. The cooling fluid for this device is water. Unlike the radionuclide brachytherapy sources, this source is able to operate at variable voltages and currents to match the dose with the tumor depth. First, Monte Carlo (MC) optimization was performed on the tungsten target-buffer thickness layers versus energy such that the minimum X-ray attenuation occurred. Second optimization was done on the selection of the anode shape based on the Monte Carlo in water TG-43U1 anisotropy function. This optimization was carried out to get the dose anisotropy functions closer to unity at any angle from 0° to 170°. Three anode shapes including cylindrical, spherical, and conical were considered. Moreover, by Computational Fluid Dynamic (CFD) code the optimal target-buffer shape and different nozzle shapes for electronic brachytherapy were evaluated. The characterization criteria of the CFD were the minimum temperature on the anode shape, cooling water, and pressure loss from inlet to outlet. The optimal anode was conical in shape with a conical nozzle. Finally, the TG-43U1 parameters of the optimal source were compared with the literature. PMID:26966563

  15. Acoustic Scattering by Three-Dimensional Stators and Rotors Using the SOURCE3D Code. Volume 1; Analysis and Results

    NASA Technical Reports Server (NTRS)

    Meyer, Harold D.

    1999-01-01

    This report provides a study of rotor and stator scattering using the SOURCE3D Rotor Wake/Stator Interaction Code. SOURCE3D is a quasi-three-dimensional computer program that uses three-dimensional acoustics and two-dimensional cascade load response theory to calculate rotor and stator modal reflection and transmission (scattering) coefficients. SOURCE3D is at the core of the TFaNS (Theoretical Fan Noise Design/Prediction System), developed for NASA, which provides complete fully coupled (inlet, rotor, stator, exit) noise solutions for turbofan engines. The reason for studying scattering is that we must first understand the behavior of the individual scattering coefficients provided by SOURCE3D, before eventually understanding the more complicated predictions from TFaNS. To study scattering, we have derived a large number of scattering curves for vane and blade rows. The curves are plots of output wave power divided by input wave power (in dB units) versus vane/blade ratio. Some of these plots are shown in this report. All of the plots are provided in a separate volume. To assist in understanding the plots, formulas have been derived for special vane/blade ratios for which wavefronts are either parallel or normal to rotor or stator chords. From the plots, we have found that, for the most part, there was strong transmission and weak reflection over most of the vane/blade ratio range for the stator. For the rotor, there was little transmission loss.

  16. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  17. Evolution of Ada technology in the flight dynamics area: Implementation/testing phase analysis

    NASA Technical Reports Server (NTRS)

    Quimby, Kelvin L.; Esker, Linda; Miller, John; Smith, Laurie; Stark, Mike; Mcgarry, Frank

    1989-01-01

    An analysis is presented of the software engineering issues related to the use of Ada for the implementation and system testing phases of four Ada projects developed in the flight dynamics area. These projects reflect an evolving understanding of more effective use of Ada features. In addition, the testing methodology used on these projects has changed substantially from that used on previous FORTRAN projects.

  18. 76 FR 38129 - Applications for New Awards; Americans With Disabilities Act (ADA) National Network Knowledge...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    ... April 28, 2006 (71 FR 25472). The ADA National Network Knowledge Translation Center priority is from the ... Applications for New Awards; Americans With Disabilities Act (ADA) National Network Knowledge Translation ... Rehabilitation Research Projects (DRRP)--The ADA National Network Knowledge Translation Center Notice ...

  19. FORTRAN codes to implement enhanced local wave number technique to determine the depth and location and shape of the causative source using magnetic anomaly

    NASA Astrophysics Data System (ADS)

    Agarwal, B. N. P.; Srivastava, Shalivahan

    2008-12-01

    The total field magnetic anomaly is analyzed to compute the depth, location, and geometry of the causative source using two FORTRAN source codes, viz., FRCON1D and ELW. No assumption on the nature of source geometry, susceptibility contrast, etc. has been made. The source geometry is estimated by computing the structural index from the previously determined depth and location. A detailed procedure is outlined for using these codes through a theoretical anomaly. The suppression of high-frequency noise in the observed data is tackled by designing a box-car window with cosine termination. The termination criterion is based on the peak position of the derivative operator computed for a pre-assumed depth of a shallow source below which the target is situated. The applicability of these codes has been demonstrated by analyzing a total field aeromagnetic anomaly of the Matheson area of northern Ontario, Canada.
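
    The cosine-terminated box-car window described above resembles a Tukey window. The sketch below is a generic illustration of such a taper, not the actual FRCON1D/ELW implementation; the parameter names and taper placement are assumptions.

        with Ada.Numerics; use Ada.Numerics;
        with Ada.Numerics.Elementary_Functions;
        use  Ada.Numerics.Elementary_Functions;

        --  Box-car window with cosine termination (Tukey-style): unity
        --  over the pass band, rolled off smoothly to zero with a
        --  half-cosine.  F_Pass and F_Stop bracket the taper region.
        function Cosine_Taper (F, F_Pass, F_Stop : Float) return Float is
        begin
           if F <= F_Pass then
              return 1.0;            --  flat (box-car) section
           elsif F >= F_Stop then
              return 0.0;            --  fully attenuated
           else
              --  Half-cosine roll-off between F_Pass and F_Stop.
              return 0.5 * (1.0 + Cos (Pi * (F - F_Pass) / (F_Stop - F_Pass)));
           end if;
        end Cosine_Taper;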

  20. VADER: A flexible, robust, open-source code for simulating viscous thin accretion disks

    NASA Astrophysics Data System (ADS)

    Krumholz, M. R.; Forbes, J. C.

    2015-06-01

    The evolution of thin axisymmetric viscous accretion disks is a classic problem in astrophysics. While models based on this simplified geometry provide only approximations to the true processes of instability-driven mass and angular momentum transport, their simplicity makes them invaluable tools for both semi-analytic modeling and simulations of long-term evolution where two- or three-dimensional calculations are too computationally costly. Despite the utility of these models, the only publicly-available frameworks for simulating them are rather specialized and non-general. Here we describe a highly flexible, general numerical method for simulating viscous thin disks with arbitrary rotation curves, viscosities, boundary conditions, grid spacings, equations of state, and rates of gain or loss of mass (e.g., through winds) and energy (e.g., through radiation). Our method is based on a conservative, finite-volume, second-order accurate discretization of the equations, which we solve using an unconditionally-stable implicit scheme. We implement Anderson acceleration to speed convergence of the scheme, and show that this leads to factor of ∼5 speed gains over non-accelerated methods in realistic problems, though the amount of speedup is highly problem-dependent. We have implemented our method in the new code Viscous Accretion Disk Evolution Resource (VADER), which is freely available for download from
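
    Anderson acceleration, credited above with roughly five-fold speedups, is easiest to see in its depth-one scalar form. The toy sketch below solves x = cos x and is emphatically not VADER's implementation; the names, the test problem, and the iteration cap are assumptions.

        with Ada.Numerics.Elementary_Functions;
        use  Ada.Numerics.Elementary_Functions;
        with Ada.Text_IO; use Ada.Text_IO;

        --  Depth-one Anderson acceleration for a scalar fixed point
        --  x = g(x), with residual f(x) = g(x) - x.  (No guard against
        --  a zero denominator; production code would add one.)
        procedure Anderson_Demo is
           function G (X : Float) return Float is (Cos (X));

           X_Prev : Float := 0.5;
           X_Curr : Float := G (X_Prev);
           F_Prev, F_Curr, Theta, X_Next : Float;
        begin
           for Iter in 1 .. 20 loop
              F_Prev := G (X_Prev) - X_Prev;
              F_Curr := G (X_Curr) - X_Curr;
              exit when abs F_Curr < 1.0E-12;   --  converged
              --  Pick Theta so the linearized residual
              --  (1 - Theta) * F_Curr + Theta * F_Prev vanishes.
              Theta  := F_Curr / (F_Curr - F_Prev);
              X_Next := (1.0 - Theta) * G (X_Curr) + Theta * G (X_Prev);
              X_Prev := X_Curr;
              X_Curr := X_Next;
           end loop;
           Put_Line ("Fixed point of cos:" & Float'Image (X_Curr));
        end Anderson_Demo;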

  1. Self characterization of a coded aperture array for neutron source imaging.

    PubMed

    Volegov, P L; Danly, C R; Fittinghoff, D N; Guler, N; Merrill, F E; Wilde, C H

    2014-12-01

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (∼100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF. PMID:25554292

  2. Sound frequency-invariant neural coding of a frequency-dependent cue to sound source location.

    PubMed

    Jones, Heath G; Brown, Andrew D; Koka, Kanthaiah; Thornton, Jennifer L; Tollin, Daniel J

    2015-07-01

    The century-old duplex theory of sound localization posits that low- and high-frequency sounds are localized with two different acoustical cues, interaural time and level differences (ITDs and ILDs), respectively. While behavioral studies in humans and behavioral and neurophysiological studies in a variety of animal models have largely supported the duplex theory, behavioral sensitivity to ILD is curiously invariant across the audible spectrum. Here we demonstrate that auditory midbrain neurons in the chinchilla (Chinchilla lanigera) also encode ILDs in a frequency-invariant manner, efficiently representing the full range of acoustical ILDs experienced as a joint function of sound source frequency, azimuth, and distance. We further show, using Fisher information, that nominal "low-frequency" and "high-frequency" ILD-sensitive neural populations can discriminate ILD with similar acuity, yielding neural ILD discrimination thresholds for near-midline sources comparable to behavioral discrimination thresholds estimated for chinchillas. These findings thus suggest a revision to the duplex theory and reinforce ecological and efficiency principles that hold that neural systems have evolved to encode the spectrum of biologically relevant sensory signals to which they are naturally exposed. PMID:25972580

  3. Self characterization of a coded aperture array for neutron source imaging

    NASA Astrophysics Data System (ADS)

    Volegov, P. L.; Danly, C. R.; Fittinghoff, D. N.; Guler, N.; Merrill, F. E.; Wilde, C. H.

    2014-12-01

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (˜100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF.

  4. Self characterization of a coded aperture array for neutron source imaging

    SciTech Connect

    Volegov, P. L. Danly, C. R.; Guler, N.; Merrill, F. E.; Wilde, C. H.; Fittinghoff, D. N.

    2014-12-15

    The neutron imaging system at the National Ignition Facility (NIF) is an important diagnostic tool for measuring the two-dimensional size and shape of the neutrons produced in the burning deuterium-tritium plasma during the stagnation stage of inertial confinement fusion implosions. Since the neutron source is small (∼100 μm) and neutrons are deeply penetrating (>3 cm) in all materials, the apertures used to achieve the desired 10-μm resolution are 20-cm long, triangular tapers machined in gold foils. These gold foils are stacked to form an array of 20 apertures for pinhole imaging and three apertures for penumbral imaging. These apertures must be precisely aligned to accurately place the field of view of each aperture at the design location, or the location of the field of view for each aperture must be measured. In this paper we present a new technique that has been developed for the measurement and characterization of the precise location of each aperture in the array. We present the detailed algorithms used for this characterization and the results of reconstructed sources from inertial confinement fusion implosion experiments at NIF.

  5. Open source development experience with a computational gas-solids flow code

    SciTech Connect

    Syamlal, M; O'Brien, T. J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study on the use of open source (OS) software development in chemical engineering research and education is presented here. The multiphase computational fluid dynamics software MFIX is the object of the case study. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow and the dissemination of information to other areas such as geotechnical and volcanology research are demonstrated. It is shown that the advantages of OS development methodology were realized: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  6. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames.) Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.
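
    A minimal sketch of the package style described above: the component type enters as a generic formal parameter, and an infix operator replaces a function call for the dot product. The type, package, and dimension choices here are illustrative, not the actual NASA package.

        --  Generic over the component type, with infix "*" as the dot
        --  product rather than a named function call.
        generic
           type Component is digits <>;   --  any floating point type
        package Linear_Algebra is
           type Vector is array (1 .. 3) of Component;

           function "*" (Left, Right : Vector) return Component;  --  dot product
        end Linear_Algebra;

        package body Linear_Algebra is
           function "*" (Left, Right : Vector) return Component is
              Sum : Component := 0.0;
           begin
              for I in Vector'Range loop
                 Sum := Sum + Left (I) * Right (I);
              end loop;
              return Sum;
           end "*";
        end Linear_Algebra;

        --  Example instantiation:
        --  package LA3 is new Linear_Algebra (Float);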

  7. Insurance benefits under the ADA: Discrimination or business as usual?

    SciTech Connect

    McFadden, M.E.

    1993-12-31

    In December 1987, John McGann discovered he had AIDS. In July 1988, his employer altered his health insurance policy by reducing lifetime coverage for AIDS to $5,000, while maintaining the million-dollar limit for all other health conditions. The United States Court of Appeals for the Fifth Circuit upheld the employer's right to make that change. The Supreme Court denied certiorari. Public outcry was immediate and voluminous. The Solicitor General argued that the new Americans with Disabilities Act would save future John McGanns from the same treatment, but the validity of this optimistic prediction is yet to be determined. The Americans with Disabilities Act of 1990 (ADA) is landmark legislation that bars discrimination against the disabled in all aspects of employment, public services, and accommodations. The Act broadly defines disability to include illnesses such as AIDS and cancer, as well as limitations on mobility, vision, and hearing. The ADA indisputably creates a private cause of action for discrimination on the basis of disability. However, depending on the standard of review chosen by the federal courts, this cause of action may or may not provide much protection to those claiming discrimination on the basis of disability in employee benefits and insurance. This article discusses the ADA's coverage of insurance and benefits in light of the possible standards courts might use to evaluate actions of parties in suits alleging discrimination in these areas and applies those standards of review to the facts of the McGann case. 146 refs.

  8. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    The use and implementation of Ada in distributed environments in which reliability is the primary concern were investigated. A distributed system, programmed entirely in Ada, was studied to assess the use of individual tasks without concern for the processor used. Continued development and testing of the fault tolerant Ada testbed; development of suggested changes to Ada to cope with the failures of interest; design of approaches to fault tolerant software in real time systems, and the integration of these ideas into Ada; and the preparation of various papers and presentations were discussed.

  9. ISOLA a Fortran code and a Matlab GUI to perform multiple-point source inversion of seismic data

    NASA Astrophysics Data System (ADS)

    Sokos, Efthimios N.; Zahradnik, Jiri

    2008-08-01

    In this paper, a software package for multiple- or single-point source inversion is presented. The package consists of ISOLA-GUI, a user-friendly MATLAB-based interface, and the ISOLA Fortran code, which is the computational core of the application. The methodology used is similar to the iterative deconvolution technique often used in teleseismic studies, but here adjusted for regional and local distances. The advantage of the software is the graphical interface that provides the user with an easy-to-use environment, rich in graphics and data handling routines, while at the same time the speed of the Fortran code is retained. Besides that, the software allows the results to be exported to popular software packages, like Generic Mapping Tools, which can then produce quality plots of the results. The modular design of ISOLA-GUI allows users to add supplementary routines at all stages of processing. An example of the method's ability to obtain a quick insight into the complexity of an earthquake is presented, using records from a moderate-size event.

  10. Chagas parasite detection in blood images using AdaBoost.

    PubMed

    Uc-Cetina, Víctor; Brito-Loeza, Carlos; Ruiz-Piña, Hugo

    2015-01-01

    The Chagas disease is a potentially life-threatening illness caused by the protozoan parasite, Trypanosoma cruzi. Visual detection of this parasite through microscopic inspection is a tedious and time-consuming task. In this paper, we provide an AdaBoost learning solution to the task of Chagas parasite detection in blood images. We give details of the algorithm and our experimental setup. With this method, we obtain a sensitivity of 100% and a specificity of 93.25%. A ROC comparison with the method most commonly used for the detection of malaria parasites based on support vector machines (SVM) is also provided. Our experimental work shows mainly two things: (1) Chagas parasites can be detected automatically using machine learning methods with high accuracy and (2) AdaBoost + SVM provides better overall detection performance than AdaBoost or SVMs alone. Such results are the best ones known so far for the problem of automatic detection of Chagas parasites through the use of machine learning, computer vision, and image processing methods. PMID:25861375

  11. Chagas Parasite Detection in Blood Images Using AdaBoost

    PubMed Central

    Uc-Cetina, Víctor; Brito-Loeza, Carlos; Ruiz-Piña, Hugo

    2015-01-01

    The Chagas disease is a potentially life-threatening illness caused by the protozoan parasite, Trypanosoma cruzi. Visual detection of this parasite through microscopic inspection is a tedious and time-consuming task. In this paper, we provide an AdaBoost learning solution to the task of Chagas parasite detection in blood images. We give details of the algorithm and our experimental setup. With this method, we obtain a sensitivity of 100% and a specificity of 93.25%. A ROC comparison with the method most commonly used for the detection of malaria parasites based on support vector machines (SVM) is also provided. Our experimental work shows mainly two things: (1) Chagas parasites can be detected automatically using machine learning methods with high accuracy and (2) AdaBoost + SVM provides better overall detection performance than AdaBoost or SVMs alone. Such results are the best ones known so far for the problem of automatic detection of Chagas parasites through the use of machine learning, computer vision, and image processing methods. PMID:25861375
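
    For reference, the heart of the standard AdaBoost training loop is the weight update sketched below. This is the textbook algorithm, not the authors' code; all names, types, and the assumption that the three arrays share bounds are mine.

        with Ada.Numerics.Elementary_Functions;
        use  Ada.Numerics.Elementary_Functions;

        package Ada_Boost is
           type Sign is range -1 .. 1;    --  labels/predictions: +1 or -1
           type Real_Vec is array (Positive range <>) of Float;
           type Sign_Vec is array (Positive range <>) of Sign;

           --  One round of the standard AdaBoost weight update; Eps is
           --  the weighted error of the current weak classifier.
           procedure Update (W : in out Real_Vec;
                             Labels, Predicted : Sign_Vec;
                             Eps : Float);
        end Ada_Boost;

        package body Ada_Boost is
           procedure Update (W : in out Real_Vec;
                             Labels, Predicted : Sign_Vec;
                             Eps : Float)
           is
              --  Classifier weight: alpha = 0.5 * ln ((1 - eps) / eps).
              Alpha : constant Float := 0.5 * Log ((1.0 - Eps) / Eps);
              Total : Float := 0.0;
           begin
              --  w_i <- w_i * exp (-alpha * y_i * h(x_i)); all three
              --  arrays are assumed to share the same index range.
              for I in W'Range loop
                 W (I) := W (I) *
                   Exp (-Alpha * Float (Labels (I)) * Float (Predicted (I)));
                 Total := Total + W (I);
              end loop;
              for I in W'Range loop      --  renormalize to sum to 1
                 W (I) := W (I) / Total;
              end loop;
           end Update;
        end Ada_Boost;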

  12. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104). [PWR; BWR

    SciTech Connect

    Kress, T. S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource, along with the results of the BMI-2104 study by BCL and the QUEST study by SNL, to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  13. Comparison of Orbiter PRCS Plume Flow Fields Using CFD and Modified Source Flow Codes

    NASA Technical Reports Server (NTRS)

    Rochelle, Wm. C.; Kinsey, Robin E.; Reid, Ethan A.; Stuart, Phillip C.; Lumpkin, Forrest E.

    1997-01-01

    The Space Shuttle Orbiter will use Reaction Control System (RCS) jets for docking with the planned International Space Station (ISS). During approach and backout maneuvers, plumes from these jets could cause high pressure, heating, and thermal loads on ISS components. The object of this paper is to present comparisons of RCS plume flow fields used to calculate these ISS environments. Because of the complexities of 3-D plumes with variable scarf-angle and multi-jet combinations, NASA/JSC developed a plume flow-field methodology for all of these Orbiter jets. The RCS Plume Model (RPM), which includes effects of scarfed nozzles and dual jets, was developed as a modified source-flow engineering tool to rapidly generate plume properties and impingement environments on ISS components. This paper presents flow-field properties from four PRCS jets: F3U low scarf-angle single jet, F3F high scarf-angle single jet, DTU zero scarf-angle dual jet, and F1F/F2F high scarf-angle dual jet. The RPM results compared well with plume flow fields using four CFD programs: General Aerodynamic Simulation Program (GASP), Cartesian (CART), Unified Solution Algorithm (USA), and Reacting and Multi-phase Program (RAMP). Good comparisons of predicted pressures are shown with STS 64 Shuttle Plume Impingement Flight Experiment (SPIFEX) data.

  14. Identification of human proteins functionally conserved with the yeast putative adaptors ADA2 and GCN5.

    PubMed Central

    Candau, R; Moore, P A; Wang, L; Barlev, N; Ying, C Y; Rosen, C A; Berger, S L

    1996-01-01

    Transcriptional adaptor proteins are required for full function of higher eukaryotic acidic activators in the yeast Saccharomyces cerevisiae, suggesting that this pathway of activation is evolutionarily conserved. Consistent with this view, we have identified possible human homologs of yeast ADA2 (yADA2) and yeast GCN5 (yGCN5), components of a putative adaptor complex. While there is overall sequence similarity between the yeast and human proteins, perhaps more significant is conservation of key sequence features with other known adaptors. We show several functional similarities between the human and yeast adaptors. First, as shown for yADA2 and yGCN5, human ADA2 (hADA2) and human GCN5 (hGCN5) interacted in vivo in a yeast two-hybrid assay. Moreover, hGCN5 interacted with yADA2 in this assay, suggesting that the human proteins form similar complexes. Second, both yADA2 and hADA2 contain cryptic activation domains. Third, hGCN5 and yGCN5 had similar stabilizing effects on yADA2 in vivo. Furthermore, the region of yADA2 that interacted with yGCN5 mapped to the amino terminus of yADA2, which is highly conserved in hADA2. Most striking, is the behavior of the human proteins in human cells. First, GAL4-hADA2 activated transcription in HeLa cells, and second, either hADA2 or hGCN5 augmented GAL4-VP16 activation. These data indicated that the human proteins correspond to functional homologs of the yeast adaptors, suggesting that these cofactors play a key role in transcriptional activation. PMID:8552087

  15. Epitope characterization of the ADA response directed against a targeted immunocytokine.

    PubMed

    Stubenrauch, Kay; Künzel, Christian; Vogel, Rudolf; Tuerck, Dietrich; Schick, Eginhard; Heinrich, Julia

    2015-10-10

    Targeted immunocytokines (TICs) display potent activity in selective tumor suppression. This class of multi domain biotherapeutics (MDBs) is composed of the three major domains Fab, Fc, and a cytokine, which may induce a complex polyclonal anti-drug antibody (ADA) response. However, classical ADA assays usually are not suitable to specify ADAs and to identify the immunogenic domains of a TIC. The purpose of the present study was to establish epitope characterization of ADA responses in order to specify immunogenic responses against a TIC and their direct impact on the pharmacokinetic profile, safety, and efficacy. Based on standard ADA screening and confirmation assays, respectively, domain detection assays (DDAs) and domain competition assays (DCAs) were established and compared by the use of 12 ADA-positive samples obtained from a cynomolgus monkey study in early development. Both domain-specific assays were sensitive enough to preserve the positive screening assay result and revealed overall accordance in the evaluation of domain-specific ADA responses. About half of the samples displayed one ADA specificity, either for the Fab or for the cytokine (Cy) domain, and the remaining samples showed a combination of Fab-specific and Cy-specific ADA fractions. Fc-specific ADAs occurred in only one sample. In-depth comparison of DCAs and DDAs showed that both assays appeared to be appropriate to assess multi-specific ADA responses as well as minor ADA fractions. An advantage of DCAs is typically fast and easy assay establishment, whereas DDAs may in some cases be superior for assessing low-abundance ADAs in multi-specific responses. Our results reveal that both approaches benefit from thorough reagent development as an essential precondition for reliable epitope characterization of ADA responses. PMID:26093509

  16. Alteration/deficiency in activation-3 (Ada3) plays a critical role in maintaining genomic stability.

    PubMed

    Mirza, Sameer; Katafiasz, Bryan J; Kumar, Rakesh; Wang, Jun; Mohibi, Shakur; Jain, Smrati; Gurumurthy, Channabasavaiah Basavaraju; Pandita, Tej K; Dave, Bhavana J; Band, Hamid; Band, Vimla

    2012-11-15

    Cell cycle regulation and DNA repair following damage are essential for maintaining genome integrity. DNA damage activates checkpoints in order to repair damaged DNA prior to exit to the next phase of the cell cycle. Recently, we have shown the role of Ada3, a component of various histone acetyltransferase complexes, in cell cycle regulation, and loss of Ada3 results in mouse embryonic lethality. Here, we used adenovirus-Cre-mediated Ada3 deletion in Ada3(fl/fl) mouse embryonic fibroblasts (MEFs) to assess the role of Ada3 in the DNA damage response following exposure to ionizing radiation (IR). We report that Ada3 depletion was associated with increased levels of phospho-ATM (pATM), γH2AX, phospho-53BP1 (p53BP1) and phospho-RAD51 (pRAD51) in untreated cells; however, the radiation response was intact in Ada3(-/-) cells. Notably, Ada3(-/-) cells exhibited a significant delay in the disappearance of DNA damage foci for several critical proteins involved in the DNA repair process. Significantly, loss of Ada3 led to enhanced chromosomal aberrations, such as chromosome breaks, fragments, deletions and translocations, which further increased upon DNA damage. Notably, the total number of aberrations was observed more clearly in S phase than in the G₁ or G₂ phases of the cell cycle following IR. Lastly, comparison of DNA damage in Ada3(fl/fl) and Ada3(-/-) cells confirmed higher residual DNA damage in Ada3(-/-) cells, underscoring a critical role of Ada3 in the DNA repair process. Taken together, these findings provide evidence for a novel role for Ada3 in maintenance of the DNA repair process and genomic stability. PMID:23095635

  17. An open-source, massively parallel code for non-LTE synthesis and inversion of spectral lines and Zeeman-induced Stokes profiles

    NASA Astrophysics Data System (ADS)

    Socas-Navarro, H.; de la Cruz Rodríguez, J.; Asensio Ramos, A.; Trujillo Bueno, J.; Ruiz Cobo, B.

    2015-05-01

    With the advent of a new generation of solar telescopes and instrumentation, interpreting chromospheric observations (in particular, spectropolarimetry) requires new, suitable diagnostic tools. This paper describes a new code, NICOLE, that has been designed for Stokes non-LTE radiative transfer, for synthesis and inversion of spectral lines and Zeeman-induced polarization profiles, spanning a wide range of atmospheric heights from the photosphere to the chromosphere. The code offers a number of unique features and capabilities and has been built from scratch with a powerful parallelization scheme that makes it suitable for application on massive datasets using large supercomputers. The source code is written entirely in Fortran 90/2003 and complies strictly with the ANSI standards to ensure maximum compatibility and portability. It is being publicly released, with the idea of facilitating future branching by other groups to augment its capabilities. The source code is currently hosted at the following repository: https://github.com/hsocasnavarro/NICOLE

  18. Wind Farm Stabilization by using DFIG with Current Controlled Voltage Source Converters Taking Grid Codes into Consideration

    NASA Astrophysics Data System (ADS)

    Okedu, Kenneth Eloghene; Muyeen, S. M.; Takahashi, Rion; Tamura, Junji

    Recent wind farm grid codes require wind generators to ride through voltage sags, which means that normal power production should be re-initiated once the nominal grid voltage is recovered. However, a fixed speed wind turbine generator system using an induction generator (IG) has a stability problem similar to the step-out phenomenon of a synchronous generator. On the other hand, a doubly fed induction generator (DFIG) can control its real and reactive powers independently while being operated in variable speed mode. This paper proposes a new control strategy using DFIGs for stabilizing a wind farm composed of DFIGs and IGs, without incorporating additional FACTS devices. A new current controlled voltage source converter (CC-VSC) scheme is proposed to control the converters of the DFIG, and the performance is verified by comparing the results with those of a voltage controlled voltage source converter (VC-VSC) scheme. Another salient feature of this study is to reduce the number of proportional-integral (PI) controllers used in the rotor side converter without degrading dynamic and transient performances. Moreover, the DC-link protection scheme during grid faults can be omitted in the proposed scheme, which reduces the overall cost of the system. Extensive simulation analyses using PSCAD/EMTDC are carried out to clarify the effectiveness of the proposed CC-VSC based control scheme of DFIGs.

  19. Web-MCQ: a set of methods and freely available open source code for administering online multiple choice question assessments.

    PubMed

    Hewson, Claire

    2007-08-01

    E-learning approaches have received increasing attention in recent years. Accordingly, a number of tools have become available to assist the nonexpert computer user in constructing and managing virtual learning environments, and implementing computer-based and/or online procedures to support pedagogy. Both commercial and free packages are now available, with new developments emerging periodically. Commercial products have the advantage of being comprehensive and reliable, but tend to require substantial financial investment and are not always transparent to use. They may also restrict pedagogical choices due to their predetermined ranges of functionality. With these issues in mind, several authors have argued for the pedagogical benefits of developing freely available, open source e-learning resources, which can be shared and further developed within a community of educational practitioners. The present paper supports this objective by presenting a set of methods, along with supporting freely available, downloadable, open source programming code, to allow administration of online multiple choice question assessments to students. PMID:17958158

  20. Code of ethics for dental researchers.

    PubMed

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined; there are 12 of them, compared with the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful. PMID:25951679

  1. NASA-evolving to Ada: Five-year plan. A plan for implementing recommendations made by the Ada and software management assessment working group

    NASA Technical Reports Server (NTRS)

    1989-01-01

    At their March 1988 meeting, members of the National Aeronautics and Space Administration (NASA) Information Resources Management (IRM) Council expressed concern that NASA may not have the infrastructure necessary to support the use of Ada for major NASA software projects. Members also observed that the agency has no coordinated strategy for applying its experiences with Ada to subsequent projects (Hinners, 27 June 1988). To deal with these problems, the IRM Council chair appointed an intercenter Ada and Software Management Assessment Working Group (ASMAWG). They prepared a report (McGarry et al., March 1989) entitled 'Ada and Software Management in NASA: Findings and Recommendations'. That report presented a series of recommendations intended to enable NASA to develop better software at lower cost through the use of Ada and other state-of-the-art software engineering technologies. The purpose here is to describe the steps (called objectives) by which this goal may be achieved, to identify the NASA officials or organizations responsible for carrying out the steps, and to define a schedule for doing so. This document sets forth four goals: adopt agency-wide software standards and policies; use Ada as the programming language for all mission software; establish an infrastructure to support software engineering, including the use of Ada, and to leverage the agency's software experience; and build the agency's knowledge base in Ada and software engineering. A schedule for achieving the objectives and goals is given.

  2. Role of the Ada adaptor complex in gene activation by the glucocorticoid receptor.

    PubMed Central

    Henriksson, A; Almlöf, T; Ford, J; McEwan, I J; Gustafsson, J A; Wright, A P

    1997-01-01

    We have shown that the Ada adaptor complex is important for the gene activation capacity of the glucocorticoid receptor in yeast. The recently isolated human Ada2 protein also increases the potency of the receptor protein in mammalian cells. The Ada pathway is of key significance for the tau1 core transactivation domain (tau1c) of the receptor, which requires Ada for activity in vivo and in vitro. Ada2 can be precipitated from nuclear extracts by a glutathione S-transferase-tau1 fusion protein coupled to agarose beads, and a direct interaction between Ada2 and tau1c can be shown by using purified proteins. This interaction is strongly reduced by a mutation in tau1c that reduces transactivation activity. Mutations affecting the Ada complex do not reverse transcriptional squelching by the tau1 domain, as they do for the VP16 transactivation domain, and thus these powerful acidic activators differ in at least some important aspects of gene activation. Mutations that reduce the activity of the tau1c domain in wild-type yeast strains cause similar reductions in ada mutants that contain little or no Ada activity. Thus, gene activation mechanisms, in addition to the Ada pathway, are involved in the activity of the tau1c domain. PMID:9154805

  3. Monogenic polyarteritis: the lesson of ADA2 deficiency.

    PubMed

    Caorsi, Roberta; Penco, Federica; Schena, Francesca; Gattorno, Marco

    2016-01-01

    The deficiency of Adenosine Deaminase 2 (DADA2) is a new autoinflammatory disease characterised by an early-onset vasculopathy with livedoid skin rash associated with systemic manifestations, CNS involvement and mild immunodeficiency. This condition is secondary to autosomal recessive mutations of the CECR1 (Cat Eye Syndrome Chromosome Region 1) gene, mapped to chromosome 22q11.1, which encodes the enzymatic protein adenosine deaminase 2 (ADA2). To date, 19 different mutations in the CECR1 gene have been detected. The pathogenetic mechanism of DADA2 is still unclear. ADA2 is a secreted protein mainly expressed by cells of the myeloid lineage; its enzymatic activity is higher in conditions of hypoxia, inflammation and oncogenesis. Moreover, ADA2 is able to induce macrophage proliferation and differentiation; its deficiency is in fact associated with a reduction of anti-inflammatory macrophages (M2). The deficiency of ADA2 is also associated with an up-regulation of neutrophil-expressed genes and an increased secretion of pro-inflammatory cytokines. The mild immunodeficiency detected in many DADA2 patients suggests a role of this protein in the adaptive immune response; an increased mortality of B cells and a reduction in the number of memory B cells, terminally differentiated B cells and plasma cells have been described in many patients. The lack of the protein is associated with endothelial damage; however, the function of this protein in endothelial homeostasis is still unknown. From the clinical point of view, this disease is characterized by a wide spectrum of severity. Chronic or recurrent systemic inflammation with fever, elevation of acute phase reactants and skin manifestations (mainly represented by livedo reticularis) is the typical clinical picture. While in some patients the disease is mild and skin-limited, others present a severe, even lethal, disease with multi-organ involvement; CNS involvement is rather common, with ischemic or hemorrhagic strokes. In

  4. Large distributed control system using ADA in fusion research

    SciTech Connect

    Woodruff, J. P., LLNL

    1998-04-21

    Construction of the National Ignition Facility laser at Lawrence Livermore National Laboratory features a large distributed control system constructed using object-oriented software engineering techniques. Control of 60,000 devices is effected using a network of some 500 computers that run software written in Ada and communicate through CORBA. The project has completed its final design review; implementation of the first of five planned increments will be delivered at the end of fiscal year 1998. Preliminary measurements of the distributed control system's performance confirm the design decisions reported in this paper, and measurement and supporting simulation of full-system performance continue.

  5. An approach to distributed execution of Ada programs

    NASA Technical Reports Server (NTRS)

    Volz, R. A.; Krishnan, P.; Theriault, R.

    1987-01-01

    Intelligent control of the Space Station will require the coordinated execution of computer programs across a substantial number of computing elements. It will be important to develop large subsets of these programs in the form of a single program which executes in a distributed fashion across a number of processors. A translation strategy for distributed execution of Ada programs in which library packages and subprograms may be distributed is described. A preliminary version of the translator is operational. Simple data objects (no records or arrays as yet), subprograms, and static tasks may be referenced remotely.

  6. An Overview of Advanced Data Acquisition System (ADAS)

    NASA Technical Reports Server (NTRS)

    Mata, Carlos T.; Steinrock, T. (Technical Monitor)

    2001-01-01

    The paper discusses the following: 1. Historical background. 2. What is ADAS? 3. R and D status. 4. Reliability/cost examples (1, 2, and 3). 5. What's new? 6. Technical advantages. 7. NASA relevance. 8. NASA plans/options. 9. Remaining R and D. 10. Applications. 11. Product benefits. 12. Commercial advantages. 13. Intellectual property. The aerospace industry requires highly reliable data acquisition systems. Traditional acquisition systems employ end-to-end hardware and software redundancy. Typically, redundancy adds weight, cost, power consumption, and complexity.

  7. MARE2DEM: an open-source code for anisotropic inversion of controlled-source electromagnetic and magnetotelluric data using parallel adaptive 2D finite elements (Invited)

    NASA Astrophysics Data System (ADS)

    Key, K.

    2013-12-01

    This work announces the public release of an open-source inversion code named MARE2DEM (Modeling with Adaptively Refined Elements for 2D Electromagnetics). Although initially designed for the rapid inversion of marine electromagnetic data, MARE2DEM now supports a wide variety of acquisition configurations for both offshore and onshore surveys that utilize electric and magnetic dipole transmitters or magnetotelluric plane waves. The model domain is flexibly parameterized using a grid of arbitrarily shaped polygonal regions, allowing complicated structures such as topography or seismically imaged horizons to be easily assimilated. MARE2DEM efficiently solves the forward problem in parallel by dividing the input data parameters into smaller subsets using a parallel data decomposition algorithm. The data subsets are then solved in parallel using an automatic adaptive finite element method that iteratively solves the forward problem on successively refined finite element meshes until a specified accuracy tolerance is met, thus freeing the end user from the burden of designing an accurate numerical modeling grid. Regularized non-linear inversion for isotropic or anisotropic conductivity is accomplished with a new implementation of Occam's method referred to as fast-Occam, which is able to minimize the objective function in far fewer forward evaluations than required by the original method. This presentation will review the theoretical considerations behind MARE2DEM and use a few recent offshore EM data sets to demonstrate its capabilities and to showcase the software interface tools that streamline model building and data inversion.

  8. Modeling of a three-source perfusion and blood oxygenation sensor for transplant monitoring using multilayer Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Ibey, Bennett L.; Lee, Seungjoon; Ericson, M. Nance; Wilson, Mark A.; Cote, Gerard L.

    2004-06-01

    A Multi-Layer Monte Carlo (MLMC) model was developed to predict the results of in vivo blood perfusion and oxygenation measurements of transplanted organs as measured by an indwelling optical sensor. A sensor has been developed which uses three-source excitation in the red and infrared ranges (660, 810, 940 nm). In vitro data were taken using this sensor by changing the oxygenation state of whole blood and passing it through a single-tube pump system wrapped in bovine liver tissue. The collected data showed that, as blood oxygenation increased, the red signal increased and the infrared signal decreased. The center wavelength of 810 nanometers was shown to be largely insensitive to changes in blood oxygenation. A model was developed using MLMC code that sampled the wavelength range from 600 to 1000 nanometers every 6 nanometers. Using scattering and absorption data for blood and liver tissue within this wavelength range, a five-layer model was developed (tissue, clear tubing, blood, clear tubing, tissue). The theoretical data generated from this model were compared to the in vitro data and showed good correlation with changing blood oxygenation.

  9. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense, software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second, more flexible, and possibly even easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  10. Beam simulation and radiation dose calculation at the Advanced Photon Source with shower, an Interface Program to the EGS4 code system

    SciTech Connect

    Emery, L.

    1995-07-01

    The interface program shower to the EGS4 Monte Carlo electromagnetic cascade shower simulation code system was written to facilitate the definition of complicated target and shielding geometries and to simplify the handling of input and output of data. The geometry is defined by a series of namelist commands in an input file. The input and output beam data files follow the SDDS (self-describing data set) protocol, which makes the files compatible with other physics codes that follow the same protocol. For instance, one can use the results of the cascade shower simulation as the input data for an accelerator tracking code. The shower code has also been used to calculate the bremsstrahlung component of radiation doses for possible beam loss scenarios at the Advanced Photon Source (APS) at Argonne National Laboratory.

  11. Timing issues in the distributed execution of Ada programs

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Mudge, Trevor N.

    1987-01-01

    This paper examines, in the context of distributed execution, the meaning of Ada constructs involving time. In the process, unresolved questions of interpretation and problems with the implementation of a consistent notion of time across a network are uncovered. It is observed that there are two Ada mechanisms that can involve a distributed sense of time: the conditional entry call, and the timed entry call. It is shown that a recent interpretation by the Language Maintenance Committee resolves the questions for conditional entry calls but results in an anomaly for timed entry calls. A detailed discussion of alternative implementations for the timed entry call is made, and it is argued that: (1) timed entry calls imply a common sense of time between the machines holding the calling and called tasks; and (2) the measurement of time for the expiration of the delay and the decision of whether or not to perform the rendezvous should be made on the machine holding the called task. The need to distinguish the unreadiness of the called task from timeouts caused by network failure is pointed out. Finally, techniques for realizing a single sense of time across the distributed system (at least to within an acceptable degree of uncertainty) are also discussed.
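
    For readers unfamiliar with the two constructs, here is a minimal sketch in standard Ada tasking syntax; the Server task and all names are hypothetical.

        with Ada.Text_IO; use Ada.Text_IO;

        --  The two Ada constructs discussed above that involve a
        --  distributed sense of time, against a toy server task.
        procedure Entry_Call_Demo is
           task Server is
              entry Request;
           end Server;

           task body Server is
           begin
              loop
                 select
                    accept Request;
                 or
                    terminate;   --  end when the main program is done
                 end select;
              end loop;
           end Server;
        begin
           --  Conditional entry call: rendezvous only if immediately
           --  possible, otherwise take the else part without waiting.
           select
              Server.Request;
           else
              Put_Line ("Server not ready; not waiting at all.");
           end select;

           --  Timed entry call: give up if no rendezvous starts
           --  within 0.5 seconds.
           select
              Server.Request;
           or
              delay 0.5;
              Put_Line ("Timed out after 0.5 s.");
           end select;
        end Entry_Call_Demo;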

  12. Creation of a laboratory instrument quality monitoring system with AdaSAGE

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Rios, Robert; Becker, Margie C.; Becker, C. Kevin; Self, John T.; Leif, Suzanne B.

    1996-05-01

    Two existing Ada tools, AdaSAGE and AYACC, were combined to produce a system that parses International Society for Analytical Cytology (ISAC) Flow Cytometry Standard 2.0 files and stores the data in AdaSAGE tables. There are significant differences in the way manufacturers interpret and conform to Flow Cytometry Standard 2.0. AdaSAGE is employed to analyze and plot the data from multiple experiments. This data is used to assess the stability of flow cytometers. The initial release will be for DOS. The utilization of AdaSAGE, which is a flexible database tool, will facilitate subsequent development of other products. The software engineer, whose previous professional experience was with C and C++, had very few problems with Ada syntax. The interface to the compiler and other tools was immature compared to those available for C++. The DOS text-based user interface environment provided by AdaSAGE limited the functionality of the user interface. However, the present DOS 386 program can be ported directly to the newly released version of AdaSAGE for Microsoft Windows 95. Ada's strong type checking and package structure have significantly facilitated the development of the product.

  13. The Impact of Ada and Object-Oriented Design in NASA Goddard's Flight Dynamics Division

    NASA Technical Reports Server (NTRS)

    Waligora, Sharon; Bailey, John; Stark, Mike

    1996-01-01

    This paper presents the highlights and key findings of 10 years of use and study of Ada and object-oriented design in NASA Goddard's Flight Dynamics Division (FDD). In 1985, the Software Engineering Laboratory (SEL) began investigating how the Ada language might apply to FDD software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, the FDD would fully transition its entire development organization from FORTRAN to Ada within 10 years. However, 10 years later, the FDD still produced 80 percent of its software in FORTRAN and had begun using C and C++, despite positive results on Ada projects. This paper presents the final results of a SEL study to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. The detailed results of this study were published in a formal study report in March of 1995. This paper supersedes the preliminary results of this study that were presented at the Eighteenth Annual Software Engineering Workshop in 1993.

  14. ART/Ada design project, phase 1. Task 2 report: Detailed design

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    Various issues are studied in the context of the design of an Ada based expert system building tool. Using an existing successful design as a starting point, the impact is analyzed of the Ada language and Ada development methodologies on that design, the Ada system is redesigned, and its performance is analyzed using both complexity-theoretic and empirical techniques. The algorithms specified in the overall design are refined, resolving and documenting any open design issues, identifying each system module, documenting the internal architecture and control logic, and describing the primary data structures involved in the module.

  15. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1984-01-01

    The use and implementation of Ada in distributed environments in which reliability is the primary concern is investigated. Emphasis is placed on the possibility that a distributed system may be programmed entirely in Ada so that the individual tasks of the system are unconcerned with which processors they are executing on, and that failures may occur in the software or underlying hardware. The primary activities are: (1) continued development and testing of our fault-tolerant Ada testbed; (2) consideration of desirable language changes to allow Ada to provide useful semantics for failure; and (3) analysis of the inadequacies of existing software fault tolerance strategies.

  16. The circulating transcriptome as a source of non-invasive cancer biomarkers: concepts and controversies of non-coding and coding RNA in body fluids

    PubMed Central

    Fernandez-Mercado, Marta; Manterola, Lorea; Larrea, Erika; Goicoechea, Ibai; Arestin, María; Armesto, María; Otaegui, David; Lawrie, Charles H

    2015-01-01

    The gold standard for cancer diagnosis remains the histological examination of affected tissue, obtained either by surgical excision or radiologically guided biopsy. Such procedures, however, are expensive, not without risk to the patient, and require consistent evaluation by expert pathologists. Consequently, the search for non-invasive tools for the diagnosis and management of cancer has led to great interest in the field of circulating nucleic acids in plasma and serum. An additional benefit of blood-based testing is the ability to carry out screening and repeat sampling on patients undergoing therapy, or monitoring disease progression, allowing for the development of a personalized approach to cancer patient management. Despite having been discovered over 60 years ago, circulating nucleic acids, with the notable exception of prenatal diagnostic testing, have yet to translate their clear clinical potential into the clinic. The recent discovery of non-coding (nc) RNA (in particular micro(mi)RNAs) in the blood has provided fresh impetus for the field. In this review, we discuss the potential of the circulating transcriptome (coding and ncRNA) as novel cancer biomarkers, the controversy surrounding their origin and biology, and most importantly the hurdles that remain to be overcome if they are really to become part of future clinical practice. PMID:26119132

  17. Validation of the MCNP-DSP Monte Carlo code for calculating source-driven noise parameters of subcritical systems

    SciTech Connect

    Valentine, T.E.; Mihalczo, J.T.

    1995-12-31

    This paper describes calculations performed to validate the modified version of the MCNP code, MCNP-DSP, covering: the neutron and photon spectra of the spontaneous fission of californium-252; the representation of the detection processes for scattering detectors; the timing of the detection process; and the calculation of the frequency analysis parameters.

  18. The Alzheimer's Disease Assessment Scale-Cognitive-Plus (ADAS-Cog-Plus): an expansion of the ADAS-Cog to improve responsiveness in MCI

    PubMed Central

    Carvalho, Janessa O.; Potter, Guy G.; Thames, April; Zelinski, Elizabeth; Crane, Paul K.; Gibbons, Laura E.

    2013-01-01

    Background The Alzheimer's Disease Assessment Scale cognitive subscale (ADAS-Cog) is widely used in AD, but may be less responsive to change when used in people with mild cognitive impairment (MCI). Methods Participants from the Alzheimer's Disease Neuroimaging Initiative were administered a neuropsychological battery and 1.5 T MRI scans over 2–3 years. Informants were queried regarding functional impairments. Some participants had lumbar punctures to obtain cerebrospinal fluid (CSF). We added executive functioning (EF) and functional ability (FA) items to the ADAS-Cog to generate candidate augmented measures. We calibrated these candidates using baseline data (n=811) and selected the best candidate that added EF items alone and that added EF and FA items. We selected candidates based on their responsiveness over three years in a training sample of participants with MCI (n=160). We compared traditional ADAS-Cog scores with the two candidates based on their responsiveness in a validation sample of participants with MCI (n=234), ability to predict conversion to dementia (n=394), strength of association with baseline MRI (n=394) and CSF biomarkers (n=193). Results The selected EF candidate added category fluency (ADAS Plus EF), and the selected EF and FA candidate added category fluency, Digit Symbol, Trail Making, and five items from the Functional Assessment Questionnaire (ADAS Plus EF&FA). The ADAS Plus EF&FA performed as well as or better than traditional ADAS-Cog scores. Conclusion Adding EF and FA items to the ADAS-Cog may improve responsiveness among people with MCI without impairing validity. PMID:22614326

  19. Programming in a proposed 9X distributed Ada

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Goldsack, Stephen J.

    1990-01-01

    The proposed Ada 9X constructs for distribution were studied. The goal was to select suitable test cases to help in evaluating the proposed constructs. The examples were to be chosen according to the following requirements: real-time operation; fault tolerance at several different levels; demonstration of both distributed and massively parallel operation; reflection of realistic NASA programs; illustration of the issues of configuration, compilation, linking, and loading; indication of the consequences of using the proposed revisions for large-scale programs; and coverage of the spectrum of communication patterns, such as predictable, bursty, and small and large messages. The first month was spent identifying possible examples and judging their suitability for the project.

  20. DEC Ada interface to Screen Management Guidelines (SMG)

    NASA Technical Reports Server (NTRS)

    Laomanachareon, Somsak; Lekkos, Anthony A.

    1986-01-01

    DEC's Screen Management Guidelines (SMG) are the Run-Time Library procedures that perform terminal-independent screen management functions on a VT100-class terminal. These procedures assist users in designing, composing, and keeping track of complex images on a video screen. There are three fundamental elements in the screen management model: the pasteboard, the virtual display, and the virtual keyboard. The pasteboard is like a two-dimensional area on which a user places and manipulates screen displays. The virtual display is a rectangular part of the terminal screen to which a program writes data with procedure calls. The virtual keyboard is a logical structure for input operations associated with a physical keyboard. SMG can be called from all major VAX languages; in Ada, predefined language pragmas are used to interface with SMG. These features and elements of SMG are briefly discussed.
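
    The pasteboard/virtual-display model maps naturally onto a thin Ada binding. The sketch below is illustrative only: the routine names are genuine VMS SMG entry points, but the parameter profiles are simplified and the interfacing pragmas follow VAX Ada conventions from memory, so the exact declarations should be taken from the DEC Ada and Run-Time Library manuals rather than from this sketch.

      -- Illustrative sketch: a minimal Ada binding to two SMG routines.
      -- Routine names are real VMS entry points; the parameter profiles are
      -- simplified and the pragma spellings follow VAX Ada conventions, so
      -- this is not guaranteed to compile as-is.
      with SYSTEM;
      package SMG_Binding is
         subtype Id is SYSTEM.UNSIGNED_LONGWORD;

         -- Create the pasteboard associated with the output device.
         procedure Create_Pasteboard (Pasteboard : out Id);

         -- Create a virtual display of the given size in rows and columns.
         procedure Create_Virtual_Display (Rows, Columns : in Id;
                                           Display       : out Id);

         pragma INTERFACE (External, Create_Pasteboard);
         pragma IMPORT_PROCEDURE (Create_Pasteboard,
                                  External => "SMG$CREATE_PASTEBOARD");
         pragma INTERFACE (External, Create_Virtual_Display);
         pragma IMPORT_PROCEDURE (Create_Virtual_Display,
                                  External => "SMG$CREATE_VIRTUAL_DISPLAY");
      end SMG_Binding;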

  1. Inclusion of geriatric nutrition in ADA-approved undergraduate programs.

    PubMed

    Shoaf, L R; Jensen, H M

    1989-09-01

    All ADA Plan IV programs were surveyed to determine whether geriatric nutrition was included in their curriculums. Of the 268 Plan IV programs, 66% responded. Less than one-fifth of the programs offered or planned to offer a specific geriatric nutrition course. An overview of geriatric nutrition occurred most frequently in a human nutrition course. A practicum/clinical experience or a course other than nutrition most frequently provided in-depth study, if such was available. Nursing homes and congregate meal sites were the primary locations for experiences with the geriatric population. Major activities with that age group included (a) taking diet histories, (b) making nutrition assessments, and (c) providing diet instruction. In some programs, didactic and experiential training with the geriatric population may not be adequate to prepare dietetic undergraduate students to meet the health care needs of that growing segment of society. PMID:2768741

  2. Lessons learned in the transition to Ada from FORTRAN at NASA/Goddard

    NASA Technical Reports Server (NTRS)

    Brophy, Carolyn Elizabeth

    1989-01-01

    Two dynamics satellite simulators were developed from the same requirements, one in Ada and the other in FORTRAN. The purpose of the research was to find out how well the prescriptive Ada development model worked for developing the Ada simulator. The FORTRAN simulator development, as well as past FORTRAN developments, provided a baseline for comparison. Since this was the first simulator developed, the prescriptive Ada development model had many similarities to the usual FORTRAN development model. However, it was modified to include longer design and shorter testing phases, which is generally expected with Ada developments. One result was that the percentage of time the Ada project spent in the various development activities was very similar to the percentage of time spent in those activities on a FORTRAN project. Another finding was the difficulty the Ada team had with unit testing as well as with integration. It was realized that adding steps to the design phase, such as an abstract data type analysis, and certain guidelines to the implementation phase, such as using primarily library units and nesting sparingly (sketched below), would have made development easier. These are among the recommendations to be incorporated in a new Ada development model next time.
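
    To make the "library units over nesting" guideline concrete, the following hypothetical sketch (not code from the GSFC simulators) shows a dynamics package compiled as a separate library unit, so it can be withed and unit-tested in isolation instead of being buried inside the simulator's main procedure.

      -- A separately compiled library unit (hypothetical example), per the
      -- lessons-learned guideline, rather than a package nested inside the
      -- main simulator procedure.
      package Attitude_State is
         type State is record
            Roll, Pitch, Yaw : Float;             -- Euler angles, radians
         end record;
         procedure Propagate (S : in out State; Dt : in Duration);
      end Attitude_State;

      package body Attitude_State is
         procedure Propagate (S : in out State; Dt : in Duration) is
            Rate : constant Float := 0.01;        -- placeholder rate, rad/s
         begin
            S.Roll := S.Roll + Rate * Float (Dt); -- trivial stand-in dynamics
         end Propagate;
      end Attitude_State;

    A test driver can simply "with Attitude_State" and exercise Propagate directly; a nested package would be reachable only through the enclosing procedure.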

  3. The Labor Market Experience of Workers with Disabilities: The ADA and Beyond.

    ERIC Educational Resources Information Center

    Hotchkiss, Julie L.

    This book provides a comprehensive analysis of the recent labor market experience of American workers with disabilities and an assessment of the impact the Americans with Disabilities Act (ADA) has had on that experience. Since one intention of the ADA is to break down barriers to employment for the disabled, the analyses focus on labor demand…

  4. Voices of Freedom: America Speaks Out on the ADA. A Report to the President and Congress.

    ERIC Educational Resources Information Center

    National Council on Disability, Washington, DC.

    This report examines the implementation of the Americans with Disabilities Act (ADA) during the 5 years since its passage in 1990. An introductory chapter considers the overall importance of the Act; the continuing interest of the National Council on Disability (NCD) in the ADA; and the visits of NCD representatives to each of the 50 states, the…

  5. The G22A Polymorphism of the ADA Gene and Susceptibility to Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Hettinger, Joe A.; Liu, Xudong; Holden, Jeanette Jeltje Anne

    2008-01-01

    Inborn errors of purine metabolism have been implicated as a cause for some cases of autism. This hypothesis is supported by the finding of decreased adenosine deaminase (ADA) activity in the sera of some children with autism and reports of an association of the A allele of the ADA G22A (Asp8Asn) polymorphism in individuals with autism of…

  6. School Issues Under [Section] 504 and the ADA: The Latest and Greatest.

    ERIC Educational Resources Information Center

    Aleman, Steven R.

    This paper highlights recent guidance and rulings from the Office of Civil Rights (OCR) of interest to administrators, advocates, and attorneys. It is a companion piece to Student Issues on Section 504/ADA: The Latest and Greatest. Compliance with Section 504 and the Americans with Disabilities Act (ADA) continues to involve debate and dialog on…

  7. NRPA Law Review. Combat Karate Class Illustrates ADA "Direct Threat" Exception.

    ERIC Educational Resources Information Center

    Kozlowski, James C.

    2000-01-01

    Describes the Americans with Disabilities Act (ADA), which prohibits discrimination against people with disabilities, highlighting a lawsuit involving a boy with AIDS who was barred from a traditional combat-oriented martial arts school. Courts ruled that his exclusion did not violate the ADA because he posed significant health and safety risks to…

  8. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1988-01-01

    The use and implementation of Ada were investigated in distributed environments in which reliability is the primary concern. In particular, the focus was on the possibility that a distributed system may be programmed entirely in Ada, so that the individual tasks of the system are unconcerned with which processors they are executing on, and that failures may occur in the software and underlying hardware. A secondary interest is in the performance of Ada systems and how that performance can be gauged reliably. Primary activities included: analysis of the original approach to recovery in distributed Ada programs using the Advanced Transport Operating System (ATOPS) example; review and assessment of the original approach, which was found to be capable of improvement; development of a refined approach to recovery that was applied to the ATOPS example; and design and development of a performance assessment scheme for Ada programs based on a flexible user-driven benchmarking system.

  9. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Gregory, S. T.; Urquhart, J. I. A.

    1985-01-01

    The use and implementation of Ada in distributed environments in which reliability is the primary concern were investigated. In particular, the concept that a distributed system may be programmed entirely in Ada so that the individual tasks of the system are unconcerned with which processors they are executing on, and that failures may occur in the software or underlying hardware was examined. Progress is discussed for the following areas: continued development and testing of the fault-tolerant Ada testbed; development of suggested changes to Ada so that it might more easily cope with the failure of interest; and design of new approaches to fault-tolerant software in real-time systems, and integration of these ideas into Ada.

  10. The implementation and use of Ada on distributed systems with reliability requirements

    NASA Technical Reports Server (NTRS)

    Reynolds, P. F.; Knight, J. C.; Urquhart, J. I. A.

    1983-01-01

    The issues involved in the use of the programming language Ada on distributed systems are discussed. The effects of Ada programs on hardware failures such as loss of a processor are emphasized. It is shown that many Ada language elements are not well suited to this environment. Processor failure can easily lead to difficulties on those processors which remain. As an example, the calling task in a rendezvous may be suspended forever if the processor executing the serving task fails. A mechanism for detecting failure is proposed and changes to the Ada run time support system are suggested which avoid most of the difficulties. Ada program structures are defined which allow programs to reconfigure and continue to provide service following processor failure.
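
    The hazard described above, a caller suspended forever on an entry of a task whose processor has died, can at least be bounded at the language level with a timed entry call. The sketch below is hypothetical and is not the run-time mechanism proposed in the paper; it only shows the idiom a reconfigurable client could build on.

      -- Hypothetical sketch: bounding the wait on a possibly failed server
      -- task with a timed entry call, so the caller can start recovery
      -- instead of being suspended forever.
      procedure Rendezvous_Demo is
         task Server is
            entry Request (Value : out Integer);
         end Server;

         V : Integer;

         task body Server is
         begin
            loop
               select
                  accept Request (Value : out Integer) do
                     Value := 42;                 -- placeholder service
                  end Request;
               or
                  terminate;
               end select;
            end loop;
         end Server;
      begin
         select
            Server.Request (V);                   -- normal rendezvous
         or
            delay 5.0;                            -- server presumed failed
            null;                                 -- reconfigure/recover here
         end select;
      end Rendezvous_Demo;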

  11. Coded aperture imaging of fusion source in a plasma focus operated with pure D2 and a D2-Kr gas admixture

    SciTech Connect

    Springham, S. V.; Talebitaher, A.; Shutler, P. M. E.; Rawat, R. S.; Lee, P.; Lee, S.

    2012-09-10

    The coded aperture imaging (CAI) technique has been used to investigate the spatial distribution of DD fusion in a 1.6 kJ plasma focus (PF) device operated alternately with pure deuterium or a deuterium-krypton admixture. The coded mask pattern is based on a Singer cyclic difference set with 25% open fraction and positioned close to 90° to the plasma focus axis, with CR-39 detectors used to register tracks of protons from the D(d,p)T reaction. Comparing the CAI proton images for pure D2 and D2-Kr admixture operation reveals clear differences in size, density, and shape between the fusion sources for these two cases.

  12. Subunits of ADA-two-A-containing (ATAC) or Spt-Ada-Gcn5-acetyltransferase (SAGA) Coactivator Complexes Enhance the Acetyltransferase Activity of GCN5.

    PubMed

    Riss, Anne; Scheer, Elisabeth; Joint, Mathilde; Trowitzsch, Simon; Berger, Imre; Tora, László

    2015-11-27

    Histone acetyl transferases (HATs) play a crucial role in eukaryotes by regulating chromatin architecture and locus specific transcription. GCN5 (KAT2A) is a member of the GNAT (Gcn5-related N-acetyltransferase) family of HATs. In metazoans this enzyme is found in two functionally distinct coactivator complexes, SAGA (Spt Ada Gcn5 acetyltransferase) and ATAC (Ada Two A-containing). These two multiprotein complexes comprise complex-specific and shared subunits, which are organized in functional modules. The HAT module of ATAC is composed of GCN5, ADA2a, ADA3, and SGF29, whereas in the SAGA HAT module ADA2b is present instead of ADA2a. To better understand how the activity of human GCN5 (hGCN5) is regulated in the two related, but different, HAT complexes we carried out in vitro HAT assays. We compared the activity of hGCN5 alone with its activity when it was part of purified recombinant hATAC or hSAGA HAT modules or endogenous hATAC or hSAGA complexes using histone tail peptides and full-length histones as substrates. We demonstrated that the subunit environment of the HAT complexes into which GCN5 incorporates determines the enhancement of GCN5 activity. On histone peptides we show that all the tested GCN5-containing complexes acetylate mainly histone H3K14. Our results suggest a stronger influence of ADA2b as compared with ADA2a on the activity of GCN5. However, the lysine acetylation specificity of GCN5 on histone tails or full-length histones was not changed when incorporated in the HAT modules of ATAC or SAGA complexes. Our results thus demonstrate that the catalytic activity of GCN5 is stimulated by subunits of the ADA2a- or ADA2b-containing HAT modules and is further increased by incorporation of the distinct HAT modules in the ATAC or SAGA holo-complexes. PMID:26468280

  13. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  14. Research, development, training, and education using the Ada programming language. Final report, 1 September 1987-31 May 1989

    SciTech Connect

    Harrison, G.C.

    1989-07-16

    The primary goal of this activity was to conduct research in the application and development of Ada, and in broader terms the objectives were as follows: (1) To develop numerical algorithms for parallel processing using the Ada language; (2) To develop new methodologies for reusing Ada software; (3) To solve select problems in applied mathematics using MACSYMA and Ada; (4) To simulate the interactions of nodes in a network using Ada; (5) To increase the cadre of educators available to provide Ada training by conducting Ada workshops for Norfolk State University faculty and staff; (6) To develop a series of in-class and individualized modules addressing Ada programming using computer-assisted instruction; and (7) To disseminate research and computer-aided instruction modules to other minority institutions through computer networking, workshops, and lecture series.

  15. Joint Source-Channel Coding Based on Cosine-Modulated Filter Banks for Erasure-Resilient Signal Transmission

    NASA Astrophysics Data System (ADS)

    Marinkovic, Slavica; Guillemot, Christine

    2005-12-01

    This paper examines erasure resilience of oversampled filter bank (OFB) codes, focusing on two families of codes based on cosine-modulated filter banks (CMFB). We first revisit OFBs in light of filter bank and frame theory. The analogy with channel codes is then shown. In particular, for paraunitary filter banks, we show that the signal reconstruction methods derived from the filter bank theory and from coding theory are equivalent, even in the presence of quantization noise. We further discuss frame properties of the considered OFB structures. Perfect reconstruction (PR) for the CMFB-based OFBs with erasures is proven for the case of erasure patterns for which PR depends only on the general structure of the code and not on the prototype filters. For some of these erasure patterns, the expression of the mean-square reconstruction error is also independent of the filter coefficients. It can be expressed in terms of the number of erasures, and of parameters such as the number of channels and the oversampling ratio. The various structures are compared by simulation for the example of an image transmission system.

  16. All Source Analysis System (ASAS): Migration from VAX to Alpha AXP computer systems

    NASA Technical Reports Server (NTRS)

    Sjoholm-Sierchio, Michael J.; Friedman, Steven Z. (Editor)

    1994-01-01

    The Jet Propulsion Laboratory's (JPL's) experience migrating existing VAX applications to Digital Equipment Corporation's new Alpha AXP processor is covered. The rapid development approach used during the 10-month period required to migrate the All Source Analysis System (ASAS), 1.5 million lines of FORTRAN, C, and Ada code, is also covered. ASAS, an automated tactical intelligence system, was developed by the Jet Propulsion Laboratory for the U. S. Army. Other benefits achieved as a result of the significant performance improvements provided by the Alpha AXP platform are also described.

  17. Functional similarity and physical association between GCN5 and ADA2: putative transcriptional adaptors.

    PubMed Central

    Marcus, G A; Silverman, N; Berger, S L; Horiuchi, J; Guarente, L

    1994-01-01

    A selection for yeast mutants resistant to GAL4-VP16-induced toxicity previously identified two genes, ADA2 and ADA3, which may function as adaptors for some transcriptional activation domains and thereby facilitate activation. Here we identify two new genes by the same selection, one of which is identical to GCN5. We show that gcn5 mutants share properties with ada mutants, including slow growth, temperature sensitivity and reduced activation by the VP16 and GCN4 activation domains. Double mutant studies suggest that ADA2 and GCN5 function together in a complex or pathway. Moreover, we demonstrate that GCN5 binds to ADA2 both by the two-hybrid assay in vivo and by co-immunoprecipitation in vitro. This suggests that ADA2 and GCN5 are part of a heteromeric complex that mediates transcriptional activation. Finally, we demonstrate the functional importance of the bromodomain of GCN5, a sequence found in other global transcription factors such as the SWI/SNF complex and the TATA binding protein-associated factors. This domain is not required for the interaction between GCN5 and ADA2 and thus may mediate a more general activity of transcription factors. PMID:7957049

  18. On-line replacement of program modules using AdaPT

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Smith, Gary W.; Holzbacher-Valero, A. A.; Goldsack, Stephen J.

    1992-01-01

    One purpose of our research is the investigation of the effectiveness and expressiveness of AdaPT, a set of language extensions to Ada 83, for distributed systems. As a part of that effort, we are now investigating the subject of replacing, e.g., upgrading, software modules while the software system remains in operation. The AdaPT language extensions provide a good basis for this investigation for several reasons: (1) they include the concept of specific, self-contained program modules which can be manipulated; (2) support for program configuration is included in the language; and (3) although the discussion will be in terms of the AdaPT language, the AdaPT to Ada 83 conversion methodology being developed as another part of this project will provide a basis for the application of our findings to Ada 83 systems. The purpose of this investigation is to explore the basic mechanisms of the replacement process. Thus, while replacement in the presence of real-time deadlines, heterogeneous systems, and unreliable networks is certainly a topic of interest, we will first gain an understanding of the basic processes in the absence of such concerns. The extension of the replacement process to more complex situations can be made later. This report will establish an overview of the on-line upgrade problem, and present a taxonomy of the various aspects of the replacement process.
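
    AdaPT's partition constructs are not reproduced here, but the kernel of any on-line replacement scheme can be sketched in plain Ada (using an Ada 95 access-to-subprogram for brevity): clients reach the module through one level of indirection, and replacement re-binds that indirection while calls continue. The names below are hypothetical.

      -- Hypothetical sketch of the indirection underlying on-line module
      -- replacement: re-point the binding to the new version at run time.
      with Ada.Text_IO;
      procedure Replace_Demo is
         type Service is access procedure (X : Integer);

         procedure V1 (X : Integer) is
         begin
            Ada.Text_IO.Put_Line ("v1 handled" & Integer'Image (X));
         end V1;

         procedure V2 (X : Integer) is
         begin
            Ada.Text_IO.Put_Line ("v2 handled" & Integer'Image (X));
         end V2;

         Current : Service := V1'Access;
      begin
         Current (7);            -- old version in service
         Current := V2'Access;   -- on-line replacement: bind the new version
         Current (8);            -- subsequent calls use the upgraded module
      end Replace_Demo;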

  19. Development of Immunocapture-LC/MS Assay for Simultaneous ADA Isotyping and Semiquantitation

    PubMed Central

    2016-01-01

    Therapeutic proteins and peptides have the potential to elicit immune responses, resulting in anti-drug antibodies (ADA) that can pose problems for both patient safety and product efficacy. During drug development, immunogenicity is usually examined by a risk-based approach along with specific strategies for developing “fit-for-purpose” bioanalytical approaches. Enzyme-linked immunosorbent assays and electrochemiluminescence immunoassays are the most widely used platforms for ADA detection due to their high sensitivity and throughput. During the past decade, LC/MS has emerged as a promising technology for quantitation of biotherapeutics and protein biomarkers in biological matrices, mainly owing to its high specificity, selectivity, multiplexing, and wide dynamic range. Taking full advantage of these strengths, we describe here an immunocapture-LC/MS methodology for simultaneous isotyping and semiquantitation of ADA in human plasma. Briefly, ADA and/or drug-ADA complex is captured by biotinylated drug or anti-drug Ab, immobilized on streptavidin magnetic beads, and separated from human plasma by a magnet. ADA is then released from the beads and subjected to trypsin digestion followed by LC/MS detection of specific universal peptides for each ADA isotype. The LC/MS data are analyzed using a cut-point and a calibration curve. The proof of concept of this methodology is demonstrated by detecting preexisting ADA in human plasma. PMID:27034966

  20. Ada compiler evaluation on the Space Station Freedom Software Support Environment project

    NASA Technical Reports Server (NTRS)

    Badal, D. L.

    1989-01-01

    This paper describes the work in progress to select the Ada compilers for the Space Station Freedom Program (SSFP) Software Support Environment (SSE) project. The purpose of the SSE Ada compiler evaluation team is to establish the criteria, test suites, and benchmarks to be used for evaluating Ada compilers for the mainframes, workstations, and the real-time target for flight- and ground-based computers. The combined efforts and cooperation of the customer, subcontractors, vendors, academia, and SIGAda groups made it possible to acquire the necessary background information, benchmarks, test suites, and criteria.

  1. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    NASA Technical Reports Server (NTRS)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (1) mode blockage, (2) liner insertion loss, (3) short ducts, and (4) mode reflection.

  2. A Mode Propagation Database Suitable for Code Validation Utilizing the NASA Glenn Advanced Noise Control Fan and Artificial Sources

    NASA Technical Reports Server (NTRS)

    Sutliff, Daniel L.

    2014-01-01

    The NASA Glenn Research Center's Advanced Noise Control Fan (ANCF) was developed in the early 1990s to provide a convenient test bed to measure and understand fan-generated acoustics, duct propagation, and radiation to the farfield. A series of tests were performed primarily for the use of code validation and tool validation. Rotating Rake mode measurements were acquired for parametric sets of: (i) mode blockage, (ii) liner insertion loss, (iii) short ducts, and (iv) mode reflection.

  3. Source coding with a permutation-based reversible memory-binding transform for data compression in categorical data domains.

    PubMed

    Talbot, B G; Talbot, L M

    1998-01-01

    A general purpose reversible memory-binding transform (MBT) is developed, which uses a permutation transform technique to bind memory information to a transformed signal alphabet. The algorithm performs well in conjunction with a Huffman coder for both ordered sources, such as pixel intensities, and categorical sources, such as vector quantized codebook indices. PMID:18276336
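
    The MBT itself is not reproduced in the abstract; to illustrate the family it belongs to, the sketch below implements the classic move-to-front transform, a simple reversible permutation transform that binds recency memory into the output alphabet so that repeated symbols map to small indices, which a Huffman coder then compresses well. This is an analogy, not the paper's algorithm.

      -- Move-to-front transform: a reversible permutation transform in the
      -- same spirit as a memory-binding transform (illustrative; not the
      -- paper's MBT). Repeated symbols yield small indices.
      with Ada.Text_IO; use Ada.Text_IO;
      procedure MTF_Demo is
         Table : array (0 .. 25) of Character;
         Input : constant String := "banana";
      begin
         for I in Table'Range loop                 -- initial alphabet order
            Table (I) := Character'Val (Character'Pos ('a') + I);
         end loop;
         for K in Input'Range loop
            declare
               Ch  : constant Character := Input (K);
               Pos : Natural := Table'First;
            begin
               while Table (Pos) /= Ch loop        -- find current index
                  Pos := Pos + 1;
               end loop;
               Put (Natural'Image (Pos));          -- emit transformed symbol
               for J in reverse Table'First + 1 .. Pos loop
                  Table (J) := Table (J - 1);      -- shift, then bind to front
               end loop;
               Table (Table'First) := Ch;
            end;
         end loop;
         New_Line;                                 -- "banana" -> 1 1 13 1 1 1
      end MTF_Demo;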

  4. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1987-01-01

    Performance analysis was begun on the Ada implementations. The goal is to supply the system designer with tools that will allow a rational decision to be made, early in the design cycle, about whether a particular implementation can support a given application. Primary activities were: analysis of the original approach to recovery in distributed Ada programs using the Advanced Transport Operating System (ATOPS) example; review and assessment of the original approach, which was found to be capable of improvement; preparation and presentation of a paper at the 1987 Washington DC Ada Symposium; development of a refined approach to recovery that is presently being applied to the ATOPS example; and design and development of a performance assessment scheme for Ada programs based on a flexible user-driven benchmarking system.
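
    The user-driven benchmarking system itself is not described in the abstract; the hypothetical skeleton below shows only the basic measurement loop such a harness rests on: repeat the operation under test enough times to swamp the clock resolution, then report the elapsed time (a real harness would also measure and subtract the loop overhead).

      -- Hypothetical benchmarking skeleton: time N repetitions of the
      -- operation under test with CALENDAR.CLOCK.
      with Calendar;  use Calendar;
      with Text_IO;
      procedure Benchmark is
         package Dur_IO is new Text_IO.Fixed_IO (Duration);
         N           : constant := 100_000;
         Start, Stop : Time;
         Sink        : Integer := 0;
      begin
         Start := Clock;
         for I in 1 .. N loop
            Sink := Sink + 1;        -- operation under test goes here
         end loop;
         Stop := Clock;
         Text_IO.Put ("elapsed (s): ");
         Dur_IO.Put (Stop - Start);
         Text_IO.New_Line;
      end Benchmark;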

  5. U.S. Environmental Protection Agency's Robert S. Kerr Environmental Research Center, Ada, Oklahoma

    SciTech Connect

    Farrar-Nagy, S.; Voss, P.; Van Geet, O.

    2006-10-01

    U.S. EPA's Robert S. Kerr Environmental Research Center, Ada, Oklahoma, has reduced its annual energy consumption by 45% by upgrading its building mechanical system and incorporating renewable energy.

  6. 78 FR 10263 - Proposed Collection; Comment Request for ADA Accommodations Request Packet

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-13

    ... Internal Revenue Service Proposed Collection; Comment Request for ADA Accommodations Request Packet AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice and request for comments. SUMMARY: The Department... consideration. ADDRESSES: Direct all written comments to Yvette Lawrence, Internal Revenue Service, Room...

  7. The implications of the ADA Amendments Act of 2008 for residency training program administration.

    PubMed

    Regenbogen, Alexandra; Recupero, Patricia R

    2012-01-01

    The Americans with Disabilities Act (ADA) is rarely invoked by medical residents in training. Dr. Martin Jakubowski, a family medicine resident with Asperger's disorder, was dismissed for communicating poorly with patients, peers, and supervisors and for issuing dangerous medical orders. In an attempt to become reinstated, he sued under the ADA (Jakubowski v. The Christ Hospital), arguing that the program had failed to make reasonable accommodation for his disability. The Sixth Circuit Court of Appeals ruled in favor of the hospital, finding that although the doctor was disabled under the ADA, he had failed to demonstrate that he was otherwise qualified for the position. This article comments on the ADA Amendments Act of 2008, the Equal Employment Opportunity Commission (EEOC) guidelines from 2011 and their application to medical residency training, and the Accreditation Council for Graduate Medical Education (ACGME) core competencies as essential job functions. PMID:23233478

  8. System testing of a production Ada (trademark) project: The GRODY study

    NASA Technical Reports Server (NTRS)

    Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang

    1990-01-01

    The use of the Ada language and design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although planning for system testing and conducting of tests were not generally affected by the use of Ada, the solving of problems found in system testing was generally facilitated by Ada constructs and design methodology. Most problems found in system testing were not due to difficulty with the language or methodology but to lack of experience with the application.

  9. Expanding ADA coverage to employee benefit plans: recent judicial and administrative developments.

    PubMed

    Mook, J R

    1995-01-01

    The Americans with Disabilities Act has been heralded as the Emancipation Proclamation for persons with disabilities. The purpose of the law is to provide nothing less than a "clear and comprehensive national mandate for the elimination of discrimination against individuals with disabilities." Precisely how the nondiscrimination principles of the ADA will be applied to an employer's provision of health benefits to its employees has been the subject of much debate since the Act's passage in 1990. Although the statutory language and the legislative history support a limited application of the ADA to benefits issues, recent court decisions and enforcement actions by the Equal Employment Opportunity Commission indicate that the ADA may have a much more profound impact in the area of benefits plan design and administration. Moreover, as benefits administrators take a much more active role in managing health care decisions, the ADA may become a vehicle for legal challenges to those decisions that affect the disabled. PMID:10172245

  10. CSF ADA Determination in Early Diagnosis of Tuberculous Meningitis in HIV-Infected Patients

    PubMed Central

    Ghosh, Gopal Chandra; Sharma, Brijesh; Gupta, B. B.

    2016-01-01

    Tuberculous and cryptococcal meningitis are common in HIV patients. A highly specific and sensitive rapid test for the diagnosis of tuberculous meningitis, especially in the setting of HIV, is not available in developing countries, where the burden of disease is high. We measured ADA (adenosine deaminase) levels using a spectrophotometric method in the CSF of HIV patients with meningitis to differentiate tuberculous meningitis from meningitis due to other causes. The Kruskal-Wallis test was used to compare ADA values between tuberculous meningitis (TBM) and nontuberculous (non-TB) meningitis patients, and a receiver-operating characteristic (ROC) analysis curve was drawn from these values. Levels of ADA in the CSF of patients with TBM were significantly higher than those in patients with meningitis due to other causes. CSF ADA level determination with a cut-off value of 6 IU/L was found to be a highly specific and fairly sensitive test for the diagnosis of TBM in HIV-positive patients. PMID:27144055

  11. JOB OPPORTUNITIES (SUBSURFACE PROTECTION AND REMEDIATION DIVISION, ADA, OKLAHOMA, NATIONAL RISK MANAGEMENT RESEARCH LABORATORY)

    EPA Science Inventory

    This page lists job opportunities at NRMRL's Subsurface Protection and Remediation Division (SPRD) located in Ada, Oklahoma. These include both EPA Postdoctoral Positions and National Research Council Postdoctoral Positions.SPRD's research programs include basic studies to enha...

  12. Retroviral vectors encoding ADA regulatory locus control region provide enhanced T-cell-specific transgene expression

    PubMed Central

    2009-01-01

    Background: Murine retroviral vectors have been used in several hundred gene therapy clinical trials, but have fallen out of favor for a number of reasons. One issue is that gene expression from viral or internal promoters is highly variable and essentially unregulated. Moreover, with retroviral vectors, gene expression is usually silenced over time. Mammalian genes, in contrast, are characterized by highly regulated, precise levels of expression in both a temporal and a cell-specific manner. To ascertain whether recapitulation of endogenous adenosine deaminase (ADA) expression can be achieved in a vector construct, we created a new series of Moloney murine leukemia virus (MuLV)-based retroviral vectors that carry human regulatory elements, including combinations of the ADA promoter, the ADA locus control region (LCR), ADA introns, and human polyadenylation sequences in a self-inactivating vector backbone. Methods: A MuLV-based retroviral vector with a self-inactivating (SIN) backbone, the phosphoglycerate kinase promoter (PGK), and the enhanced green fluorescent protein (eGFP) as a reporter gene was generated. Subsequent vectors were constructed from this basic vector by deletion or addition of certain elements. The added elements that were assessed are the human ADA promoter, the human ADA locus control region (LCR), introns 7, 8, and 11 from the human ADA gene, and the human growth hormone polyadenylation signal. Retroviral vector particles were produced by transient three-plasmid transfection of 293T cells. Retroviral vectors encoding eGFP were titered by transducing 293A cells, and the proportion of GFP-positive cells was then determined using fluorescence-activated cell sorting (FACS). Non-T-cell and T-cell lines were transduced at a multiplicity of infection (MOI) of 0.1, and the yield of eGFP transgene expression was evaluated by FACS analysis using mean fluorescent intensity (MFI) detection. Results: Vectors that contained the ADA LCR were preferentially expressed in T-cell lines.

  13. On-line upgrade of program modules using AdaPT

    NASA Technical Reports Server (NTRS)

    Waldrop, Raymond S.; Volz, Richard A.; Smith, Gary W.; Goldsack, Stephen J.; Holzbach-Valero, A. A.

    1993-01-01

    One purpose of our research is the investigation of the effectiveness and expressiveness of AdaPT, a set of language extensions to Ada 83, for distributed systems. As a part of that effort, we are now investigating the subject of replacing, e.g. upgrading, software modules while the software system remains in operation. The AdaPT language extensions provide a good basis for this investigation for several reasons: they include the concept of specific, self-contained program modules which can be manipulated; support for program configuration is included in the language; and although the discussion will be in terms of the AdaPT language, the AdaPT to Ada 83 conversion methodology being developed as another part of this project will provide a basis for the application of our findings to Ada 83 and Ada 9X systems. The purpose of this investigation is to explore the basic mechanisms of the replacement process. With this purpose in mind, we will avoid including issues whose presence would obscure these basic mechanisms by introducing additional, unrelated concerns. Thus, while replacement in the presence of real-time deadlines, heterogeneous systems, and unreliable networks is certainly a topic of interest, we will first gain an understanding of the basic processes in the absence of such concerns. The extension of the replacement process to more complex situations can be made later. A previous report established an overview of the module replacement problem, a taxonomy of the various aspects of the replacement process, and a solution to one case in the replacement taxonomy. This report provides solutions to additional cases in the replacement process taxonomy: replacement of partitions with state and replacement of nodes. The solutions presented here establish the basic principles for module replacement. Extension of these solutions to other more complicated cases in the replacement taxonomy is direct, though requiring substantial work beyond the available funding.

  14. A study of the portability of an Ada system in the software engineering laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    Jun, Linda O.; Valett, Susan Ray

    1990-01-01

    A particular porting effort is discussed, and various statistics on analyzing the portability of Ada and the total staff months (overall and by phase) required to accomplish the rehost are given. This effort is compared to past experiments on the rehosting of FORTRAN systems. The discussion includes an analysis of the types of errors encountered during the rehosting, the changes required to rehost the system, experiences with the Alsys IBM Ada compiler, the impediments encountered, and the lessons learned during this study.

  15. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Gregory, S. T.; Urquhart, J. I. A.

    1984-01-01

    The use and implementation of Ada (a trademark of the US Dept. of Defense) in distributed environments in which the hardware is assumed to be unreliable were investigated. In particular, the possibility was examined that a distributed system could be programmed entirely in Ada, so that the individual tasks of the system are unconcerned with which processors they are executing on, with failures occurring in the underlying hardware.

  16. Dosimetric comparison between the microSelectron HDR 192Ir v2 source and the BEBIG 60Co source for HDR brachytherapy using the EGSnrc Monte Carlo transport code

    PubMed Central

    Islam, M. Anwarul; Akramuzzaman, M. M.; Zakaria, G. A.

    2012-01-01

    Manufacturing of miniaturized high-activity 192Ir sources has become a market preference in modern brachytherapy. The smaller dimensions of the sources are flexible for smaller-diameter applicators and are also suitable for interstitial implants. Presently, miniaturized 60Co HDR sources have been made available with dimensions identical to those of 192Ir sources. 60Co sources have the advantage of a longer half-life compared with 192Ir sources. High dose rate brachytherapy sources with a longer half-life are a pragmatic choice for developing countries from an economic point of view. This study aims to compare the TG-43U1 dosimetric parameters of the new BEBIG 60Co HDR and the new microSelectron 192Ir HDR sources. Dosimetric parameters are calculated using an EGSnrc-based Monte Carlo simulation code in accordance with the AAPM TG-43 formalism for the microSelectron HDR 192Ir v2 and new BEBIG 60Co HDR sources. Air-kerma strengths per unit source activity, calculated in dry air, are 9.698×10⁻⁸ U Bq⁻¹ (±0.55%) and 3.039×10⁻⁷ U Bq⁻¹ (±0.41%) for the two sources, respectively. The calculated dose rate constants per unit air-kerma strength in a water medium are 1.116 cGy h⁻¹ U⁻¹ (±0.12%) and 1.097 cGy h⁻¹ U⁻¹ (±0.12%), respectively, for the two sources. The values of the radial dose function for distances up to 1 cm and beyond 22 cm are higher for the BEBIG 60Co HDR source than for the other source. The anisotropy values increase sharply toward the longitudinal sides of the BEBIG 60Co source, and the rise is comparatively sharper than for the other source. Tissue dependence of the absorbed dose has been investigated with a vacuum phantom for breast, compact bone, blood, lung, thyroid, soft tissue, testis, and muscle. No significant variation is noted at 5 cm radial distance in this regard when comparing the two sources, except for lung tissue. The true dose rates are calculated considering photon as well as electron transport using appropriate cut

  17. Mobile, hybrid Compton/coded aperture imaging for detection, identification and localization of gamma-ray sources at stand-off distances

    NASA Astrophysics Data System (ADS)

    Tornga, Shawn R.

    The Stand-off Radiation Detection System (SORDS) program is an Advanced Technology Demonstration (ATD) project through the Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) with the goal of detecting, identifying, and localizing weak radiological sources in the presence of large dynamic backgrounds. The Raytheon-SORDS Tri-Modal Imager (TMI) is a mobile, truck-based, hybrid gamma-ray imaging system able to quickly detect, identify, and localize radiation sources at stand-off distances through improved sensitivity while minimizing the false alarm rate. Reconstruction of gamma-ray sources is performed using a combination of two imaging modalities: coded aperture and Compton scatter imaging. The TMI consists of 35 sodium iodide (NaI) crystals, 5×5×2 in³ each, arranged in a random coded aperture mask array (CA), followed by 30 position-sensitive NaI bars, each 24×2.5×3 in³, called the detection array (DA). The CA array acts as both a coded aperture mask and a scattering detector for Compton events. The large-area DA array acts as a collection detector for both Compton-scattered events and coded aperture events. In this thesis, the coded aperture, Compton, and hybrid imaging algorithms developed are described along with their performance. It is shown that multiple imaging modalities can be fused to improve detection sensitivity over a broader energy range than either alone. Since the TMI is a moving system, peripheral data, such as Global Positioning System (GPS) and Inertial Navigation System (INS) readings, must also be incorporated. A method of adapting static imaging algorithms to a moving platform has been developed. Also, algorithms were developed in parallel with detector hardware, through the use of extensive simulations performed with the Geometry and Tracking Toolkit v4 (GEANT4). Simulations have been well validated against measured data. Results of image reconstruction algorithms at various speeds and distances will be presented as well as

  18. Inhibition of adenosine deaminase (ADA)-mediated metabolism of cordycepin by natural substances

    PubMed Central

    Li, Gen; Nakagome, Izumi; Hirono, Shuichi; Itoh, Tomoo; Fujiwara, Ryoichi

    2015-01-01

    Cordycepin, which is an analogue of a nucleoside adenosine, exhibits a wide variety of pharmacological activities including anticancer effects. In this study, ADA1- and ADA2-expressing HEK293 cells were established to determine the major ADA isoform responsible for the deamination of cordycepin. While the metabolic rate of cordycepin deamination was similar between ADA2-expressing and Mock cells, extensive metabolism of cordycepin was observed in the ADA1-expressing cells with Km and Vmax values of 54.9 μmol/L and 45.8 nmole/min/mg protein. Among five natural substances tested in this study (kaempferol, quercetin, myricetin, naringenin, and naringin), naringin strongly inhibited the deamination of cordycepin with Ki values of 58.8 μmol/L in mouse erythrocytes and 168.3 μmol/L in human erythrocytes. A treatment of Jurkat cells with a combination of cordycepin and naringin showed significant cytotoxicity. Our in silico study suggests that not only small molecules such as adenosine derivatives but also bulky molecules like naringin can be a potent ADA1 inhibitor for the clinical usage. PMID:26038697

  19. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    NASA Technical Reports Server (NTRS)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.
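
    As a concrete instance of what the taxonomy would file under "synchronization and message passing", the following hypothetical generic bounded buffer is the kind of self-contained, reusable Ada component the paper has in mind.

      -- Hypothetical reusable component: a generic bounded message buffer,
      -- the sort of unit a "synchronization and message passing" taxonomy
      -- node would catalogue.
      generic
         type Item is private;
         Size : in Positive;
      package Message_Buffer is
         task Buffer is
            entry Put (X : in Item);
            entry Get (X : out Item);
         end Buffer;
      end Message_Buffer;

      package body Message_Buffer is
         task body Buffer is
            Pool            : array (1 .. Size) of Item;
            Count           : Natural  := 0;
            In_Ptr, Out_Ptr : Positive := 1;
         begin
            loop
               select
                  when Count < Size =>
                     accept Put (X : in Item) do
                        Pool (In_Ptr) := X;
                     end Put;
                     In_Ptr := In_Ptr mod Size + 1;
                     Count  := Count + 1;
               or
                  when Count > 0 =>
                     accept Get (X : out Item) do
                        X := Pool (Out_Ptr);
                     end Get;
                     Out_Ptr := Out_Ptr mod Size + 1;
                     Count   := Count - 1;
               or
                  terminate;
               end select;
            end loop;
         end Buffer;
      end Message_Buffer;

    An application instantiates it once per message type, for example: package Telemetry_Queue is new Message_Buffer (Item => Packet, Size => 32);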

  20. The characterization and optimization of NIO1 ion source extraction aperture using a 3D particle-in-cell code

    NASA Astrophysics Data System (ADS)

    Taccogna, F.; Minelli, P.; Cavenago, M.; Veltri, P.; Ippolito, N.

    2016-02-01

    The geometry of a single aperture in the extraction grid plays a relevant role in the optimization of negative ion transport and extraction probability in a hybrid negative ion source. For this reason, a three-dimensional particle-in-cell/Monte Carlo collision model of the extraction region around the single aperture, including part of the source and part of the acceleration region (up to the middle of the extraction grid (EG)), has been developed for the new aperture design prepared for the negative ion optimization 1 (NIO1) source. Results have shown that the dimensions of the flat and chamfered parts, and the slope of the latter facing the source region, maximize the product of the production rate and the extraction probability (allowing the best EG field penetration) of surface-produced negative ions. The negative ion density in the yz plane is also reported.

  1. The characterization and optimization of NIO1 ion source extraction aperture using a 3D particle-in-cell code.

    PubMed

    Taccogna, F; Minelli, P; Cavenago, M; Veltri, P; Ippolito, N

    2016-02-01

    The geometry of a single aperture in the extraction grid plays a relevant role in the optimization of negative ion transport and extraction probability in a hybrid negative ion source. For this reason, a three-dimensional particle-in-cell/Monte Carlo collision model of the extraction region around the single aperture, including part of the source and part of the acceleration region (up to the middle of the extraction grid (EG)), has been developed for the new aperture design prepared for the negative ion optimization 1 (NIO1) source. Results have shown that the dimensions of the flat and chamfered parts, and the slope of the latter facing the source region, maximize the product of the production rate and the extraction probability (allowing the best EG field penetration) of surface-produced negative ions. The negative ion density in the yz plane is also reported. PMID:26932027

  2. TIDY, a complete code for renumbering and editing FORTRAN source programs. User's manual for IBM 360/67

    NASA Technical Reports Server (NTRS)

    Barlow, A. V.; Vanderplaats, G. N.

    1973-01-01

    TIDY, a computer code which edits and renumbers FORTRAN decks that have become difficult to read because of many patches and revisions, is described. The old program is reorganized so that statement numbers are added sequentially, and extraneous FORTRAN statements are deleted. General instructions for using TIDY on the IBM 360/67 Tymeshare System, and specific instructions for use on the NASA/Ames IBM 360/67 TSS system, are included, as well as specific instructions on how to run TIDY in conversational and batch modes. TIDY may be adapted for use on other computers.

  3. FORTRAN code-evaluation system

    NASA Technical Reports Server (NTRS)

    Capps, J. D.; Kleir, R.

    1977-01-01

    An automated code-evaluation system can be used to detect coding errors and unsound coding practices in any ANSI FORTRAN IV source code before they can cause execution-time malfunctions. The system concentrates on FORTRAN code features that, although acceptable to the compiler, are likely to produce undesirable results.

  4. Toward quantifying the source term for predicting global climatic effects of nuclear war: applications of urban fire codes

    SciTech Connect

    Reitter, T.A.; Kang, S.W.; Takata, A.N.

    1985-06-15

    Calculating urban-area fire development is critical to estimating global smoke production effects due to nuclear warfare. To improve calculations of fire starts and spread in urban areas, we performed a parameter-sensitivity analysis using three codes from IIT Research Institute. We applied improved versions of the codes to two urban areas: an infinite "uniform city" with only one type of building and the "San Jose urban area" as of the late 1960s. We varied parameters and compared affected fuel consumption and areas with a baseline case. The dominant parameters for the uniform city were wind speed, atmospheric visibility, frequency of secondary fire starts, building density, and window sizes. For San Jose (1968), they were wind speed, building densities, location of ground zero (GZ), height of burst (HOB), window sizes, and brand range. Because some results are very sensitive to actual fuel-distribution characteristics and the attack scenario, it is not possible to use a uniform city to represent actual urban areas. This was confirmed by a few calculations for the Detroit area as of the late 1960s. Many improvements are needed, such as inclusion of fire-induced winds and debris fires, before results can be extrapolated to the global scale.

  5. Dosimetric comparison of Monte Carlo codes (EGS4, MCNP, MCNPX) considering external and internal exposures of the Zubal phantom to electron and photon sources.

    PubMed

    Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M

    2005-01-01

    This paper aims at comparing dosimetric assessments performed with three Monte Carlo codes: EGS4, MCNP4c2 and MCNPX2.5e, using a realistic voxel phantom, namely the Zubal phantom, in two configurations of exposure. The first one deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and the MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for brain, lungs, liver and spleen. The second one deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by 131I-labelled radiopharmaceutical. The results are obtained by EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq and per hour) for liver, kidney, whole body and thyroid. The results of these two studies are presented and differences between the codes are analysed and discussed. PMID:16604715

  6. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  7. Source convergence diagnostics using Boltzmann entropy criterion: application to different OECD/NEA criticality benchmarks with the 3-D Monte Carlo code Tripoli-4

    SciTech Connect

    Dumonteil, E.; Le Peillet, A.; Lee, Y. K.; Petit, O.; Jouanne, C.; Mazzolo, A.

    2006-07-01

    The measurement of the stationarity of Monte Carlo fission source distributions in k-effective calculations plays a central role in the ability to discriminate between fake and 'true' convergence (in the case of a high dominance ratio, or of loosely coupled systems). Recent theoretical developments have been made in the study of source convergence diagnostics using Shannon entropy. We first recall those results, and we then generalize them using the expression of Boltzmann entropy, highlighting the gain in terms of the variety of physical problems that can be treated. Finally, we present the results of several OECD/NEA benchmarks using the Tripoli-4 Monte Carlo code, enhanced with this new criterion. (authors)
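
    For orientation, the Shannon-entropy diagnostic that the Boltzmann criterion generalizes reduces to a few lines: bin the fission source, form the bin probabilities p_i, and track H = -Σ p_i log2 p_i from cycle to cycle until it settles. A hypothetical sketch:

      -- Hypothetical sketch: Shannon entropy of a binned fission-source
      -- distribution, tracked cycle by cycle as a stationarity diagnostic
      -- in k-effective calculations.
      with Ada.Text_IO;
      with Ada.Numerics.Elementary_Functions;
      procedure Source_Entropy is
         use Ada.Numerics.Elementary_Functions;
         Counts : constant array (1 .. 4) of Natural := (10, 20, 30, 40);
         Total  : Natural := 0;
         H      : Float   := 0.0;
      begin
         for I in Counts'Range loop
            Total := Total + Counts (I);
         end loop;
         for I in Counts'Range loop
            if Counts (I) > 0 then
               declare
                  P : constant Float := Float (Counts (I)) / Float (Total);
               begin
                  H := H - P * Log (P, Base => 2.0);   -- -sum p*log2(p)
               end;
            end if;
         end loop;
         Ada.Text_IO.Put_Line ("H =" & Float'Image (H) & " bits");
      end Source_Entropy;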

  8. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    NASA Astrophysics Data System (ADS)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics “entity” (e.g. the conservative variables of a finite volume, its geometry, etc.) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted to maximize computational efficiency; the code is designed to run on shared-memory multi-core workstations and on distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: in order to improve the usability, maintenance, and enhancement of the code, the documentation has been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files are needed); these comments are parsed by means of the doxygen free software, producing high-quality HTML and LaTeX documentation pages; the distributed versioning system referred

  9. Research on Universal Combinatorial Coding

    PubMed Central

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The conception of universal combinatorial coding is proposed. Relations exist, to a greater or lesser degree, among many coding methods, which suggests that a kind of universal coding method objectively exists and can serve as a bridge connecting many coding methods. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics that cut across the three branches of coding. The relationship between universal combinatorial coding and a variety of coding methods is analyzed, and several application technologies of this coding method are investigated. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value. PMID:24772019

  10. Research on universal combinatorial coding.

    PubMed

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The conception of universal combinatorial coding is proposed. Relations exist, to a greater or lesser degree, among many coding methods, which suggests that a kind of universal coding method objectively exists and can serve as a bridge connecting many coding methods. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics that cut across the three branches of coding. The relationship between universal combinatorial coding and a variety of coding methods is analyzed, and several application technologies of this coding method are investigated. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value. PMID:24772019

  11. TOMO3D: 3-D joint refraction and reflection traveltime tomography parallel code for active-source seismic data—synthetic test

    NASA Astrophysics Data System (ADS)

    Meléndez, A.; Korenaga, J.; Sallarès, V.; Miniussi, A.; Ranero, C. R.

    2015-10-01

    We present a new 3-D traveltime tomography code (TOMO3D) for the modelling of active-source seismic data that uses the arrival times of both refracted and reflected seismic phases to derive the velocity distribution and the geometry of reflecting boundaries in the subsurface. This code is based on its popular 2-D version TOMO2D, from which it inherited the methods to solve the forward and inverse problems. The traveltime calculations are done using a hybrid ray-tracing technique combining the graph and bending methods. The LSQR algorithm is used to perform the iterative regularized inversion to improve the initial velocity and depth models. In order to cope with an increased computational demand due to the incorporation of the third dimension, the forward problem solver, which takes most of the run time (~90 per cent in the test presented here), has been parallelized with a combination of multi-processing and message passing interface standards. This parallelization distributes the ray-tracing and traveltime calculations among available computational resources. The code's performance is illustrated with a realistic synthetic example, including a checkerboard anomaly and two reflectors, which simulates the geometry of a subduction zone. The code is designed to invert for a single reflector at a time. A data-driven layer-stripping strategy is proposed for cases involving multiple reflectors, and it is tested for the successive inversion of the two reflectors. Layers are bound by consecutive reflectors, and an initial velocity model for each inversion step incorporates the results from previous steps. This strategy poses simpler inversion problems at each step, allowing the recovery of strong velocity discontinuities that would otherwise be smoothed out.
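
    The abstract names LSQR as the solver for the iterative regularized inversion. As a hedged sketch (the exact regularization operators used in TOMO3D are not given here), codes of this family typically linearize traveltimes around the current model and hand LSQR a damped least-squares system of the form

      \min_{\Delta \mathbf{m}} \left\| \begin{pmatrix} \mathbf{A} \\ \lambda \mathbf{L} \end{pmatrix} \Delta \mathbf{m} - \begin{pmatrix} \Delta \mathbf{t} \\ \mathbf{0} \end{pmatrix} \right\|_2^2

    where A holds the partial derivatives of the traveltimes with respect to the velocity and reflector-depth parameters, Δt is the vector of traveltime residuals, L is a smoothing/damping operator, and λ trades data fit against model roughness; each layer-stripping step solves one such system.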

  12. Transmission from theory to practice: Experiences using open-source code development and a virtual short course to increase the adoption of new theoretical approaches

    NASA Astrophysics Data System (ADS)

    Harman, C. J.

    2015-12-01

    Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.

  13. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena, and their uncertainty, which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  14. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations includes generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.

  15. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if not designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation is also time-consuming due to incomplete understanding of LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide the model's state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model, so the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80 - 90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming

  16. Platelet aggregation and serum adenosine deaminase (ADA) activity in pregnancy associated with diabetes, hypertension and HIV.

    PubMed

    Leal, Claudio A M; Leal, Daniela B R; Adefegha, Stephen A; Morsch, Vera M; da Silva, José E P; Rezer, João F P; Schrekker, Clarissa M L; Abdalla, Faida H; Schetinger, Maria R C

    2016-07-01

    Platelet aggregation and adenosine deaminase (ADA) activity were evaluated in pregnant women living with some disease conditions including hypertension, diabetes mellitus and human immunodeficiency virus infection. The subject population consisted of 15 non-pregnant healthy women [control group (CG)], 15 women with normal pregnancy (NP), 7 women with hypertensive pregnancy (HP), 10 women with gestational diabetes mellitus (GDM) and 12 women with human immunodeficiency virus-infected pregnancy (HIP). The aggregation of platelets was checked using an optical aggregometer, and serum ADA activity was determined using the colorimetric method. After the addition of 5 µM of the agonist adenosine diphosphate, the percentage of platelet aggregation was significantly (p < 0·05) increased in the NP, HP, GDM and HIP groups when compared with the CG, while the addition of 10 µM of the same agonist caused significant (p < 0·05) elevations in the HP, GDM and HIP groups when compared with the CG. Furthermore, ADA activity was significantly (p < 0·05) enhanced in the NP, HP, GDM and HIP groups when compared with the CG. In this study, the increased platelet aggregation and ADA activity in pregnancy and pregnancy-associated diseases suggest that platelet aggregation and ADA activity could serve as peripheral markers for the development of effective therapy in the maintenance of homeostasis and some inflammatory processes in these pathophysiological conditions. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27273565

  17. Experience with Ada on the F-18 High Alpha Research Vehicle Flight Test Program

    NASA Technical Reports Server (NTRS)

    Regenie, Victoria A.; Earls, Michael; Le, Jeanette; Thomson, Michael

    1992-01-01

    Considerable experience was acquired with Ada at the NASA Dryden Flight Research Facility during the on-going High Alpha Technology Program. In this program, an F-18 aircraft was highly modified by the addition of thrust-vectoring vanes to the airframe. In addition, substantial alteration was made in the original quadruplex flight control system. The result is the High Alpha Research Vehicle. An additional research flight control computer was incorporated in each of the four channels. Software for the research flight control computer was written in Ada. To date, six releases of this software have been flown. This paper provides a detailed description of the modifications to the research flight control system. Efficient ground-testing of the software was accomplished by using simulations that used Ada for portions of their software. These simulations are also described. Modifying and transferring the Ada flight software to the software simulation configuration has allowed evaluation of this language. This paper also discusses such significant issues in using Ada as portability, modifiability, and testability, as well as documentation requirements.

  18. Experience with Ada on the F-18 High Alpha Research Vehicle flight test program

    NASA Technical Reports Server (NTRS)

    Regenie, Victoria A.; Earls, Michael; Le, Jeanette; Thomson, Michael

    1994-01-01

    Considerable experience has been acquired with Ada at the NASA Dryden Flight Research Facility during the on-going High Alpha Technology Program. In this program, an F-18 aircraft has been highly modified by the addition of thrust-vectoring vanes to the airframe. In addition, substantial alteration was made in the original quadruplex flight control system. The result is the High Alpha Research Vehicle. An additional research flight control computer was incorporated in each of the four channels. Software for the research flight control computer was written in Ada. To date, six releases of this software have been flown. This paper provides a detailed description of the modifications to the research flight control system. Efficient ground-testing of the software was accomplished by using simulations that used Ada for portions of their software. These simulations are also described. Modifying and transferring the Ada flight software to the software simulation configuration has allowed evaluation of this language. This paper also discusses such significant issues in using Ada as portability, modifiability, and testability, as well as documentation requirements.

  19. Autologous transplants of Adipose-Derived Adult Stromal (ADAS) cells afford dopaminergic neuroprotection in a model of Parkinson's disease.

    PubMed

    McCoy, Melissa K; Martinez, Terina N; Ruhn, Kelly A; Wrage, Philip C; Keefer, Edward W; Botterman, Barry R; Tansey, Keith E; Tansey, Malú G

    2008-03-01

    Adult adipose contains stromal progenitor cells with neurogenic potential. However, the stability of neuronal phenotypes adopted by Adipose-Derived Adult Stromal (ADAS) cells and whether terminal neuronal differentiation is required for their consideration as alternatives in cell replacement strategies to treat neurological disorders is largely unknown. We investigated whether in vitro neural induction of ADAS cells determined their ability to neuroprotect or restore function in a lesioned dopaminergic pathway. In vitro-expanded naïve or differentiated ADAS cells were autologously transplanted into substantia nigra 1 week after an intrastriatal 6-hydroxydopamine injection. Neurochemical and behavioral measures demonstrated neuroprotective effects of both ADAS grafts against 6-hydroxydopamine-induced dopaminergic neuron death, suggesting that pre-transplantation differentiation of the cells does not determine their ability to survive or neuroprotect in vivo. Therefore, we investigated whether equivalent protection by naïve and neurally-induced ADAS grafts resulted from robust in situ differentiation of both graft types into dopaminergic fates. Immunohistological analyses revealed that ADAS cells did not adopt dopaminergic cell fates in situ, consistent with the limited ability of these cells to undergo terminal differentiation into electrically active neurons in vitro. Moreover, re-exposure of neurally-differentiated ADAS cells to serum-containing medium in vitro confirmed ADAS cell phenotypic instability (plasticity). Lastly, given that gene expression analyses of in vitro-expanded ADAS cells revealed that both naïve and differentiated ADAS cells express potent dopaminergic survival factors, ADAS transplants may have exerted neuroprotective effects by production of trophic factors at the lesion site. ADAS cells may be ideal for ex vivo gene transfer therapies in Parkinson's disease treatment. PMID:18061169

  20. Rational's experience using Ada for very large systems

    NASA Technical Reports Server (NTRS)

    Archer, James E., Jr.; Devlin, Michael T.

    1986-01-01

    The experience using the Rational Environment has confirmed the advantages foreseen when the project was started. Interactive syntactic and semantic information makes a tremendous difference in the ease of constructing programs and making changes to them. The ability to follow semantic references makes it easier to understand existing programs and the impact of changes. The integrated debugger makes it much easier to find bugs and test fixes quickly. Taken together, these facilities have helped greatly in reducing the impact of ongoing maintenance on the ability to produce new code. Similar improvements are anticipated as the same level of integration and interactivity is achieved for configuration management and version control. The environment has also proven useful in introducing personnel to the project and existing personnel to new parts of the system. Personnel benefit from the assistance with syntax and semantics; everyone benefits from the ability to traverse and understand the structure of unfamiliar software. It is often possible for someone completely unfamiliar with a body of code to use these facilities to understand it well enough to successfully diagnose and fix bugs in a matter of minutes.

  1. Code System to Solve the Few-Group Neutron Diffusion Equation Utilizing the Nodal Expansion Method (NEM) for Eigenvalue, Adjoint, and Fixed-Source

    2004-04-21

    Version 04 NESTLE solves the few-group neutron diffusion equation utilizing the NEM. The NESTLE code can solve the eigenvalue (criticality), eigenvalue adjoint, external fixed-source steady-state, and external fixed-source or eigenvalue initiated transient problems. The eigenvalue problem allows criticality searches to be completed, and the external fixed-source steady-state problem can search to achieve a specified power level. Transient problems model delayed neutrons via precursor groups. Several core properties can be input as time dependent. Two or four energy groups can be utilized, with all energy groups being thermal groups (i.e., upscattering exists) if desired. Core geometries modeled include Cartesian and hexagonal. Three-, two-, and one-dimensional models can be utilized with various symmetries. The thermal conditions predicted by the thermal-hydraulic model of the core are used to correct cross sections for temperature and density effects. Cross sections are parameterized by color, control rod state (i.e., in or out), and burnup, allowing fuel depletion to be modeled. Either a macroscopic or microscopic model may be employed.
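
    For reference, the few-group neutron diffusion eigenvalue problem that nodal-expansion solvers of this kind discretize has the standard textbook form (a sketch, not NESTLE's exact notation):

      -\nabla \cdot D_g \nabla \phi_g + \Sigma_{r,g}\,\phi_g
        = \frac{\chi_g}{k_{\mathrm{eff}}} \sum_{g'} \nu\Sigma_{f,g'}\,\phi_{g'}
          + \sum_{g' \neq g} \Sigma_{s,g' \to g}\,\phi_{g'}

    where φ_g is the group-g scalar flux, D_g the diffusion coefficient, Σ_r,g the removal cross section, and k_eff the criticality eigenvalue; the external fixed-source problems mentioned above replace the fission eigenvalue term with a prescribed source term.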

  2. Implementation of an Ada real-time executive: A case study

    NASA Technical Reports Server (NTRS)

    Laird, James D.; Burton, Bruce A.; Koppes, Mary R.

    1986-01-01

    Current Ada language implementations and runtime environments are immature, unproven, and a key risk area for real-time embedded computer systems (ECS). A test-case environment is provided in which the concerns of the real-time ECS community are addressed. A priority-driven executive is selected to be implemented in the Ada programming language. The model selected is representative of real-time executives tailored for embedded systems used in missile, spacecraft, and avionics applications. An Ada-based design methodology is utilized, and two designs are considered. The first of these designs requires the use of vendor-supplied runtime and tasking support. An alternative high-level design is also considered for an implementation requiring no vendor-supplied runtime or tasking support. The former approach is carried through to implementation.

  3. An enhanced Ada run-time system for real-time embedded processors

    NASA Technical Reports Server (NTRS)

    Sims, J. T.

    1991-01-01

    An enhanced Ada run-time system has been developed to support real-time embedded processor applications. The primary focus of this development effort has been on the tasking system and the memory management facilities of the run-time system. The tasking system has been extended to support efficient and precise periodic task execution as required for control applications. Event-driven task execution providing a means of task-asynchronous control and communication among Ada tasks is supported in this system. Inter-task control is even provided among tasks distributed on separate physical processors. The memory management system has been enhanced to provide object allocation and protected access support for memory shared between disjoint processors, each of which is executing a distinct Ada program.
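
    The precise periodic task execution described above is the kind of service that later revisions of the language express directly with an absolute delay. A minimal sketch in modern Ada (note: 'delay until' and the Ada.Real_Time package arrived with Ada 95, so the 1991 run-time system described here had to provide equivalent support itself; all names below are hypothetical):

      with Ada.Text_IO;
      with Ada.Real_Time; use Ada.Real_Time;

      procedure Periodic_Demo is

         task Sampler;  --  a single periodic task, as in control applications

         task body Sampler is
            Period     : constant Time_Span := Milliseconds (100);
            Next_Start : Time := Clock;
         begin
            for Cycle in 1 .. 10 loop
               Ada.Text_IO.Put_Line ("control cycle" & Integer'Image (Cycle));
               Next_Start := Next_Start + Period;
               delay until Next_Start;  --  absolute delay avoids cumulative drift
            end loop;
         end Sampler;

      begin
         null;  --  the main procedure simply awaits Sampler's termination
      end Periodic_Demo;

    The absolute delay resumes the task at fixed instants rather than a fixed interval after it last ran, which is the precision that periodic control execution requires.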

  4. Car dealer kills sale, but did not violate ADA, court says.

    PubMed

    1997-11-28

    The 5th U.S. Circuit Court of Appeals rejected three of four claims brought by [name removed] W. [Name removed] of Texas, whose deal for a used pickup truck fell through when the car dealer learned [name removed] had AIDS. [Name removed], who has since died, sued Landmark Chevrolet for slander, intentional infliction of emotional distress, and violations of the Americans with Disabilities Act (ADA) and the Texas Deceptive Trade Practices Act. A district court ruled in favor of Landmark; however, [name removed]'s wife appealed to the 5th U.S. Circuit Court. The appeals court reversed on the issue of slander but agreed that Landmark had not violated the ADA or caused emotional distress. The higher court noted that plaintiffs seeking relief under Title II of the ADA must prove that an immediate threat of harm exists and that there is a risk that s(he) will be harmed again. PMID:11364877

  5. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ - supplementary report

    SciTech Connect

    Dunning, Jr, D E; Pleasant, J C; Killough, G G

    1980-05-01

    The purpose of this report is to describe revisions in the SFACTOR computer code and to provide useful documentation for that program. The SFACTOR computer code has been developed to implement current methodologies for computing the average dose equivalent rate S(X ← Y) to specified target organs in man due to 1 µCi of a given radionuclide uniformly distributed in designated source organs. The SFACTOR methodology is largely based upon that of Snyder; however, it has been expanded to include components of S from alpha and spontaneous fission decay, in addition to electron and photon radiations. With this methodology, S-factors can be computed for any radionuclide for which decay data are available. The tabulations in Appendix II provide a reference compilation of S-factors for several dosimetrically important radionuclides which are not available elsewhere in the literature. These S-factors are calculated for an adult with characteristics similar to those of the International Commission on Radiological Protection's Reference Man. Corrections to tabulations from Dunning are presented in Appendix III, based upon the methods described in Section 2.3. 10 refs.
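
    The Snyder-based methodology referred to above computes S-factors in the standard MIRD form, sketched here from the general literature (SFACTOR's own notation may differ):

      S(X \leftarrow Y) = k \sum_i \Delta_i \, \frac{\phi_i(X \leftarrow Y)}{m_X}

    where Δ_i is the mean energy emitted per nuclear transition as radiation type i, φ_i(X ← Y) is the fraction of that energy emitted in source organ Y that is absorbed in target organ X, m_X is the mass of the target organ, and k is a unit-conversion constant; the revision described in the report extends the sum over i to alpha and spontaneous-fission radiations.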

  6. Diagnostic value of sputum adenosine deaminase (ADA) level in pulmonary tuberculosis

    PubMed Central

    Binesh, Fariba; Jalali, Hadi; Zare, Mohammad Reza; Behravan, Farhad; Tafti, Arefeh Dehghani; Behnaz, Fatemah; Tabatabaee, Mohammad; Shahcheraghi, Seyed Hossein

    2016-01-01

    Introduction: Tuberculosis is still a considerable health problem in many countries. Rapid diagnosis of this disease is important, and adenosine deaminase (ADA) has been used as a diagnostic test. The aim of this study was to assess the diagnostic value of ADA in the sputum of patients with pulmonary tuberculosis. Methods: The current study included 40 patients with pulmonary tuberculosis (culture positive, smear ±) and 42 patients with non-tuberculosis pulmonary diseases (culture negative). ADA was measured in all of the samples. Results: The median value of ADA was 2.94 (4.2) U/L in non-tuberculosis patients and 4.01 (6.54) U/L in tuberculosis patients, but this difference was not statistically significant (p=0·100). The cut-off point of 3.1 U/L had a sensitivity of 61% and a specificity of 53%, the cut-off point of 2.81 U/L had a sensitivity of 64% and a specificity of 50%, and the cut-off point of 2.78 U/L had a sensitivity of 65% and a specificity of 48%. The positive predictive values for cut-off points of 3.1, 2.81 and 2.78 U/L were 55.7%, 57.44% and 69.23%, respectively. The negative predictive values for the above-mentioned cut-off points were 56.75%, 57.14% and 55.88%, respectively. Conclusion: Our results showed that the sputum ADA test is neither specific nor sensitive. Because of its low sensitivity and specificity, determination of sputum ADA for the diagnosis of pulmonary tuberculosis is not recommended. PMID:27482515

  7. The cel3 gene of Agaricus bisporus codes for a modular cellulase and is transcriptionally regulated by the carbon source.

    PubMed Central

    Chow, C M; Yagüe, E; Raguz, S; Wood, D A; Thurston, C F

    1994-01-01

    A 52-kDa protein, CEL3, has been separated from the culture filtrate of Agaricus bisporus during growth on cellulose. A PCR-derived probe was made with a degenerate oligodeoxynucleotide derived from the amino acid sequence of a CEL3 CNBr cleavage product and was used to select cel3 cDNA clones from an A. bisporus cDNA library. Two allelic cDNAs were isolated. They showed 98.8% identity of their nucleotide sequences. The deduced amino acid sequence and domain architecture of CEL3 showed a high degree of similarity to those of cellobiohydrolase II of Trichoderma reesei. Functional expression of cel3 cDNA in Saccharomyces cerevisiae was achieved by placing it under the control of a constitutive promoter and fusing it to the yeast invertase signal sequence. Recombinant CEL3 secreted by yeast showed enzymatic activity towards crystalline cellulose. At long reaction times, CEL3 was also able to degrade carboxymethyl cellulose. Northern (RNA) analysis showed that cel3 gene expression was induced by cellulose and repressed by glucose, fructose, 2-deoxyglucose, and lactose. Glycerol, mannitol, sorbitol, and maltose were neutral carbon sources. Nuclear run-on analysis showed that the rate of synthesis of cel3 mRNA in cellulose-grown cultures was 13 times higher than that in glucose-grown cultures. A low basal rate of cel3 mRNA synthesis was observed in the nuclei isolated from glucose-grown mycelia. PMID:8085821

  8. ADA, the Programming Language of Choice for the UPMSat-2 Satellite

    NASA Astrophysics Data System (ADS)

    Garrido, Jorge; Zamorano, Juan; de la Puente, Juan A.; Alonso, Alejandro; Salazar, Emilio

    2015-09-01

    The proper selection of development mechanisms and tools is essential for the final success of any engineering project. This is also true when it comes to software development. Furthermore, when the system shows very specific and hard to meet requirements, as it happens for high-integrity real-time systems, the appropriate selection is crucial. For this kind of systems, Ada has proven to be a successful companion, and satellites are not an exception. The paper presents the reasons behind the selection of Ada for the UPMSat-2 development, along with the experience and examples on its usage.

  9. An Ada implementation of the network manager for the advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail A.

    1986-01-01

    From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.
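
    As an illustration of the data abstraction credited above, here is a minimal Ada sketch of a package exporting a private type, in the spirit of a network manager's node-status table (all names are hypothetical, not taken from the AIPS code):

      package Node_Tables is

         type Node_Status is (Healthy, Degraded, Failed);

         type Node_Table is limited private;  --  clients see operations only

         procedure Set_Status (Table  : in out Node_Table;
                               Node   : in Positive;
                               Status : in Node_Status);

         function Status_Of (Table : Node_Table;
                             Node  : Positive) return Node_Status;

      private
         Max_Nodes : constant := 16;
         type Status_Array is array (1 .. Max_Nodes) of Node_Status;
         type Node_Table is limited record
            Statuses : Status_Array := (others => Healthy);  --  safe default
         end record;
      end Node_Tables;

      package body Node_Tables is

         procedure Set_Status (Table  : in out Node_Table;
                               Node   : in Positive;
                               Status : in Node_Status) is
         begin
            Table.Statuses (Node) := Status;
         end Set_Status;

         function Status_Of (Table : Node_Table;
                             Node  : Positive) return Node_Status is
         begin
            return Table.Statuses (Node);
         end Status_Of;

      end Node_Tables;

    The representation in the private part can change freely without disturbing client code, which is the kind of flexibility despite strong typing that the author describes.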

  10. The Spectrum Sensing Algorithm Based AdaBoost in Cognitive Radio

    NASA Astrophysics Data System (ADS)

    Tian, Deyong; Wang, Xin

    To address the low detection rate of the primary user in the cognitive radio environment, we propose a spectrum sensing method based on AdaBoost for the case of low SNR. In this paper, a set of features of the received signal spectrum is first calculated, and discriminant feature vectors are extracted as training and testing samples for classification. Finally, we utilize the trained AdaBoost classifier to detect the primary user. Test results show that the proposed algorithm is not affected by noise uncertainty and achieves higher classification and detection performance compared with ANN, SVM and maximum-minimum eigenvalue (MME) methods.

  11. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships. PMID:15768716

  12. Space and Terrestrial Power System Integration Optimization Code BRMAPS for Gas Turbine Space Power Plants With Nuclear Reactor Heat Sources

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2007-01-01

    In view of the difficult times the US and global economies are experiencing today, funds for the development of advanced fission reactors nuclear power systems for space propulsion and planetary surface applications are currently not available. However, according to the Energy Policy Act of 2005 the U.S. needs to invest in developing fission reactor technology for ground based terrestrial power plants. Such plants would make a significant contribution toward drastic reduction of worldwide greenhouse gas emissions and associated global warming. To accomplish this goal the Next Generation Nuclear Plant Project (NGNP) has been established by DOE under the Generation IV Nuclear Systems Initiative. Idaho National Laboratory (INL) was designated as the lead in the development of VHTR (Very High Temperature Reactor) and HTGR (High Temperature Gas Reactor) technology to be integrated with MMW (multi-megawatt) helium gas turbine driven electric power AC generators. However, the advantages of transmitting power in high voltage DC form over large distances are also explored in the seminar lecture series. As an attractive alternate heat source the Liquid Fluoride Reactor (LFR), pioneered at ORNL (Oak Ridge National Laboratory) in the mid 1960's, would offer much higher energy yields than current nuclear plants by using an inherently safe energy conversion scheme based on the Thorium --> U233 fuel cycle and a fission process with a negative temperature coefficient of reactivity. The power plants are to be sized to meet electric power demand during peak periods and also for providing thermal energy for hydrogen (H2) production during "off peak" periods. This approach will both supply electric power by using environmentally clean nuclear heat which does not generate green house gases, and also provide a clean fuel H2 for the future, when, due to increased global demand and the decline in discovering new deposits, our supply of liquid fossil fuels will have been used up. This is

  13. 28 CFR Appendix B to Part 36 - Analysis and Commentary on the 2010 ADA Standards for Accessible Design

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... final rules for title II (28 CFR part 35) and title III (28 CFR part 36) of the Americans with... the Department's revised ADA title II regulation, 28 CFR 35.104 Definitions, the Department defines... consist of the 2004 ADA Accessibility Guidelines (ADAAG) and the requirements contained in 28 CFR...

  14. 28 CFR Appendix B to Part 36 - Analysis and Commentary on the 2010 ADA Standards for Accessible Design

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... final rules for title II (28 CFR part 35) and title III (28 CFR part 36) of the Americans with... the Department's revised ADA title II regulation, 28 CFR 35.104 Definitions, the Department defines... consist of the 2004 ADA Accessibility Guidelines (ADAAG) and the requirements contained in 28 CFR...

  15. 28 CFR Appendix B to Part 36 - Analysis and Commentary on the 2010 ADA Standards for Accessible Design

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... final rules for title II (28 CFR part 35) and title III (28 CFR part 36) of the Americans with... the Department's revised ADA title II regulation, 28 CFR 35.104 Definitions, the Department defines... consist of the 2004 ADA Accessibility Guidelines (ADAAG) and the requirements contained in 28 CFR...

  16. 28 CFR Appendix B to Part 36 - Analysis and Commentary on the 2010 ADA Standards for Accessible Design

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... final rules for title II (28 CFR part 35) and title III (28 CFR part 36) of the Americans with... the Department's revised ADA title II regulation, 28 CFR 35.104 Definitions, the Department defines... consist of the 2004 ADA Accessibility Guidelines (ADAAG) and the requirements contained in 28 CFR...

  17. The repository-based software engineering program: Redefining AdaNET as a mainstream NASA source

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Repository-based Software Engineering Program (RBSE) is described to inform and update senior NASA managers about the program. Background and historical perspective on software reuse and RBSE for NASA managers who may not be familiar with these topics are provided. The paper draws upon and updates information from the RBSE Concept Document, baselined by NASA Headquarters, Johnson Space Center, and the University of Houston - Clear Lake in April 1992. Several of NASA's software problems and what RBSE is now doing to address those problems are described. Also, next steps to be taken to derive greater benefit from this Congressionally-mandated program are provided. The section on next steps describes the need to work closely with other NASA software quality, technology transfer, and reuse activities and focuses on goals and objectives relative to this need. RBSE's role within NASA is addressed; however, there is also the potential for systematic transfer of technology outside of NASA in later stages of the RBSE program. This technology transfer is discussed briefly.

  18. 1972-73 Enrollment and Attendance, with a History of Enrollment and ADA from 1963.

    ERIC Educational Resources Information Center

    Los Angeles Community Coll. District, CA. Div. of Educational Planning and Development.

    The 1972-73 Enrollment and Attendance Report provides a history of enrollment and average daily attendance (ADA) for the District and each college for the last ten years. In some instances, the data presented are not complete for the full ten years due to the lack of historical records. In others, a deliberate attempt was made only to summarize…

  19. Implementation of Ada protocols on Mil-STD-1553 B data bus

    NASA Technical Reports Server (NTRS)

    Ruhman, Smil; Rosemberg, Flavia

    1986-01-01

    Standardization activity of data communication in avionic systems started in 1968 for the purpose of total system integration and the elimination of heavy wire bundles carrying signals between various subassemblies. The growing complexity of avionic systems is straining the capabilities of MIL-STD-1553 B (first issued in 1973), but a much greater challenge to it is posed by Ada, the standard language adopted for real-time, computer-embedded systems. Hardware implementation of Ada communication protocols in a contention/token bus or token ring network is proposed. However, during the transition period when the current command/response multiplex data bus is still flourishing and the development environment for distributed multi-computer Ada systems is as yet lacking, a temporary accommodation of the standard language with the standard bus could be very useful and even highly desirable. By concentrating all status information and decisions at the bus controller, it was found to be possible to construct an elegant and efficient hardware implementation of the Ada protocols at the bus interface. This solution is discussed.

  20. EVALUATION OF THE ADA TECHNOLOGIES' ELECTRO-DECON PROCESS TO REMOVE RADIOLOGICAL CONTAMINATION

    SciTech Connect

    Pao, Jenn-Hai; Demmer, Rick L.; Argyle, Mark D.; Veatch, Brad D.

    2003-02-27

    A surface decontamination system featuring the use of ADA's electrochemical process was tested and evaluated. The process can be flexibly deployed by using an electrolyte delivery system that has been demonstrated to be reliable and effective. Experimental results demonstrate the effectiveness of this system for the surface decontamination of radiologically contaminated stainless steel.

  1. Lessons learned: Managing the development of a corporate Ada training project

    NASA Technical Reports Server (NTRS)

    Blackmon, Linda F.

    1986-01-01

    The management lessons learned during the implementation of a corporate mandate to develop and deliver an effective Ada training program to all divisions are discussed. The management process involved in obtaining cooperation from all levels in the development of a corporate-wide project is described. The problem areas are identified along with some possible solutions.

  2. Accommodation Hell, or, To Hell with Accommodation: The ADA and the Administration.

    ERIC Educational Resources Information Center

    Robinson, William L.

    This material is designed to help faculty understand the requirements of the Americans with Disabilities Act of 1990 (ADA). A brief overview notes three key considerations: the definition of disability, reasonable accommodation, and undue hardship, and then discusses faculty liability and responsibility for discriminatory acts. The balance of the…

  3. Health Care and ADA Language Education Programs. Cooperative Demonstration Program: High Technology. Final Performance Report.

    ERIC Educational Resources Information Center

    Marion County Schools, Fairmont, WV.

    A project implemented cooperative training programs in the three occupational areas: ADA computer language use; respiratory therapy technician; and hospital pharmacy technician. The project's purpose was to demonstrate high technology training programs for adults as a cooperative effort among the West Virginia Department of Education, local…

  4. Run-time implementation issues for real-time embedded Ada

    NASA Technical Reports Server (NTRS)

    Maule, Ruth A.

    1986-01-01

    A motivating factor in the development of Ada as the Department of Defense standard language was the high cost of embedded system software development. It was with embedded system requirements in mind that many of the features of the language were incorporated. Yet it is the designers of embedded systems that seem to comprise the majority of the Ada community dissatisfied with the language. There are a variety of reasons for this dissatisfaction, but many seem to be related in some way to the Ada run-time support system. Some of the areas in which implementation inconsistencies were found to have the greatest impact on performance from the standpoint of real-time systems are presented. In particular, a large part of the duties of the tasking supervisor are subject to the design decisions of the implementer. These include scheduling, rendezvous, delay processing, and task activation and termination. Some of the more general issues presented include time and space efficiencies, generic expansions, memory management, pragmas, and tracing features. As validated compilers become available for bare computer targets, it is important for a designer to be aware that, at least for many real-time issues, all validated Ada compilers are not created equal.
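
    Two of the implementer-defined areas listed above, rendezvous and delay processing, meet in the selective accept. A small self-contained Ada sketch (hypothetical names) of the construct whose cost and timing behavior differ between run-time systems:

      with Ada.Text_IO; use Ada.Text_IO;

      procedure Rendezvous_Demo is

         task Server is
            entry Request (Value : in Integer);
         end Server;

         task body Server is
         begin
            loop
               select
                  accept Request (Value : in Integer) do
                     --  The caller stays blocked only for this accept body.
                     Put_Line ("served" & Integer'Image (Value));
                  end Request;
               or
                  delay 1.0;  --  delay alternative: run-time delay processing
                  exit;       --  no caller within a second: shut down
               end select;
            end loop;
         end Server;

      begin
         for I in 1 .. 3 loop
            Server.Request (I);  --  each entry call is one rendezvous
         end loop;
      end Rendezvous_Demo;

    How cheaply the tasking supervisor dispatches the accept, and how precisely it honors the delay alternative, are exactly the implementation-dependent costs the paper warns real-time designers about.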

  5. 78 FR 34095 - Adequacy Status of the Idaho, Northern Ada County PM10

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-06

    ... less (PM 10 ), nitrogen oxides (NOx), and volatile organic compounds (VOC) for the years 2008, 2015 and... Northern Ada County PM10 Maintenance Area Budget:

      Year   PM10   NOx    VOC
      2008   31.0   29.5   12.6
      2015   42.9   29.5   ...

  6. How Libraries Must Comply with the Americans with Disabilities Act (ADA).

    ERIC Educational Resources Information Center

    Foos, Donald D., Comp.; Pack, Nancy C., Comp.

    The Americans with Disabilities Act (ADA) directs public and private libraries--academic, public, school, and special--to provide services to people with disabilities that are equal to services provided to citizens without disabilities. Six chapters in this book provide information to help library administrators and staff to fully understand the…

  7. Section 504, the ADA, and Public Schools: What Educators Need To Know.

    ERIC Educational Resources Information Center

    Smith, Tom E. C.

    2001-01-01

    This article provides an overview of the requirements of Section 504 of the Rehabilitation Act of 1973 and the Americans with Disabilities Act (ADA), both civil rights laws that require schools to provide eligible students with equal access to a free, appropriate education and to extracurricular activities. Actions schools can take to ensure…

  8. 76 FR 57013 - Recordkeeping and Reporting Requirements Under Title VII, the ADA, and GINA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-15

    ... proposed modifications of its recordkeeping and reporting provisions under title VII, the ADA, and GINA. (76 FR 31892, June 2, 2011). No requests to present oral testimony at a hearing concerning the... COMMISSION 29 CFR Part 1602 RIN 3046-AA89 Recordkeeping and Reporting Requirements Under Title VII, the...

  9. Evaluation of the scale dependent dynamic SGS model in the open source code caffa3d.MBRi in wall-bounded flows

    NASA Astrophysics Data System (ADS)

    Draper, Martin; Usera, Gabriel

    2015-04-01

    The Scale Dependent Dynamic Model (SDDM) has been widely validated in large-eddy simulations using pseudo-spectral codes [1][2][3]. The scale dependency, particularly the potential law, has been proved also in a priori studies [4][5]. To the authors' knowledge there have been only a few attempts to use the SDDM in finite difference (FD) and finite volume (FV) codes [6][7], finding some improvements with the dynamic procedures (scale independent or scale dependent approach), but not showing the behavior of the scale-dependence parameter when using the SDDM. The aim of the present paper is to evaluate the SDDM in the open source code caffa3d.MBRi, an updated version of the code presented in [8]. caffa3d.MBRi is a FV code, second-order accurate, parallelized with MPI, in which the domain is divided into unstructured blocks of structured grids. To accomplish this, 2 cases are considered: flow between flat plates and flow over a rough surface with the presence of a model wind turbine, taking for this case the experimental data presented in [9]. In both cases the standard Smagorinsky Model (SM), the Scale Independent Dynamic Model (SIDM) and the SDDM are tested. As presented in [6][7], slight improvements are obtained with the SDDM. Nevertheless, the behavior of the scale-dependence parameter supports the generalization of the dynamic procedure proposed in the SDDM, particularly taking into account that no explicit filter is used (the implicit filter is unknown). [1] F. Porté-Agel, C. Meneveau, M.B. Parlange. "A scale-dependent dynamic model for large-eddy simulation: application to a neutral atmospheric boundary layer". Journal of Fluid Mechanics, 2000, 415, 261-284. [2] E. Bou-Zeid, C. Meneveau, M. Parlange. "A scale-dependent Lagrangian dynamic model for large eddy simulation of complex turbulent flows". Physics of Fluids, 2005, 17, 025105 (18p). [3] R. Stoll, F. Porté-Agel. "Dynamic subgrid-scale models for momentum and scalar fluxes in large-eddy simulations of

  10. Development and implementation in the Monte Carlo code PENELOPE of a new virtual source model for radiotherapy photon beams and portal image calculation.

    PubMed

    Chabert, I; Barat, E; Dautremer, T; Montagu, T; Agelou, M; Croc de Suray, A; Garcia-Hernandez, J C; Gempp, S; Benkreira, M; de Carlan, L; Lazaro, D

    2016-07-21

    This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r_s) in the PS plane, its energy (E), and its polar and azimuthal angles (φ_d and θ_d), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated histograms were built by storing E, r_s, φ_d and θ_d values. Five different adaptive binning schemes were studied to construct the 4D histograms of the VSMs, to ensure efficient histogram handling as well as an accurate reproduction of the E, r_s, φ_d and θ_d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r_s, φ_d and θ_d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme. PMID:27353090
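
    The global gamma index used in these comparisons combines a dose-difference tolerance ΔD and a distance-to-agreement tolerance Δd; in the standard formulation (sketched from the general literature, not from this paper):

      \gamma(\mathbf{r}_e) = \min_{\mathbf{r}_r} \sqrt{ \frac{\lVert \mathbf{r}_r - \mathbf{r}_e \rVert^2}{\Delta d^2} + \frac{\left( D_r(\mathbf{r}_r) - D_e(\mathbf{r}_e) \right)^2}{\Delta D^2} }

    where D_r and D_e are the reference and evaluated dose distributions, and a point passes when γ ≤ 1. The 0 mm distance criteria quoted above effectively disable the spatial search, so the tests reduce to point-by-point dose-difference checks against the stated percentages.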

  11. Development and implementation in the Monte Carlo code PENELOPE of a new virtual source model for radiotherapy photon beams and portal image calculation

    NASA Astrophysics Data System (ADS)

    Chabert, I.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; Croc de Suray, A.; Garcia-Hernandez, J. C.; Gempp, S.; Benkreira, M.; de Carlan, L.; Lazaro, D.

    2016-07-01

    This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r_s) in the PS plane, its energy (E), and its polar and azimuthal angles (φ_d and θ_d), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated histograms were built by storing E, r_s, φ_d and θ_d values. Five different adaptive binning schemes were studied to construct the 4D histograms of the VSMs, to ensure efficient histogram handling as well as an accurate reproduction of the E, r_s, φ_d and θ_d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r_s, φ_d and θ_d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%–0 mm and a 2%–0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%–0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.

  12. Array analyses of volcanic earthquakes and tremor recorded at Las Cañadas caldera (Tenerife Island, Spain) during the 2004 seismic activation of Teide volcano

    NASA Astrophysics Data System (ADS)

    Almendros, Javier; Ibáñez, Jesús M.; Carmona, Enrique; Zandomeneghi, Daria

    2007-02-01

    We analyze data from three seismic antennas deployed in Las Cañadas caldera (Tenerife) during May-July 2004. The period selected for the analysis (May 12-31, 2004) constitutes one of the most active seismic episodes reported in the area, except for the precursory seismicity accompanying historical eruptions. Most seismic signals recorded by the antennas were volcano-tectonic (VT) earthquakes. They usually exhibited low magnitudes, although some of them were large enough to be felt at nearby villages. A few long-period (LP) events, generally associated with the presence of volcanic fluids in the medium, were also detected. Furthermore, we detected the appearance of a continuous tremor that started on May 18 and lasted for several weeks, at least until the end of the recording period. It is the first time that volcanic tremor has been reported at Teide volcano. This tremor was a small-amplitude, narrow-band signal with central frequency in the range 1-6 Hz. It was detected at the three antennas located in Las Cañadas caldera. We applied the zero-lag cross-correlation (ZLCC) method to estimate the propagation parameters (back-azimuth and apparent slowness) of the recorded signals. For VT earthquakes, we also determined the S-P times and source locations. Our results indicate that at the beginning of the analyzed period most earthquakes clustered in a deep volume below the northwest flank of Teide volcano. The similarity of the propagation parameters obtained for LP events and these early VT earthquakes suggests that LP events might also originate within the source volume of the VT cluster. During the last two weeks of May, VT earthquakes were generally shallower, and spread all over Las Cañadas caldera. Finally, the analysis of the tremor wavefield points to the presence of multiple, low-energy sources acting simultaneously. We propose a model to explain the pattern of seismicity observed at Teide volcano. The process started in early April with a deep magma

  13. Description of real-time Ada software implementation of a power system monitor for the Space Station Freedom PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Ludwig, Kimberly; Mackin, Michael; Wright, Theodore

    1991-01-01

    The authors describe the Ada language software developed to perform the electrical power system monitoring functions for the NASA Lewis Research Center's Power Management and Distribution (PMAD) DC testbed. The results of the effort to implement this monitor are presented. The PMAD DC testbed is a reduced-scale prototype of the electric power system to be used in Space Station Freedom. The power is controlled by smart switches known as power control components (or switchgear). The power control components are currently coordinated by five Compaq 386/20e computers connected through an 802.4 local area network. The power system monitor algorithm comprises several functions, including periodic data acquisition, data smoothing, system performance analysis, and status reporting. Data are collected from the switchgear sensors every 100 ms, then passed through a 2-Hz digital filter. System performance analysis includes power interruption and overcurrent detection. The system monitor required a hardware timer interrupt to activate the data acquisition function. The execution time of the code was optimized by using an assembly language routine. The routine allows direct vectoring of the processor to Ada language procedures that perform periodic control activities.
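
    A minimal Ada sketch of the acquire-filter-check cycle described above, using a first-order low-pass stage as a stand-in for the 2-Hz digital filter (the paper does not give the filter design; all constants and names here are illustrative):

      with Ada.Text_IO;

      procedure Monitor_Demo is

         --  First-order low-pass stand-in for a 2-Hz filter on 100 ms samples.
         Dt    : constant Float := 0.1;   --  sample period, s
         Fc    : constant Float := 2.0;   --  cutoff frequency, Hz
         RC    : constant Float := 1.0 / (2.0 * 3.14159 * Fc);
         Alpha : constant Float := Dt / (RC + Dt);

         Limit    : constant Float := 30.0;  --  hypothetical overcurrent trip, A
         Filtered : Float := 0.0;

         Samples : constant array (1 .. 5) of Float :=
           (10.0, 12.0, 11.0, 45.0, 44.0);   --  synthetic current readings

      begin
         for I in Samples'Range loop
            Filtered := Filtered + Alpha * (Samples (I) - Filtered);
            if Filtered > Limit then
               Ada.Text_IO.Put_Line ("overcurrent at sample" & Integer'Image (I));
            end if;
         end loop;
      end Monitor_Demo;

    Smoothing before the threshold check, as in the testbed monitor, suppresses single-sample glitches at the cost of a short detection delay.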

  14. Multicast Reduction Network Source Code

    SciTech Connect

    Lee, G.

    2006-12-19

    MRNet is a software tree-based overlay network developed at the University of Wisconsin, Madison that provides a scalable communication mechanism for parallel tools. MRNet uses a tree topology of networked processes between a user tool and distributed tool daemons. This tree topology allows scalable multicast communication from the tool to the daemons. The internal nodes of the tree can be used to distribute computation and analysis on data sent from the tool daemons to the tool. This release covers minor implementation changes to port this software to the BlueGene/L architecture and for use with a new implementation of the Dynamic Probe Class Library.

  15. Multicast Reduction Network Source Code

    2006-12-19

    MRNet is a software tree-based overlay network developed at the University of Wisconsin, Madison that provides a scalable communication mechanism for parallel tools. MRNet uses a tree topology of networked processes between a user tool and distributed tool daemons. This tree topology allows scalable multicast communication from the tool to the daemons. The internal nodes of the tree can be used to distribute computation and analysis on data sent from the tool daemons to the tool. This release covers minor implementation changes to port this software to the BlueGene/L architecture and for use with a new implementation of the Dynamic Probe Class Library.

  16. Measuring Up: Lakeland Community College Report of the ADA Task Force. A Self Evaluation of College Services, Facilities, Programs, and Activities.

    ERIC Educational Resources Information Center

    Lee, Martha C.; Mastrangelo, Eliz. B.

    Prepared by the Americans with Disabilities Act (ADA) Task Force at Lakeland Community College (LCC) in Ohio, this report assesses LCC's compliance with ADA provisions and presents recommendations concerning projects to be undertaken. Section I provides an introduction to the ADA and its impact at LCC. Section II describes the self-evaluation…

  17. Evaluation of the area factor used in the RESRAD code for the estimation of airborne contaminant concentrations of finite area sources

    SciTech Connect

    Chang, Y.S.; Yu, C.; Wang, S.K.

    1998-07-01

    The area factor is used in the RESRAD code to estimate the airborne contaminant concentrations for a finite area of contaminated soils. The area factor model used in RESRAD version 5.70 and earlier (referred to as the old area factor) was a simple, but conservative, mixing model that tended to overestimate the airborne concentrations of radionuclide contaminants. An improved and more realistic model for the area factor (referred to here as the new area factor) is described in this report. The new area factor model is designed to reflect site-specific soil characteristics and meteorological conditions. The site-specific parameters considered include the size of the source area, average particle diameter, and average wind speed. Other site-specific parameters (particle density, atmospheric stability, raindrop diameter, and annual precipitation rate) were assumed to be constant. The model uses the Gaussian plume model combined with contaminant removal processes, such as dry and wet deposition of particulates. Area factors estimated with the new model are compared with old area factors that were based on the simple mixing model. In addition, sensitivity analyses are conducted for parameters assumed to be constant. The new area factor model has been incorporated into RESRAD version 5.75 and later.
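
    The Gaussian plume model that the new area factor builds on has, in its standard ground-reflected form, the textbook expression (a sketch; RESRAD's implementation additionally depletes the plume for the dry and wet deposition mentioned above):

      \chi(x, y, z) = \frac{Q}{2 \pi u \, \sigma_y \sigma_z}
        \exp\!\left( -\frac{y^2}{2 \sigma_y^2} \right)
        \left[ \exp\!\left( -\frac{(z - H)^2}{2 \sigma_z^2} \right)
             + \exp\!\left( -\frac{(z + H)^2}{2 \sigma_z^2} \right) \right]

    where Q is the contaminant emission rate, u the mean wind speed, H the release height, and σ_y, σ_z the dispersion coefficients that grow with downwind distance x according to atmospheric stability; integrating such kernels over a finite contaminated area, with Q depleted for deposition, yields the area factor.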

  18. Tale of a multifaceted co-activator, hADA3: from embryogenesis to cancer and beyond.

    PubMed

    Chand, Vaibhav; Nandi, Deeptashree; Mangla, Anita Garg; Sharma, Puneet; Nag, Alo

    2016-09-01

    Human ADA3, the evolutionarily conserved transcriptional co-activator, remains the unified part of multiple cellular functions, including regulation of nuclear receptor functions, cell proliferation, apoptosis, senescence, chromatin remodelling, genomic stability and chromosomal maintenance. The past decade has witnessed exciting findings leading to considerable expansion in research related to the biology and regulation of hADA3. Embryonic lethality in homozygous knockout Ada3 mouse signifies the importance of this gene product during early embryonic development. Moreover, the fact that it is a novel target of Human Papillomavirus E6 oncoprotein, one of the most prevalent causal agents behind cervical cancer, helps highlight some of the crucial aspects of HPV-mediated oncogenesis. These findings imply the central involvement of hADA3 in regulation of various cellular functional losses accountable for the genesis of malignancy and viral infections. Recent reports also provide evidence for post-translational modifications of hADA3 leading to its instability and contributing to the malignant phenotype of cervical cancer cells. Furthermore, its association with poor prognosis of breast cancer suggests intimate association in the pathogenesis of the disease. Here, we present the first review on hADA3 with a comprehensive outlook on the molecular and functional roles of hADA3 to provoke further interest for more elegant and intensive studies exploring assorted aspects of this protein. PMID:27605378

  19. Lessons learned in the transition to ADA from FORTRAN at NASA/Goddard. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brophy, Carolyn Elizabeth

    1989-01-01

    A case study was done at Goddard Space Flight Center, in which two dynamics satellite simulators are developed from the same requirements, one in Ada and the other in FORTRAN. The purpose of the research was to find out how well the prescriptive Ada development model worked to develop the Ada simulator. The FORTRAN simulator development, as well as past FORTRAN developments, provided a baseline for comparison. Since this was the first simulator developed here, the prescriptive Ada development model had many similarities to the usual FORTRAN development model. However, it was modified to include longer design and shorter testing phases, which is generally expected with Ada development. One surprising result was that the percentage of time the Ada project spent in the various development activities was very similar to the percentage of time spent in these activities when doing a FORTRAN project. Another surprising finding was the difficulty the Ada team had with unit testing as well as with integration. In retrospect it is realized that adding additional steps to the design phase, such as an abstract data type analysis, and certain guidelines to the implementation phase, such as to use primarily library units and nest sparingly, would have made development much easier.
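
    The closing guideline, to use primarily library units and nest sparingly, contrasts roughly as follows (a schematic Ada sketch with hypothetical names, not code from the simulators):

      --  Preferred: a library-level package, separately compiled and reusable.
      package Attitude_Filters is
         function Smooth (Sample : Float) return Float;
      end Attitude_Filters;

      package body Attitude_Filters is
         function Smooth (Sample : Float) return Float is
         begin
            return Sample;  --  placeholder implementation
         end Smooth;
      end Attitude_Filters;

      --  Discouraged: the same abstraction nested inside one large subprogram,
      --  invisible to other compilation units and awkward to unit-test.
      procedure Simulator is
         package Filters is
            function Smooth (Sample : Float) return Float;
         end Filters;
         package body Filters is
            function Smooth (Sample : Float) return Float is
            begin
               return Sample;  --  placeholder implementation
            end Smooth;
         end Filters;
      begin
         null;
      end Simulator;

    Library units can be compiled and unit-tested in isolation, which bears directly on the unit-testing and integration difficulties reported above.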

  20. The implementation and use of Ada on distributed systems with high reliability requirements

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    The general inadequacy of Ada for programming systems that must survive processor loss was shown. A solution to the problem was proposed in which there are no syntactic changes to Ada. The approach was evaluated using a full-scale, realistic application. The application used was the Advanced Transport Operating System (ATOPS), an experimental computer control system developed for a modified Boeing 737 aircraft. The ATOPS system is a full-authority, real-time avionics system providing a large variety of advanced features. Methods of building fault tolerance into concurrent systems were explored. A set of criteria by which the proposed method will be judged was examined. Extensive interaction with personnel from Computer Sciences Corporation and NASA Langley occurred to determine the requirements of the ATOPS software. Backward error recovery in concurrent systems was assessed.