Sample records for software requirement specifications

  1. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  2. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  3. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance]

    NASA Technical Reports Server (NTRS)

    Hamilton, M.

    1973-01-01

    The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.

  4. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  5. Software Requirements Specification for an Ammunition Management System

    DTIC Science & Technology

    1986-09-01

    thesis takes the form of a software requirements specification. Such a specification, according to Pressman [Ref. 7], establishes a complete... defined by Pressman, is depicted in Figure 1.1 ("Generalized Software Life Cycle"). The common thread which binds the various phases together... application of software engineering principles requires an established methodology. This methodology, according to Pressman [Ref. 8: p. 15], is an

  6. Software requirements specification for the GIS-T/ISTEA pooled fund study phase C linear referencing engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amai, W.; Espinoza, J. Jr.; Fletcher, D.R.

    1997-06-01

    This Software Requirements Specification (SRS) describes the features to be provided by the software for the GIS-T/ISTEA Pooled Fund Study Phase C Linear Referencing Engine project. This document conforms to the recommendations of IEEE Standard 830-1984, IEEE Guide to Software Requirements Specification (Institute of Electrical and Electronics Engineers, Inc., 1984). The software specified in this SRS is a proof-of-concept implementation of the Linear Referencing Engine as described in the GIS-T/ISTEA Pooled Fund Study Phase B Summary, specifically Sheet 13 of the Phase B object model. The software allows an operator to convert between two linear referencing methods and a datum network.

  7. How Well Can Existing Software Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy

    DTIC Science & Technology

    2017-04-06

    Research Hypothesis... Research Design... user community and of accommodating advancing software applications by the vendors. Research Design: My approach to this project was to conduct... design descriptions, requirements specifications, test documentation, interface requirement specifications, product specifications, and software

  8. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows broad consensus but also differences regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
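
    As a rough illustration of how a checklist such as CIPROS can be used to appraise a registry system against itemized requirements, the sketch below (Python) scores a hypothetical appraisal by checklist section. The section names, item texts, and scoring are invented placeholders, not the actual CIPROS instrument.

        # Hypothetical checklist-based appraisal; CIPROS itself has twelve
        # sections and 72 items, which are not reproduced here.
        from dataclasses import dataclass

        @dataclass
        class ChecklistItem:
            section: str     # e.g. "Data management" (invented section name)
            text: str        # the requirement the system is checked against
            fulfilled: bool  # appraisal result for the system under review

        def coverage_by_section(items):
            """Fraction of fulfilled items per checklist section."""
            totals, hits = {}, {}
            for it in items:
                totals[it.section] = totals.get(it.section, 0) + 1
                hits[it.section] = hits.get(it.section, 0) + int(it.fulfilled)
            return {s: hits[s] / totals[s] for s in totals}

        appraisal = [
            ChecklistItem("Data management", "Audit trail for record changes", True),
            ChecklistItem("Data management", "Export in a standard interchange format", False),
            ChecklistItem("Interfaces", "Documented import interface", True),
        ]
        print(coverage_by_section(appraisal))  # {'Data management': 0.5, 'Interfaces': 1.0}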

  9. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  10. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multidimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.
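
    To make the conversion step concrete, the minimal sketch below (Python) maps per-section quality scores onto a fixed-order vector, one dimension per top-level section of the IEEE Std 830-1998 recommended SRS structure. The section list and the 0-1 scoring scheme are assumptions for illustration, not the scheme used in the study.

        # One dimension per top-level IEEE 830-1998 SRS section (illustrative).
        IEEE_830_SECTIONS = [
            "Introduction",
            "Overall description",
            "Specific requirements",
            "Appendixes",
            "Index",
        ]

        def srs_quality_vector(scores_by_section):
            """Map per-section quality scores (0.0-1.0) onto a fixed-order vector."""
            return [scores_by_section.get(s, 0.0) for s in IEEE_830_SECTIONS]

        project_a = srs_quality_vector({"Introduction": 0.9,
                                        "Overall description": 0.7,
                                        "Specific requirements": 0.5})
        print(project_a)                        # [0.9, 0.7, 0.5, 0.0, 0.0]
        print(sum(project_a) / len(project_a))  # crude single-number summary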

  11. From Goal-Oriented Requirements to Event-B Specifications

    NASA Technical Reports Server (NTRS)

    Aziz, Benjamin; Arenas, Alvaro E.; Bicarregui, Juan; Ponsard, Christophe; Massonet, Philippe

    2009-01-01

    In goal-oriented requirements engineering methodologies, goals are structured into refinement trees from high-level system-wide goals down to fine-grained requirements assigned to specific software/hardware/human agents that can realise them. Functional goals assigned to software agents need to be operationalised into specifications of services that the agent should provide to realise those requirements. In this paper, we propose an approach for operationalising requirements into specifications expressed in the Event-B formalism. Our approach has the benefit of aiding software designers by bridging the gap between declarative requirements and operational system specifications in a rigorous manner, enabling powerful correctness proofs and allowing further refinements down to the implementation level. Our solution is based on verifying that a consistent Event-B machine exhibits properties corresponding to requirements.
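
    The sketch below gives a loose, non-Event-B illustration (in Python) of the underlying idea: a requirement is operationalised as an invariant that every event of a machine must preserve. The variable, guard, and events are invented examples; in an actual Event-B development this check is discharged as proof obligations rather than by testing.

        # Invented requirement: never allocate more resources than the capacity.
        def inv(s):
            return 0 <= s["allocated"] <= s["capacity"]

        def evt_allocate(s):
            if s["allocated"] < s["capacity"]:   # guard
                s["allocated"] += 1              # action

        def evt_release(s):
            if s["allocated"] > 0:
                s["allocated"] -= 1

        # Naive sampling check that each event preserves the invariant.
        for evt in (evt_allocate, evt_release):
            for start in range(0, 11):
                s = {"allocated": start, "capacity": 10}
                evt(s)
                assert inv(s), f"{evt.__name__} breaks the invariant"
        print("invariant preserved by all sampled events")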

  12. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.
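
    The general idea of monitoring an executing program against prespecified requirements constraints can be sketched as below (Python). This is only an illustration of the concept, not the SAGE language or the SAVAnT tool, and the constraints and trace are invented.

        # Requirements constraints as predicates over the observable program state.
        constraints = {
            "temperature stays below limit": lambda s: s["temp"] < 100.0,
            "valve closed whenever pump is off": lambda s: s["pump_on"] or not s["valve_open"],
        }

        def monitor(state):
            """Return the names of constraints violated by the current state."""
            return [name for name, check in constraints.items() if not check(state)]

        # Hypothetical execution trace of the program under observation.
        trace = [
            {"temp": 72.0, "pump_on": True, "valve_open": True},
            {"temp": 101.5, "pump_on": False, "valve_open": True},
        ]
        for step, state in enumerate(trace):
            for violation in monitor(state):
                print(f"step {step}: violated: {violation}")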

  13. Writing testable software requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knirk, D.

    1997-11-01

    This tutorial identifies common problems in analyzing requirements in the problem and constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.

  14. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  15. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  16. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  17. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  18. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... specifications prepared by the borrower's engineer. The specifications prepared by the borrower's engineer and... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  19. 21 CFR 882.1440 - Neuropsychiatric interpretive electroencephalograph assessment aid.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... described in detail in the software requirements specification and software design specification... the device, hardware and software, must be fully characterized and must demonstrate a reasonable assurance of safety and effectiveness. (i) Hardware specifications must be provided. Appropriate...

  20. Flight software requirements and design support system

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.; Edwards, B.

    1980-01-01

    The desirability and feasibility of computer-augmented support for the pre-implementation activities occurring during the development of flight control software was investigated. The specific topics to be investigated were the capabilities to be included in a pre-implementation support system for flight control software system development, and the specification of a preliminary design for such a system. Further, the pre-implementation support system was to be characterized and specified under the constraints that it: (1) support both description and assessment of flight control software requirements definitions and design specifications; (2) account for known software description and assessment techniques; (3) be compatible with existing and planned NASA flight control software development support systems; and (4) not impose, but may encourage, specific development technologies. An overview of the results is given.

  1. An Interoperability Framework and Capability Profiling for Manufacturing Software

    NASA Astrophysics Data System (ADS)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
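
    The capability-profiling idea can be pictured with the small sketch below (Python): a software unit is described by its name, manufacturing functions, and other class properties, and a requirement profile is satisfied when every required function is offered. The field names and matching rule are illustrative only, not the normative ISO 16100 profile template.

        from dataclasses import dataclass, field

        @dataclass
        class CapabilityProfile:
            unit_name: str
            manufacturing_functions: set = field(default_factory=set)
            properties: dict = field(default_factory=dict)  # other class properties

        def satisfies(unit, required):
            """A unit meets a requirement profile if it offers every required function."""
            return required.manufacturing_functions <= unit.manufacturing_functions

        scheduler = CapabilityProfile("CellScheduler", {"dispatching", "sequencing"},
                                      {"vendor": "example"})
        need = CapabilityProfile("requirement", {"sequencing"})
        print(satisfies(scheduler, need))  # True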

  2. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  3. Software requirements: Guidance and control software development specification

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward; Rich, Don C.; Lowman, Douglas S.; Buckland, R. C.

    1990-01-01

    The software requirements for an implementation of Guidance and Control Software (GCS) are specified. The purpose of the GCS is to provide guidance and engine control to a planetary landing vehicle during its terminal descent onto a planetary surface and to communicate sensory information about that vehicle and its descent to some receiving device. The specification was developed using the structured analysis for real time system specification methodology by Hatley and Pirbhai and was based on a simulation program used to study the probability of success of the 1976 Viking Lander missions to Mars. Three versions of GCS are being generated for use in software error studies.

  4. Investigation of specification measures for the Software Engineering Laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Requirements specification measures are investigated for potential application in the Software Engineering Laboratory. Eighty-seven candidate measures are defined; sixteen are recommended for use. Most measures are derived from a new representation, the Composite Specification Model, which is introduced. The results of extracting the specification measures from the requirements of a real system are described.

  5. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: (1) more extensive validation of safety and dependability techniques for software, and (2) provision of valuable results to improve the quality of software, thus promoting the application of dependability and safety methods and techniques. ESA space systems are being developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions, and systems is determined, and by doing so the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires the development of a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. These non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is used more and more in critical functions. Also, the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  6. The method to divide a sentence of requirement into individual requirements and the development of requirement specification editor which can describe individual requirements.

    PubMed

    Sato, Kuniya; Ooba, Masahiro; Takagi, Tomohiko; Furukawa, Zengo; Komiya, Seiichi; Yaegashi, Rihito

    2013-12-01

    Agile software development obtains requirements from direct discussion with customers and the development staff each time, and the customers evaluate the appropriateness of the requirements. If the customers divide a complicated requirement into individual requirements, the engineer who is in charge of software development can understand it easily. This is called division of requirements. However, the customers do not understand how much, and how, to divide the requirements. This paper proposes a method to divide a complicated requirement into individual requirements. It also describes the development of a requirement specification editor which can describe individual requirements. With this, the engineer who is in charge of software development can understand requirements easily.

  7. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  8. Specifications for Thesaurus Software.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1991-01-01

    Presents specifications for software that is designed to support manual development and maintenance of information retrieval thesauri. Evaluation of existing software and design of custom software is discussed, requirements for integration with larger systems and for the user interface are described, and relationships among terms are discussed.…

  9. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  10. Initiating Formal Requirements Specifications with Object-Oriented Models

    NASA Technical Reports Server (NTRS)

    Ampo, Yoko; Lutz, Robyn R.

    1994-01-01

    This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.

  11. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  12. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  13. Effects of Using Requirements Catalogs on Effectiveness and Productivity of Requirements Specification in a Software Project Management Course

    ERIC Educational Resources Information Center

    Fernández-Alemán, José Luis; Carrillo-de-Gea, Juan Manuel; Meca, Joaquín Vidal; Ros, Joaquín Nicolás; Toval, Ambrosio; Idri, Ali

    2016-01-01

    This paper presents the results of two educational experiments carried out to determine whether the process of specifying requirements (catalog-based reuse as opposed to conventional specification) has an impact on effectiveness and productivity in co-located and distributed software development environments. The participants in the experiments…

  14. Flight software development for the isothermal dendritic growth experiment

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Winsa, Edward A.; Glicksman, Martin E.

    1989-01-01

    The Isothermal Dendritic Growth Experiment (IDGE) is a microgravity materials science experiment scheduled to fly in the cargo bay of the shuttle on the United States Microgravity Payload (USMP) carrier. The experiment will be operated by real-time control software which will not only monitor and control onboard experiment hardware, but will also communicate, via downlink data and uplink commands, with the Payload Operations Control Center (POCC) at NASA George C. Marshall Space Flight Center (MSFC). The software development approach being used to implement this system began with software functional requirements specification. This was accomplished using the Yourdon/DeMarco methodology as supplemented by the Ward/Mellor real-time extensions. The requirements specification in combination with software prototyping was then used to generate a detailed design consisting of structure charts, module prologues, and Program Design Language (PDL) specifications. This detailed design will next be used to code the software, followed finally by testing against the functional requirements. The result will be a modular real-time control software system with traceability through every phase of the development process.

  15. Flight software development for the isothermal dendritic growth experiment

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Winsa, Edward A.; Glicksman, M. E.

    1990-01-01

    The Isothermal Dendritic Growth Experiment (IDGE) is a microgravity materials science experiment scheduled to fly in the cargo bay of the shuttle on the United States Microgravity Payload (USMP) carrier. The experiment will be operated by real-time control software which will not only monitor and control onboard experiment hardware, but will also communicate, via downlink data and uplink commands, with the Payload Operations Control Center (POCC) at NASA George C. Marshall Space Flight Center (MSFC). The software development approach being used to implement this system began with software functional requirements specification. This was accomplished using the Yourdon/DeMarco methodology as supplemented by the Ward/Mellor real-time extensions. The requirements specification in combination with software prototyping was then used to generate a detailed design consisting of structure charts, module prologues, and Program Design Language (PDL) specifications. This detailed design will next be used to code the software, followed finally by testing against the functional requirements. The result will be a modular real-time control software system with traceability through every phase of the development process.

  16. Architectural Implementation of NASA Space Telecommunications Radio System Specification

    NASA Technical Reports Server (NTRS)

    Peters, Kenneth J.; Lux, James P.; Lang, Minh; Duncan, Courtney B.

    2012-01-01

    This software demonstrates a working implementation of the NASA STRS (Space Telecommunications Radio System) architecture specification. This is a developing specification of software architecture and required interfaces to provide commonality among future NASA and commercial software-defined radios for space, and allow for easier mixing of software and hardware from different vendors. It provides required functions, and supports interaction with STRS-compliant simple test plug-ins ("waveforms"). All of it is programmed in "plain C," except where necessary to interact with C++ plug-ins. It offers a small footprint, suitable for use in JPL radio hardware. Future NASA work is expected to develop into fully capable software-defined radios for use on the space station, other space vehicles, and interplanetary probes.

  17. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  18. ICESat (GLAS) Science Processing Software Document Series. Volume 3; GLAS Science Software Requirements Document; Ver 2.1

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.

  19. Software Requirements Specification for Lunar IceCube

    NASA Astrophysics Data System (ADS)

    Glaser-Garbrick, Michael R.

    Lunar IceCube is a 6U satellite that will orbit the Moon to measure water volatiles as a function of position, altitude, and time, and to measure them in their various phases. Lunar IceCube is a collaboration between Morehead State University, Vermont Technical University, Busek, and NASA. The Software Requirements Specification will serve as a contract between the overall team and the developers of the flight software. It will provide a systems overview of the software that will be developed for Lunar IceCube, detailing the interconnects and protocols for each subsystem that Lunar IceCube will utilize. The flight software will be written in SPARK to the fullest extent, due to SPARK's support for proving the absence of certain classes of errors. The LIC flight software makes use of a general-purpose, reusable application framework called CubedOS. This framework imposes some structuring requirements on the architecture and design of the flight software, but it does not impose any high-level requirements. The document will also detail the tools to be used for Lunar IceCube, such as why VxWorks will be utilized.

  20. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
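
    As a toy picture of the kind of artifact such a paradigm produces, the sketch below (Python) derives a buffer object from a small declarative requirements dictionary. The spec fields and the hand-written factory are invented stand-ins; the actual prototype used knowledge-based transformational synthesis rather than a factory like this.

        from collections import deque

        class RejectingBuffer:
            """Bounded buffer that rejects new items when full (one possible policy)."""
            def __init__(self, capacity):
                self._items = deque()
                self._capacity = capacity
            def append(self, item):
                if len(self._items) >= self._capacity:
                    raise OverflowError("buffer full")
                self._items.append(item)

        def buffer_from_requirements(spec):
            """Derive a buffer implementation from a declarative requirements dict."""
            if spec["overflow_policy"] == "drop_oldest":
                return deque(maxlen=spec["capacity"])  # oldest items discarded
            return RejectingBuffer(spec["capacity"])

        buf = buffer_from_requirements({"capacity": 3, "overflow_policy": "drop_oldest"})
        for x in range(5):
            buf.append(x)
        print(list(buf))  # [2, 3, 4]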

  1. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  2. Preparation guide for class B software specification documents

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1979-01-01

    General conceptual requirements and specific application rules and procedures are provided for the production of software specification documents in conformance with deep space network software standards and class B standards. Class B documentation is identified as the appropriate level applicable to implementation, sustaining engineering, and operational uses by qualified personnel. Special characteristics of class B documents are defined.

  3. Compositional Specification of Software Architecture

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independent of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications and give an overview of insights gained from a case study used to validate the method.

  4. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  5. Guidelines for applying the Composite Specification Model (CSM)

    NASA Technical Reports Server (NTRS)

    Agresti, William

    1987-01-01

    The Composite Specification Model (CSM) is an approach to representing software requirements. Guidelines are provided for applying CSM and developing each of the three descriptive views of the software: the contextual view, using entities and relationships; the dynamic view, using states and transitions; and the function view, using data flows and processes. Using CSM results in a software specification document, which is outlined.

  6. Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering

    ERIC Educational Resources Information Center

    Rosca, Daniela

    2005-01-01

    The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…

  7. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  8. Capturing Requirements for Autonomous Spacecraft with Autonomy Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Vassev, Emil; Hinchey, Mike

    2014-08-01

    The Autonomy Requirements Engineering (ARE) approach has been developed by Lero - the Irish Software Engineering Research Center within the mandate of a joint project with ESA, the European Space Agency. The approach is intended to help engineers develop missions for unmanned exploration, often with limited or no human control. Such robotics space missions rely on the most recent advances in automation and robotic technologies where autonomy and autonomic computing principles drive the design and implementation of unmanned spacecraft [1]. To tackle the integration and promotion of autonomy in software-intensive systems, ARE combines generic autonomy requirements (GAR) with goal-oriented requirements engineering (GORE). Using this approach, software engineers can determine what autonomic features to develop for a particular system (e.g., a space mission) as well as what artifacts that process might generate (e.g., goals models, requirements specification, etc.). The inputs required by this approach are the mission goals and the domain-specific GAR reflecting specifics of the mission class (e.g., interplanetary missions).

  9. Orbit targeting specialist function: Level C formulation requirements

    NASA Technical Reports Server (NTRS)

    Dupont, A.; Mcadoo, S.; Jones, H.; Jones, A. K.; Pearson, D.

    1978-01-01

    A definition of the level C requirements for onboard maneuver targeting software is provided. Included are revisions of the level C software requirements delineated in JSC IN 78-FM-27, Proximity Operations Software; Level C Requirements, dated May 1978. The software supports the terminal phase midcourse (TPM) maneuver, braking and close-in operations as well as supporting computation of the rendezvous corrective combination maneuver (NCC), and the terminal phase initiation (TPI). Specific formulation is contained here for the orbit targeting specialist function including the processing logic, linkage, and data base definitions for all modules. The crew interface with the software is through the keyboard and the ORBIT-TGT display.

  10. SIENA Customer Problem Statement and Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    L. Sauer; R. Clay; C. Adams

    2000-08-01

    This document describes the problem domain and functional requirements of the SIENA framework. The software requirements and system architecture of SIENA are specified in separate documents (called SIENA Software Requirement Specification and SIENA Software Architecture, respectively). While currently this version of the document describes the problems and captures the requirements within the Analysis domain (concentrating on finite element models), it is our intention to subsequently expand this document to describe problems and capture requirements from the Design and Manufacturing domains. In addition, SIENA is designed to be extendible to support and integrate elements from the other domains (see SIENA Software Architecture document).

  11. Software as a service approach to sensor simulation software deployment

    NASA Astrophysics Data System (ADS)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision and Electronic Sensors Directorate (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, an enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and allow the domain community to benefit from immediate deployment of lessons learned.

  12. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  13. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  14. [Definition and specification requirements for PAC-systems (picture archiving and communication system). A performance index with reference to the standard "IEEE Recommended Practice for Software Requirement Specifications"].

    PubMed

    König, H; Klose, K J

    1999-04-01

    The formulation of requirements is necessary to control the goals of a PACS project. Furthermore, in this way, the scope of functionality necessary to support radiological working processes becomes clear. Definitions of requirements and specification are formulated independently of systems according to the IEEE standard "Recommended Practice for Software Requirements Specifications". Definitions are given in the Request for Information, specifications in the Request for Proposal. Functional and non-functional requirements are distinguished. The solutions are rated with respect to scope, appropriateness and quality of implementation. A PACS checklist was created according to the methods described above. It is published on the homepage of the "Arbeitsgemeinschaft Informationstechnologie" (AGIT) within the "Deutsche Röntgengesellschaft" (DRG) (http://www.uni-marburg.de/mzr/agit). The checklist provides a discussion forum which should contribute to an agreement on accepted basic PACS functionalities.

  15. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  16. LogiKit - assisting complex logic specification and implementation for embedded control systems

    NASA Astrophysics Data System (ADS)

    Diglio, A.; Nicolodi, B.

    2002-07-01

    LogiKit provides an overall lifecycle solution. LogiKit is a powerful software engineering CASE toolkit for requirements specification, simulation and documentation. LogiKit also provides an automatic Ada software design, code and unit test generator.

  17. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  18. The Assignment of Scale to Object-Oriented Software Measures

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.

    1997-01-01

    In order to improve productivity (and quality), measurement of specific aspects of software has become imperative. As object-oriented programming languages have become more widely used, metrics designed specifically for object-oriented software are required. Recently a large number of new metrics for object-oriented software has appeared in the literature. Unfortunately, many of these proposed metrics have not been validated to measure what they purport to measure. In this paper fifty (50) of these metrics are analyzed.

  19. Incubator Display Software Cost Reduction Toolset Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Moran, Susanne; Jeffords, Ralph

    2005-01-01

    The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

  20. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application of a software reliability model having a well defined correspondence of computer program properties to requirements error analysis is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.

  1. SCA Waveform Development for Space Telemetry

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Kifle, Multi; Hall, C. Steve; Quinn, Todd M.

    2004-01-01

    The NASA Glenn Research Center is investigating and developing suitable reconfigurable radio architectures for future NASA missions. This effort is examining software-based open-architectures for space based transceivers, as well as common hardware platform architectures. The Joint Tactical Radio System's (JTRS) Software Communications Architecture (SCA) is a candidate for the software approach, but may need modifications or adaptations for use in space. An in-house SCA compliant waveform development focuses on increasing understanding of software defined radio architectures and more specifically the JTRS SCA. Space requirements put a premium on size, mass, and power. This waveform development effort is key to evaluating tradeoffs with the SCA for space applications. Existing NASA telemetry links, as well as Space Exploration Initiative scenarios, are the basis for defining the waveform requirements. Modeling and simulations are being developed to determine signal processing requirements associated with a waveform and a mission-specific computational burden. Implementation of the waveform on a laboratory software defined radio platform is proceeding in an iterative fashion. Parallel top-down and bottom-up design approaches are employed.

  2. 42 CFR 495.360 - Software and ownership rights.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Software and ownership rights. 495.360 Section 495... PROGRAM Requirements Specific to the Medicaid Program § 495.360 Software and ownership rights. (a) General... that the State or local government will have all ownership rights in software or modifications thereof...

  3. 42 CFR 495.360 - Software and ownership rights.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Software and ownership rights. 495.360 Section 495... PROGRAM Requirements Specific to the Medicaid Program § 495.360 Software and ownership rights. (a) General... that the State or local government will have all ownership rights in software or modifications thereof...

  4. 42 CFR 495.360 - Software and ownership rights.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Software and ownership rights. 495.360 Section 495... PROGRAM Requirements Specific to the Medicaid Program § 495.360 Software and ownership rights. (a) General... that the State or local government will have all ownership rights in software or modifications thereof...

  5. 42 CFR 495.360 - Software and ownership rights.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Software and ownership rights. 495.360 Section 495... PROGRAM Requirements Specific to the Medicaid Program § 495.360 Software and ownership rights. (a) General... that the State or local government will have all ownership rights in software or modifications thereof...

  6. IMCS reflight certification requirements and design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirements for reflight certification are established. Software requirements encompass the software programs that are resident in the PCC, DEP, PDSS, EC, or any related GSE. A design approach for the reflight software packages is recommended. These designs will be of sufficient detail to permit the implementation of reflight software. The PDSS/IMC Reflight Certification system provides the tools and mechanisms for the user to perform the reflight certification test procedures, test data capture, test data display, and test data analysis. The system as defined will be structured to permit maximum automation of reflight certification procedures and test data analysis.

  7. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    NASA Technical Reports Server (NTRS)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included in, or assumed by, such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled and that it is consequently difficult and costly to maintain, update, and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment, and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
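
    For readers unfamiliar with the term, the following is a minimal sketch of a cyclic executive of the kind described above: a fixed table of minor frames whose entries are called periodically and in order. It is written in Python purely for illustration (real cyclic executives of this era were typically Ada or assembly), and the frame times and task names are invented.

```python
import time

# Minimal illustrative cyclic executive: a fixed major frame divided into minor
# frames, each of which calls a fixed sequence of subroutines. Scheduling is
# implicit in the table below; the tasks themselves are placeholders.

def read_sensors():    pass   # placeholder periodic task
def guidance():        pass
def control_output():  pass
def telemetry():       pass

MINOR_FRAME_S = 0.025                       # 25 ms minor frame (assumed rate)
SCHEDULE = [                                # one major frame = 4 minor frames
    [read_sensors, guidance, control_output],
    [read_sensors, control_output],
    [read_sensors, guidance, control_output, telemetry],
    [read_sensors, control_output],
]

def run_major_frames(n_frames: int) -> None:
    for _ in range(n_frames):
        for minor in SCHEDULE:
            start = time.monotonic()
            for task in minor:              # tasks run to completion, in order
                task()
            # busy-wait out the remainder of the minor frame (overruns ignored here)
            while time.monotonic() - start < MINOR_FRAME_S:
                pass

if __name__ == "__main__":
    run_major_frames(2)
```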

  8. 7 CFR 1753.7 - Plans and specifications (P&S).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF... its contractor complies with the insurance and bond requirements. (4) Telecommunications software license provision. If the borrower is required to enter into a software license agreement in order to use...

  9. Luigi Gentile Polese | NREL

    Science.gov Websites

    Staff profile listing software development of next-generation whole-building energy modeling, analysis, and simulation tools, as well as technical positions in networking protocol specifications, call control software, and requirements.

  10. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
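
    As a loose illustration of keeping a specification consistent with a domain model, the sketch below instantiates a toy domain model with industry-specific attribute ranges and flags specification entries that fall outside it. The attribute names and ranges are hypothetical and are not drawn from the paper.

```python
# Illustrative-only sketch: a generic domain model "instantiated" with
# industry-specific knowledge, used to check a derived specification.

from dataclasses import dataclass

@dataclass(frozen=True)
class Attribute:
    name: str
    unit: str
    lo: float
    hi: float          # permissible range captured in the domain model

DOMAIN_MODEL = {       # hypothetical industry-specific instantiation
    "tank_level": Attribute("tank_level", "percent", 0.0, 100.0),
    "pump_rate":  Attribute("pump_rate", "liters/min", 0.0, 250.0),
}

def check_specification(spec):
    """Report specification entries inconsistent with the domain model."""
    problems = []
    for name, (lo, hi) in spec.items():
        attr = DOMAIN_MODEL.get(name)
        if attr is None:
            problems.append(f"{name}: not defined in domain model")
        elif lo < attr.lo or hi > attr.hi:
            problems.append(f"{name}: range [{lo}, {hi}] exceeds domain [{attr.lo}, {attr.hi}]")
    return problems

print(check_specification({"tank_level": (0, 100), "pump_rate": (0, 400)}))
```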

  11. Learning from Software Localization.

    ERIC Educational Resources Information Center

    Guo, She-Sen

    2003-01-01

    Localization is the process of adapting a product to meet the language, cultural and other requirements of a specific target environment or market. This article describes ways in which software localization impacts upon curriculum, and discusses what students will learn from software localization. (AEF)

  12. Automatic Debugging Support for UML Designs

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    Design of large software systems requires rigorous application of software engineering methods covering all phases of the software process. Debugging during the early design phases is extremely important, because late bug-fixes are expensive. In this paper, we describe an approach which facilitates debugging of UML requirements and designs. The Unified Modeling Language (UML) is a set of notations for object-oriented design of a software system. We have developed an algorithm which translates requirement specifications in the form of annotated sequence diagrams into structured statecharts. This algorithm detects conflicts between sequence diagrams and inconsistencies in the domain knowledge. After synthesizing statecharts from sequence diagrams, these statecharts usually are subject to manual modification and refinement. By using the "backward" direction of our synthesis algorithm, we are able to map modifications made to the statechart back into the requirements (sequence diagrams) and check for conflicts there. Conflicts detected by our algorithm are fed back to the user and form the basis for deductive debugging of requirements and domain theory at very early development stages. Our approach makes it possible to generate explanations of why there is a conflict and which parts of the specifications are affected.

  13. Comparative Evaluations of Four Specification Methods for Real-Time Systems

    DTIC Science & Technology

    1989-12-01

    December 1989. David P. Wood; William G. Wood. A number of methods have been proposed in the last decade for the specification of system and software requirements... and software specification for real-time systems. Our process for the identification of methods that meet the above criteria is described in greater...

  14. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739), and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This execution line number reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
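
    The tracker described above depends on the Timeliner-TLX engine reporting the currently executing line. The sketch below shows the same general idea in Python, using the interpreter's trace hook to log requirement IDs attached to source lines as they execute. The requirement IDs and the traced procedure are hypothetical, and this is not the Timeliner-TLX mechanism itself.

```python
import sys

# Hypothetical sketch of line-number-based requirements tracing: map source
# line numbers of the target function to SRS requirement IDs and record each
# requirement as its line executes.

satisfied = set()

def target_procedure(level: float) -> str:
    if level > 90.0:
        return "close_valve"      # line mapped to SRS-014 (overfill protection)
    return "keep_filling"         # line mapped to SRS-007 (nominal transfer)

# Requirement trace table: source line number -> requirement ID (hypothetical).
first = target_procedure.__code__.co_firstlineno
TRACE_TABLE = {first + 2: "SRS-014", first + 3: "SRS-007"}

def tracer(frame, event, arg):
    if event == "line":
        req = TRACE_TABLE.get(frame.f_lineno)
        if req:
            satisfied.add(req)
    return tracer

sys.settrace(tracer)
target_procedure(95.0)
target_procedure(10.0)
sys.settrace(None)
print("requirements exercised:", sorted(satisfied))
```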

  15. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George; Stetson, Howard

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739), and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This execution line number reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.

  16. Portable image-manipulation software: what is the extra development cost?

    PubMed

    Ligier, Y; Ratib, O; Funk, M; Perrier, R; Girard, C; Logean, M

    1992-08-01

    A hospital-wide picture archiving and communication system (PACS) project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitutes one of the most challenging components of a PACS. It was necessary to provide this visualization software on a number of types of workstations because of the varying requirements imposed by the range of clinical uses it must serve. The user interface must be the same, independent of the underlying workstation. In addition to a standard set of image-manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to specific medical requirements. To achieve this goal, it was elected to develop a modular and portable software called OSIRIS. This software is available on two different operating systems (the UNIX standard X-11/OSF-Motif based workstations and the Macintosh family) and can be easily ported to other systems. The extra effort required to design such software in a modular and portable way was worthwhile because it resulted in a platform that can be easily expanded and adapted to a variety of specific clinical applications. Its portability allows users to benefit from the rapidly evolving workstation technology and to adapt the performance to suit their needs.

  17. TOPEX Software Document Series. Volume 5; Rev. 1; TOPEX GDR Processing

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis; Hancock, David W., III

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Geophysical Data Record (GDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  18. Learning Theories Applied to Teaching Technology: Constructivism versus Behavioral Theory for Instructing Multimedia Software Programs

    ERIC Educational Resources Information Center

    Reed, Cajah S.

    2012-01-01

    This study sought to find evidence for a beneficial learning theory to teach computer software programs. Additionally, software was analyzed for each learning theory's applicability to resolve whether certain software requires a specific method of education. The results are meant to give educators more effective teaching tools, so students…

  19. Advanced engineering software for in-space assembly and manned planetary spacecraft

    NASA Technical Reports Server (NTRS)

    Delaquil, Donald; Mah, Robert

    1990-01-01

    Meeting the objectives of the Lunar/Mars initiative to establish safe and cost-effective extraterrestrial bases requires an integrated software/hardware approach to operational definitions and systems implementation. This paper begins this process by taking a 'software-first' approach to systems design, for implementing specific mission scenarios in the domains of in-space assembly and operations of the manned Mars spacecraft. The technological barriers facing implementation of robust operational systems within these two domains are discussed, and preliminary software requirements and architectures that resolve these barriers are provided.

  20. Process Acceptance and Adoption by IT Software Project Practitioners

    ERIC Educational Resources Information Center

    Guardado, Deana R.

    2012-01-01

    This study addresses the question of what factors determine acceptance and adoption of processes in the context of Information Technology (IT) software development projects. This specific context was selected because processes required for managing software development projects are less prescriptive than in other, more straightforward, IT…

  1. FRAMES-2.0 Software System: Frames 2.0 Pest Integration (F2PEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castleton, Karl J.; Meyer, Philip D.

    2009-06-17

    The implementation of the FRAMES 2.0 F2PEST module is described, including requirements, design, and specifications of the software. This module integrates the PEST parameter estimation software within the FRAMES 2.0 environmental modeling framework. A test case is presented.

  2. Seven Steps to Responsible Software Selection. ERIC Digest.

    ERIC Educational Resources Information Center

    Komoski, P. Kenneth; Plotnick, Eric

    Microcomputers in schools contribute significantly to the learning process, and software selection is taken as seriously as the selection of text books. The seven step process for responsible software selection are: (1) analyzing needs, including the differentiation between needs and objectives; (2) specification of requirements; (3) identifying…

  3. Configuration management and software measurement in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.

  4. "LearningPad" Conundrum: The Perils of Using Third-Party Software and Student Privacy

    ERIC Educational Resources Information Center

    O'Brien, Jason; Roller, Sarah; Lampley, Sandra

    2017-01-01

    This case focuses on the potential problems associated with sharing personally identifiable information (PII) when students are required to use third-party software. Specifically, third-grade students were required to complete "LearningPad" activities as a component of their homework grade in math, spelling, and language arts. As…

  5. Advanced Integrated Display System V/STOL Program Performance Specification. Volume I.

    DTIC Science & Technology

    1980-06-01

    ...sensor inputs required before the sensor can be designated acceptable. The reactivation count of each sensor parameter which satisfies its veri... AIDS Configuration Parameters... AIDS Throughput Requirements... Quality Assurance... lists the adaptation parameters of the AIDS software; these parameters include the throughput and memory requirements of the software.

  6. Atmospheric, Magnetospheric and Plasmas in Space (AMPS) spacelab payload definition study; volume 4: Part 2, Labcraft payload general specification

    NASA Technical Reports Server (NTRS)

    Keeley, J. T.

    1976-01-01

    The Labcraft Payload General Specification (LPGS) amplifies those general requirements in the Labcraft Program Specification (LPS) to ensure that all hardware, software, and STS elements will successfully function as an integrated system to accomplish the objectives of the first Labcraft mission. Contract End Item Specifications (CEIS) and Procurement Drawings (PDs) prepared and implemented for all deliverable hardware and software elements are discussed.

  7. 42 CFR 495.360 - Software and ownership rights.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Software and ownership rights. 495.360 Section 495.360 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... PROGRAM Requirements Specific to the Medicaid Program § 495.360 Software and ownership rights. (a) General...

  8. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided "as is" and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  9. WFF TOPEX Software Documentation Altimeter Instrument File (AIF) Processing, October 1998. Volume 3

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  10. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
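
    A minimal sketch of the numerical-evaluation step, assuming a toy containment check: an executable rendering of the "specification" and a separate "implementation" are run on the same stressing test cases and any disagreements are reported. The functions and test points below are invented; this is not a MINERVA geo-containment algorithm.

```python
# Illustrative sketch of numerically evaluating an algorithm specification
# against its implementation on stressing test cases.

def spec_inside(x: float, y: float, half_width: float) -> bool:
    """Executable rendering of the (toy) spec: point inside a square region."""
    return abs(x) <= half_width and abs(y) <= half_width

def impl_inside(x: float, y: float, half_width: float) -> bool:
    """Implementation under test (here written differently but equivalently)."""
    return max(abs(x), abs(y)) <= half_width

TEST_CASES = [(0.0, 0.0), (5.0, -5.0), (5.0000001, 0.0), (-7.2, 3.9)]

def evaluate(half_width: float = 5.0):
    """Return the test cases on which spec and implementation disagree."""
    return [(x, y) for x, y in TEST_CASES
            if spec_inside(x, y, half_width) != impl_inside(x, y, half_width)]

print("disagreements:", evaluate())
```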

  11. A Matrix Approach to Software Process Definition

    NASA Technical Reports Server (NTRS)

    Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
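
    A minimal sketch of the matrix idea: tailoring criteria index into a table of standard processes. The criteria values and process names below are invented placeholders, not the SEL/ISC definitions.

```python
# Hypothetical sketch of selecting a standard software process from tailoring
# criteria; the table contents are illustrative only.

PROCESS_MATRIX = {
    # (project size, criticality, work type) -> standard process
    ("small", "low",  "maintenance"): "Process A (lightweight maintenance)",
    ("small", "high", "development"): "Process B (small critical development)",
    ("large", "low",  "development"): "Process C (large non-critical development)",
    ("large", "high", "development"): "Process D (large critical development)",
}

def select_process(size: str, criticality: str, work_type: str) -> str:
    try:
        return PROCESS_MATRIX[(size, criticality, work_type)]
    except KeyError:
        raise ValueError("no standard process defined for this combination") from None

print(select_process("large", "high", "development"))
```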

  12. ATM Technology Demonstration-1 Phase II Boeing Configurable Graphical Display (CGD) Software Design Description

    NASA Technical Reports Server (NTRS)

    Wilber, George F.

    2017-01-01

    This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).

  13. High performance VLSI telemetry data systems

    NASA Technical Reports Server (NTRS)

    Chesney, J.; Speciale, N.; Horner, W.; Sabia, S.

    1990-01-01

    NASA's deployment of major space complexes such as Space Station Freedom (SSF) and the Earth Observing System (EOS) will demand increased functionality and performance from ground based telemetry acquisition systems well above current system capabilities. Adaptation of space telemetry data transport and processing standards such as those specified by the Consultative Committee for Space Data Systems (CCSDS) standards and those required for commercial ground distribution of telemetry data, will drive these functional and performance requirements. In addition, budget limitations will force the requirement for higher modularity, flexibility, and interchangeability at lower cost in new ground telemetry data system elements. At NASA's Goddard Space Flight Center (GSFC), the design and development of generic ground telemetry data system elements, over the last five years, has resulted in significant solutions to these problems. This solution, referred to as the functional components approach includes both hardware and software components ready for end user application. The hardware functional components consist of modern data flow architectures utilizing Application Specific Integrated Circuits (ASIC's) developed specifically to support NASA's telemetry data systems needs and designed to meet a range of data rate requirements up to 300 Mbps. Real-time operating system software components support both embedded local software intelligence, and overall system control, status, processing, and interface requirements. These components, hardware and software, form the superstructure upon which project specific elements are added to complete a telemetry ground data system installation. This paper describes the functional components approach, some specific component examples, and a project example of the evolution from VLSI component, to basic board level functional component, to integrated telemetry data system.

  14. Building Safer Systems With SpecTRM

    NASA Technical Reports Server (NTRS)

    2003-01-01

    System safety, an integral component in software development, often poses a challenge to engineers designing computer-based systems. While the relaxed constraints on software design allow for increased power and flexibility, this flexibility introduces more possibilities for error. As a result, system engineers must identify the design constraints necessary to maintain safety and ensure that the system and software design enforces them. Safeware Engineering Corporation, of Seattle, Washington, provides the information, tools, and techniques to accomplish this task with its Specification Tools and Requirements Methodology (SpecTRM). NASA assisted in developing this engineering toolset by awarding the company several Small Business Innovation Research (SBIR) contracts with Ames Research Center and Langley Research Center. The technology benefits NASA through its applications for Space Station rendezvous and docking. SpecTRM aids system and software engineers in developing specifications for large, complex safety critical systems. The product enables engineers to find errors early in development so that they can be fixed with the lowest cost and impact on the system design. SpecTRM traces both the requirements and design rationale (including safety constraints) throughout the system design and documentation, allowing engineers to build required system properties into the design from the beginning, rather than emphasizing assessment at the end of the development process when changes are limited and costly.

  15. Requirements for the implementation of schedule repair technology in the Experiment Scheduling Program

    NASA Technical Reports Server (NTRS)

    Bullington, Stanley F.

    1992-01-01

    The following list of requirements specifies the proposed revisions to the Experiment Scheduling Program (ESP2) which deal with schedule repair. These requirements are divided into those which are general in nature, those which relate to measurement and analysis functions of the software, those which relate specifically to conflict resolution, and those relating directly to the user interface. (This list is not a complete list of requirements for the user interface, but only a list of those schedule repair requirements which relate to the interface.) Some of the requirements relate only to uses of the software in real-time operations. Others are clearly for future versions of the software, beyond the upcoming revision. In either case, the fact will be clearly stated.

  16. PDSS/IMC requirements and functional specifications

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The system (software and hardware) requirements for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) are provided. The PDSS/IMC system provides the capability for performing Image Motion Compensator Electronics (IMCE) flight software test, checkout, and verification and provides the capability for monitoring the IMC flight computer system during qualification testing for fault detection and fault isolation.

  17. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group, at the Leeds Institute of Molecular Medicine, University of Leeds.
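
    As a small illustration of what an archetype looks like in code, the sketch below defines two primordial business objects in the style of common archetype-pattern catalogs (Quantity and Party) and composes them into a domain-specific class. It is hypothetical and not the LIMS model described in the paper.

```python
# Tiny illustrative sketch of "archetype" classes: primordial business objects
# that recur across domains and are composed into domain-specific models.

from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:            # archetype: an amount with a unit
    amount: float
    unit: str

@dataclass(frozen=True)
class Party:               # archetype: a person or organization
    name: str

@dataclass
class Specimen:            # domain-specific class built from archetypes
    submitted_by: Party
    volume: Quantity

s = Specimen(Party("Proteomics Group"), Quantity(1.5, "mL"))
print(s)
```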

  18. Instrument control software requirement specification for Extremely Large Telescopes

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca

    2010-07-01

    Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.

  19. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services, together with the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, (e) the characterization of the reference avionics architecture and of the reference on-board software high-level architecture.

  20. Reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report is a Reference Manual for a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages and includes constructs for formal mathematical specification.

  1. A second generation experiment in fault-tolerant software

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    The primary goal was to determine whether the application of fault tolerance to software increases its reliability if the cost of production is the same as for an equivalent non-fault-tolerant version derived from the same requirements specification. Software development protocols are discussed. The feasibility of adapting the technique of N-fold Modular Redundancy with majority voting to software design fault tolerance was studied.
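
    A minimal sketch of N-fold modular redundancy with majority voting as applied to software: N independently developed versions are run on the same input and the majority result is accepted, masking a design fault in any single version. The versions below are trivial stand-ins for independently built modules.

```python
from collections import Counter

# Minimal sketch of N-version software fault tolerance with majority voting.

def version_a(x: float) -> float: return x * x
def version_b(x: float) -> float: return x ** 2
def version_c(x: float) -> float: return x * x + 1.0   # contains a seeded design fault

def majority_vote(x: float, versions) -> float:
    results = [v(x) for v in versions]
    value, votes = Counter(results).most_common(1)[0]
    if votes <= len(versions) // 2:
        raise RuntimeError("no majority agreement among versions")
    return value

# The faulty version is outvoted by the two correct ones.
print(majority_vote(3.0, [version_a, version_b, version_c]))
```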

  2. TOPEX SDR Processing, October 1998. Volume 4

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey E.; Lockwood, Dennis W.

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  3. Mod-5A wind turbine generator program design report. Volume 4: Drawings and specifications, book 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The design, development and analysis of the 7.3 MW MOD-5A wind turbine generator is documented. There are four volumes. This volume contains the drawings and specifications that were developed in preparation for building the MOD-5A wind turbine generator. This is the second book of volume four. Some of the items it contains are specs for the emergency shutdown panel, specs for the simulator software, simulator hardware specs, site operator terminal requirements, control data system requirements, software project management plan, elastomeric teeter bearing requirement specs, specs for the controls electronic cabinet, and specs for bolt pretensioning.

  4. Global Hawk Systems Engineering. Case Study

    DTIC Science & Technology

    2010-01-01

    Management Core System (TBMCS) (complex software development) • F-111 Fighter (joint program with significant involvement by the Office of the... Software Requirements Specification; TACC: Tailored Airworthiness Certification Criteria; TBMCS: Theater Battle Management Core System; TEMP: Test and...

  5. Formalizing Space Shuttle Software Requirements

    NASA Technical Reports Server (NTRS)

    Crow, Judith; DiVito, Ben L.

    1996-01-01

    This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.

  6. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
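
    A rough sketch of the spectrum idea, under the simplifying assumption that a document's "spectrum" is the relative weight of quality-characteristic keywords it contains. The keyword lists and sample texts are invented, and the published technique is more elaborate than word counting.

```python
import re
from collections import Counter

# Rough sketch of a quality-requirements "spectrum": the relative weight of
# each quality characteristic mentioned in a document, compared between a
# requirements specification and a design document.

QUALITY_TERMS = {
    "security":    {"security", "authentication", "encryption"},
    "performance": {"performance", "latency", "throughput"},
    "usability":   {"usability", "accessibility", "user-friendly"},
}

def spectrum(text: str) -> dict:
    words = re.findall(r"[a-z-]+", text.lower())
    counts = Counter()
    for quality, terms in QUALITY_TERMS.items():
        counts[quality] = sum(words.count(t) for t in terms)
    total = sum(counts.values()) or 1
    return {q: counts[q] / total for q in QUALITY_TERMS}

srs_text = "Login requires authentication and encryption; latency under 2 s."
design_text = "The design describes caching for throughput and low latency."

print("SRS spectrum:   ", spectrum(srs_text))
print("Design spectrum:", spectrum(design_text))
```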

  7. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  8. Concept document of the repository-based software engineering program: A constructive appraisal

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.

  9. Development of an expert system prototype for determining software functional requirements for command management activities at NASA Goddard

    NASA Technical Reports Server (NTRS)

    Liebowitz, J.

    1985-01-01

    The development of an expert system prototype for determining software functional requirements for NASA Goddard's Command Management System (CMS) is described. The role of the CMS is to transform general requests into specific spacecraft commands with command execution conditions. The CMS is part of the NASA Data System which entails the downlink of science and engineering data from NASA near-earth satellites to the user, and the uplink of command and control data to the spacecraft. Subjects covered include: the problem environment of determining CMS software functional requirements; the expert system approach for handling CMS requirements development; validation and evaluation procedures for the expert system.

  10. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  11. The Impact of a Simulation and Problem-Based Learning Design Project on Student Learning and Teamwork Skills. CSE Technical Report.

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.

    This study examined a civil engineering capstone course that embedded a sophisticated simulation-based task within instruction. Students (n=28) were required to conduct a hazardous waste site investigation using simulation software designed specifically for the course (Interactive Site Investigation Software) (ISIS). The software simulated…

  12. SOA Governance: A Critical SOA Success Factor

    DTIC Science & Technology

    2010-04-01

    Building blocks of a SOA: Service – a software-implemented capability that is well-defined, self-contained, and does not depend on the context or state of other services... Service Consumer – a service, application, or other software component that requires a specific service; it is located through a registry and initiates the service.

  13. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  14. Beyond formalism

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1991-01-01

    The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.

  15. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Stephen B.

    2010-01-01

    Software plays an increasingly larger role in all aspects of NASA's science missions. This has been extended to the identification, management and control of faults which affect safety-critical functions and, by default, the overall success of the mission. Traditionally, the analysis of fault identification, management and control has been hardware based. Due to the increasing complexity of systems, there has been a corresponding increase in the complexity of fault management software. The NASA Independent Verification & Validation (IV&V) program is creating processes and procedures to identify and incorporate safety-critical software requirements along with corresponding software faults so that potential hazards may be mitigated. This "Specific to Generic... A Case for Reuse" paper describes the phases of a dependability and safety study which identifies a new process to create a foundation for reusable assets. These assets support the identification and management of specific software faults and their transformation from specific to generic software faults. This approach also has applications to other systems outside of the NASA environment. This paper addresses how a mission-specific dependability and safety case is being transformed to a generic dependability and safety case which can be reused for any type of space mission, with an emphasis on software fault conditions.

  16. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect detection software development techniques. The benefits of formal methods in requirements driven software development ('forward engineering') is well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  17. KSC Space Station Operations Language (SSOL)

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Space Station Operations Language (SSOL) will serve a large community of diverse users dealing with the integration and checkout of Space Station modules. Kennedy Space Center's plan to achieve Level A specification of the SSOL system, encompassing both its language and its automated support environment, is presented in the format of a briefing. The SSOL concept is a collection of fundamental elements that span languages, operating systems, software development, software tools and several user classes. The approach outlines a thorough process that combines the benefits of rapid prototyping with a coordinated requirements gathering effort, yielding a Level A specification of the SSOL requirements.

  18. Automatic Requirements Specification Extraction from Natural Language (ARSENAL)

    DTIC Science & Technology

    2014-10-01

    ...communication of technical descriptions between the various stakeholders (e.g., customers, designers, implementers) involved in the design of software systems. However, natural language descriptions can be informal, incomplete, imprecise... the accuracy of the natural language processing stage, the degree of automation, and robustness to noise.

  19. An Automated Method for Identifying Inconsistencies within Diagrammatic Software Requirements Specifications

    NASA Technical Reports Server (NTRS)

    Zhang, Zhong

    1997-01-01

    The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. Often, in such evolving systems, striving for consistency is complicated by many factors, because development participants differ in location, skill, responsibility, role, opinion, language, terminology, and the degree of abstraction they employ. This naturally leads to many partial specifications or viewpoints. These multiple views on the system being developed usually overlap; at the same time, they give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency management issues within geographically distributed software development projects. Consequently, the WHERE project helps make software more robust and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on Graph Grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behaviors of graphic and tabular notations. From these operations, subgraph Query, Selection, Merge, and Replacement operations can be derived. The approach uses graph PRODUCTIONS (rewriting rules) to study the dynamic transformations of graphs. We discuss the feasibility of implementing these operations. We also present the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to the Java programming language in this thesis. A scenario based on the NASA International Space Station Specification is discussed to show the applicability of our approach. Finally, conclusions and future work on inconsistency management issues in the WHERE project are summarized.
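
    A minimal sketch of the three basic operations on viewpoints, representing each viewpoint as a set of labeled edges and adding a crude overlap-based conflict check. It is illustrative only and not the TCM/Java implementation described in the thesis.

```python
# Minimal sketch of UNION, DIFFERENCE, and INTERSECTION over viewpoint graphs,
# with each viewpoint represented as a set of (source, relation, target) edges.

viewpoint_a = {("Crew", "commands", "Airlock"), ("Airlock", "reports", "Status")}
viewpoint_b = {("Crew", "commands", "Airlock"), ("Airlock", "sends", "Status")}

union        = viewpoint_a | viewpoint_b        # merged view of both viewpoints
intersection = viewpoint_a & viewpoint_b        # overlap shared by both viewpoints
difference   = viewpoint_a - viewpoint_b        # statements unique to viewpoint A

# Crude inconsistency check: the same (source, target) pair related by
# different relations in different viewpoints.
def conflicting(v1, v2):
    pairs1 = {(s, t): r for s, r, t in v1}
    return [(s, t) for s, r, t in v2 if (s, t) in pairs1 and pairs1[(s, t)] != r]

print("overlap:  ", intersection)
print("conflicts:", conflicting(viewpoint_a, viewpoint_b))
```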

  20. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies and processes.
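
    As a concrete illustration of the test-first step mentioned above, the following sketch (a hypothetical function, not taken from the JPL work) writes a small automated test before the functional piece it exercises exists, then adds just enough implementation to make it pass.

        # Minimal test-first sketch: the test is written before the implementation.
        # convert_elapsed_seconds is a hypothetical small functional piece.

        def test_convert_elapsed_seconds():
            assert convert_elapsed_seconds(3671) == (1, 1, 11)   # 1 h, 1 min, 11 s

        # The implementation is then written just until the test passes.
        def convert_elapsed_seconds(total):
            hours, rest = divmod(total, 3600)
            minutes, seconds = divmod(rest, 60)
            return hours, minutes, seconds

        test_convert_elapsed_seconds()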

  1. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at the JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and procedures for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of the successful incorporation into the current JPL development policies.

  2. Implementation of a software for REmote COMparison of PARticlE and photon treatment plans: ReCompare.

    PubMed

    Löck, Steffen; Roth, Klaus; Skripcak, Tomas; Worbs, Mario; Helmbrecht, Stephan; Jakobi, Annika; Just, Uwe; Krause, Mechthild; Baumann, Michael; Enghardt, Wolfgang; Lühr, Armin

    2015-09-01

    To guarantee equal access to optimal radiotherapy, a concept of patient assignment to photon or particle radiotherapy using remote treatment plan exchange and comparison - ReCompare - was proposed. We demonstrate the implementation of this concept and present its clinical applicability. The ReCompare concept was implemented using a client-server based software solution. A clinical workflow for the remote treatment plan exchange and comparison was defined. The steps required by the user and performed by the software for a complete plan transfer were described, and an additional module for dose-response modeling was added. The ReCompare software was successfully tested in cooperation with three external partner clinics and met all required specifications. It was compatible with several standard treatment planning systems, ensured patient data protection, and integrated into the clinical workflow. The ReCompare software can be applied to support non-particle radiotherapy institutions with the patient-specific treatment decision on the optimal irradiation modality by remote treatment plan exchange and comparison. Copyright © 2015. Published by Elsevier GmbH.

  3. Specification and simulation of behavior of the Continuous Infusion Insulin Pump system.

    PubMed

    Babamir, Seyed Morteza; Dehkordi, Mehdi Borhani

    2014-01-01

    The Continuous Infusion Insulin Pump (CIIP) system is responsible for monitoring a diabetic's blood sugar. In this paper, we aim to specify and simulate the behavior of the CIIP software. To this end, we first (1) present a model of the CIIP system's behavior in response to the behavior of its environment (the diabetic), and (2) formally define the safety requirements of the system environment in the Z formal modeling language. Such requirements should be satisfied by the CIIP software. Finally, we program the model and the requirements.
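
    To give a flavour of the kind of environment safety requirement such a specification formalizes in Z, the Python sketch below checks one invariant before any dose is delivered. The thresholds and function names are invented for illustration and are not taken from the CIIP specification.

        # Hypothetical safety invariant: never infuse insulin while blood sugar is
        # at or below a low threshold, and never exceed a maximum single dose.
        LOW_SUGAR_MG_DL = 70      # illustrative value, not from the paper
        MAX_DOSE_UNITS = 4.0      # illustrative value, not from the paper

        def safe_dose(blood_sugar_mg_dl, requested_dose_units):
            """Return the dose the pump may deliver under the invariant."""
            if blood_sugar_mg_dl <= LOW_SUGAR_MG_DL:
                return 0.0                      # requirement: suspend infusion
            return min(requested_dose_units, MAX_DOSE_UNITS)

        assert safe_dose(65, 2.0) == 0.0
        assert safe_dose(180, 6.0) == 4.0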

  4. Research on Generating Method of Embedded Software Test Document Based on Dynamic Model

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper presents a dynamic model-based test document generation method for embedded software that automatically produces two documents: the test requirements specification and the configuration item test document. The method allows dynamic test requirements to be captured in dynamic models, so that traceability of dynamic test demands can be generated easily. By automatically generating standardized test requirements and test documentation, it addresses inconsistency and incompleteness in document-related content and improves efficiency.

  5. Knowledge-based system V and V in the Space Station Freedom program

    NASA Technical Reports Server (NTRS)

    Kelley, Keith; Hamilton, David; Culbert, Chris

    1992-01-01

    Knowledge Based Systems (KBS's) are expected to be heavily used in the Space Station Freedom Program (SSFP). Although SSFP Verification and Validation (V&V) requirements are based on the latest state-of-the-practice in software engineering technology, they may be insufficient for KBS's; it is widely stated that there are differences in both approach and execution between KBS V&V and conventional software V&V. In order to better understand this issue, we have surveyed and/or interviewed developers from sixty expert system projects in order to understand the differences and difficulties in KBS V&V. We have used these survey results to analyze the SSFP V&V requirements for conventional software in order to determine which specific requirements are inappropriate for KBS V&V and why they are inappropriate. Further work will result in a set of recommendations that can be used either as guidelines for applying conventional software V&V requirements to KBS's or as modifications to extend the existing SSFP conventional software V&V requirements to include KBS requirements. The results of this work are significant to many projects, in addition to SSFP, which will involve KBS's.

  6. Tailoring a software production environment for a large project

    NASA Technical Reports Server (NTRS)

    Levine, D. R.

    1984-01-01

    A software production environment was constructed to meet the specific goals of a particular large programming project. These goals, the specific solutions as implemented, and the experiences on a project of over 100,000 lines of source code are discussed. The base development environment for this project was an ordinary PWB Unix (tm) system. Several important aspects of the development process required support not available in the existing tool set.

  7. Knowledge Based Software Assistant Conference Proceedings (4th) Held in Syracuse, New York on 12-14 September 1989

    DTIC Science & Technology

    1990-05-01

    Sanders Associates, Inc. A demonstration of knowledge-based support for the evolutionary development of software system requirements... Conference Committee... Spin-Off Technologies... AN OVERVIEW OF RADC'S KNOWLEDGE BASED SOFTWARE ASSISTANT PROGRAM, Donald M. Elefante, Rome Air... The Knowledge-Based Software Assistant is a formally based, computer-mediated paradigm for the specification, development, evolution, and long-term...

  8. LogScope

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex

    2012-01-01

    LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL s Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
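
    A rule-based log monitor of the kind the abstract describes can be approximated in a few lines. The sketch below is not LogScope's actual specification language; it is a hypothetical Python rendering in which one rule states that every COMMAND event must eventually be followed by a SUCCESS event carrying the same command name (a rule with a data parameter).

        # Hypothetical monitor: check that each dispatched command eventually succeeds.
        # Log entries are dicts; the field names are invented for illustration.

        def check_commands_succeed(log):
            pending = set()
            for event in log:
                if event["type"] == "COMMAND":
                    pending.add(event["name"])
                elif event["type"] == "SUCCESS":
                    pending.discard(event["name"])
            return pending   # commands never confirmed -> requirement violations

        log = [
            {"type": "COMMAND", "name": "TURN_ANTENNA"},
            {"type": "COMMAND", "name": "TAKE_IMAGE"},
            {"type": "SUCCESS", "name": "TURN_ANTENNA"},
        ]
        print("unconfirmed commands:", check_commands_succeed(log))   # {'TAKE_IMAGE'}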

  9. Application of NX Siemens PLM software in educational process in preparing students of engineering branch

    NASA Astrophysics Data System (ADS)

    Sadchikova, G. M.

    2017-01-01

    This article discusses the results of introducing the NX computer-aided design system by Siemens PLM Software into the classes of a higher education institution. The necessity of applying modern information technologies in teaching engineering students and the selection of a software product are substantiated. The author describes the stages of studying the software module in relation to specific courses and considers the features of the NX software that require the creation of standard and unified product databases. The article also gives examples of research carried out by students with the various software modules.

  10. Field Guide for Designing Human Interaction with Intelligent Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll G.

    1998-01-01

    The characteristics of this Field Guide approach address the problems of designing innovative software to support user tasks. The requirements for novel software are difficult to specify a priori, because there is not sufficient understanding of how the users' tasks should be supported, and there are not obvious pre-existing design solutions. When the design team is in unfamiliar territory, care must be taken to avoid rushing into detailed design, requirements specification, or implementation of the wrong product. The challenge is to get the right design and requirements in an efficient, cost-effective manner. This document's purpose is to describe the methods we are using to design human interactions with intelligent systems which support Space Shuttle flight controllers in the Mission Control Center at NASA/Johnson Space Center. Although these software systems usually have some intelligent features, the design challenges arise primarily from the innovation needed in the software design. While these methods are tailored to our specific context, they should be extensible, and helpful to designers of human interaction with other types of automated systems. We review the unique features of this context so that you can determine how to apply these methods to your project. Throughout this Field Guide, goals of the design methods are discussed. This should help designers understand how a specific method might need to be adapted to the project at hand.

  11. Automated Software Development Workstation (ASDW)

    NASA Technical Reports Server (NTRS)

    Fridge, Ernie

    1990-01-01

    Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.

  12. Designing Computerized Provider Order Entry Software in Iran: The Nurses' and Physicians' Viewpoints.

    PubMed

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Keshtkaran, Ali; Barati, Omid

    2016-09-01

    This study aimed to identify the functional requirements of computerized provider order entry software and design this software in Iran. This study was conducted using review documentation, interview, and focus group discussions in Shiraz University of Medical Sciences, as the medical pole in Iran, in 2013-2015. The study sample consisted of physicians (n = 12) and nurses (n = 2) in the largest hospital in the southern part of Iran and information technology experts (n = 5) in Shiraz University of Medical Sciences. Functional requirements of the computerized provider order entry system were examined in three phases. Finally, the functional requirements were distributed in four levels, and accordingly, the computerized provider order entry software was designed. The software had seven main dimensions: (1) data entry, (2) drug interaction management system, (3) warning system, (4) treatment services, (5) ability to write in software, (6) reporting from all sections of the software, and (7) technical capabilities of the software. The nurses and physicians emphasized quick access to the computerized provider order entry software, order prescription section, and applicability of the software. The software had some items that had not been mentioned in other studies. Ultimately, the software was designed by a company specializing in hospital information systems in Iran. This study was the first specific investigation of computerized provider order entry software design in Iran. Based on the results, it is suggested that this software be implemented in hospitals.

  13. Formal methods demonstration project for space applications

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.

    1995-01-01

    The Space Shuttle program is cooperating in a pilot project to apply formal methods to live requirements analysis activities. As one of the larger ongoing shuttle Change Requests (CR's), the Global Positioning System (GPS) CR involves a significant upgrade to the Shuttle's navigation capability. Shuttles are to be outfitted with GPS receivers and the primary avionics software will be enhanced to accept GPS-provided positions and integrate them into navigation calculations. Prior to implementing the CR, requirements analysts at Loral Space Information Systems, the Shuttle software contractor, must scrutinize the CR to identify and resolve any requirements issues. We describe an ongoing task of the Formal Methods Demonstration Project for Space Applications whose goal is to find an effective way to use formal methods in the GPS CR requirements analysis phase. This phase is currently under way and a small team from NASA Langley, ViGYAN Inc. and Loral is now engaged in this task. Background on the GPS CR is provided and an overview of the hardware/software architecture is presented. We outline the approach being taken to formalize the requirements, only a subset of which is being attempted. The approach features the use of the PVS specification language to model 'principal functions', which are major units of Shuttle software. Conventional state machine techniques form the basis of our approach. Given this background, we present interim results based on a snapshot of work in progress. Samples of requirements specifications rendered in PVS are offered as illustration. We walk through a specification sketch for the principal function known as GPS Receiver State processing. Results to date are summarized and feedback from Loral requirements analysts is highlighted. Preliminary data is shown comparing issues detected by the formal methods team versus those detected using existing requirements analysis methods. We conclude by discussing our plan to complete the remaining activities of this task.
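
    The "conventional state machine techniques" mentioned above can be sketched outside PVS as well. The fragment below models a hypothetical receiver-state principal function as an explicit transition table in Python; the state names and events are invented and do not reproduce the Shuttle GPS requirements.

        # Hypothetical state machine for a receiver-state principal function.
        # States and events are illustrative, not the actual Shuttle GPS CR content.
        TRANSITIONS = {
            ("INIT", "power_on"): "ACQUIRING",
            ("ACQUIRING", "fix_obtained"): "TRACKING",
            ("TRACKING", "fix_lost"): "ACQUIRING",
            ("TRACKING", "power_off"): "INIT",
        }

        def step(state, event):
            """Return the next state, or stay put if the event is not allowed."""
            return TRANSITIONS.get((state, event), state)

        state = "INIT"
        for event in ["power_on", "fix_obtained", "fix_lost"]:
            state = step(state, event)
        assert state == "ACQUIRING"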

  14. The software development process at the Chandra X-ray Center

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Evans, Ian N.; Fabbiano, Giuseppina

    2008-08-01

    Software development for the Chandra X-ray Center Data System began in the mid 1990's, and the waterfall model of development was mandated by our documents. Although we initially tried this approach, we found that a process with elements of the spiral model worked better in our science-based environment. High-level science requirements are usually established by scientists, and provided to the software development group. We follow with review and refinement of those requirements prior to the design phase. Design reviews are conducted for substantial projects within the development team, and include scientists whenever appropriate. Development follows agreed upon schedules that include several internal releases of the task before completion. Feedback from science testing early in the process helps to identify and resolve misunderstandings present in the detailed requirements, and allows review of intangible requirements. The development process includes specific testing of requirements, developer and user documentation, and support after deployment to operations or to users. We discuss the process we follow at the Chandra X-ray Center (CXC) to develop software and support operations. We review the role of the science and development staff from conception to release of software, and some lessons learned from managing CXC software development for over a decade.

  15. ACES: Space shuttle flight software analysis expert system

    NASA Technical Reports Server (NTRS)

    Satterwhite, R. Scott

    1990-01-01

    The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.

  16. The Bartlesville System; TGISS Software Documentation.

    ERIC Educational Resources Information Center

    Roberts, Tommy L.; And Others

    TGISS (Total Guidance Information Support System) is an information storage and retrieval system specifically designed to meet the needs and requirements of a counselor in the Bartlesville Public School environment. The system, which is a combination of man/machine capabilities, includes the hardware and software necessary to extend the…

  17. Advanced Chemistry Collection, 2nd Edition

    NASA Astrophysics Data System (ADS)

    2001-11-01

    Software requirements are given in Table 3. Some programs have additional special requirements. Please see the individual program abstracts at JCE Online or the documentation included on the CD-ROM for more specific information. Table 3. General software requirements for the Advanced Chemistry Collection.

    Computer              System                        Other Software (Required by one or more programs)
    Mac OS compatible     System 7.6.1 or higher        Acrobat Reader (included); Mathcad; Mathematica; MacMolecule2; QuickTime 4; HyperCard Player
    Windows compatible    Windows 2000, 98, 95, NT 4    Acrobat Reader (included); Mathcad; Mathematica; PCMolecule2; QuickTime 4; HyperChem; Excel

    Literature Cited

    1. General Chemistry Collection, 5th ed.; J. Chem. Educ. Software, 2001, SP16.
    2. Advanced Chemistry Collection; J. Chem. Educ. Software, 2001, SP28.

  18. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic that taking the greatest advantage of the useful features available in a development system, while avoiding the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.
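
    The distinction between an atomic abstract type and a generically structured abstract type can be mirrored in other languages. Below is a Python analogy (not Ada, and not taken from the paper): an atomic abstract type hides its representation behind operations, while a generic container is parameterized by whatever element type it is instantiated with.

        # Python analogy of the Ada constructs discussed above; illustrative only.
        from dataclasses import dataclass, field
        from typing import Generic, List, TypeVar

        @dataclass(frozen=True)
        class Angle:                      # "atomic" abstract type: representation hidden behind operations
            _radians: float
            def plus(self, other: "Angle") -> "Angle":
                return Angle(self._radians + other._radians)

        T = TypeVar("T")

        @dataclass
        class Stack(Generic[T]):          # "generically structured" abstract type
            _items: List[T] = field(default_factory=list)
            def push(self, item: T) -> None:
                self._items.append(item)
            def pop(self) -> T:
                return self._items.pop()

        angles: Stack[Angle] = Stack()    # generic instantiated with the atomic type
        angles.push(Angle(0.5))
        angles.push(Angle(1.0))
        assert angles.pop().plus(Angle(0.0)) == Angle(1.0)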

  19. Communication Problems in Requirements Engineering: A Field Study

    NASA Technical Reports Server (NTRS)

    Al-Rawas, Amer; Easterbrook, Steve

    1996-01-01

    The requirements engineering phase of software development projects is characterized by the intensity and importance of communication activities. During this phase, the various stakeholders must be able to communicate their requirements to the analysts, and the analysts need to be able to communicate the specifications they generate back to the stakeholders for validation. This paper describes a field investigation into the problems of communication between disparate communities involved in the requirements specification activities. The results of this study are discussed in terms of their relation to three major communication barriers: (1) ineffectiveness of the current communication channels; (2) restrictions on expressiveness imposed by notations; and (3) social and organizational barriers. The results confirm that organizational and social issues have great influence on the effectiveness of communication. They also show that in general, end-users find the notations used by software practitioners to model their requirements difficult to understand and validate.

  20. Top Down Implementation Plan for system performance test software

    NASA Technical Reports Server (NTRS)

    Jacobson, G. N.; Spinak, A.

    1982-01-01

    The top down implementation plan used for the development of system performance test software during the Mark IV-A era is described. The plan is based upon the identification of the hierarchical relationship of the individual elements of the software design, the development of a sequence of functionally oriented demonstrable steps, the allocation of subroutines to the specific step where they are first required, and objective status reporting. The results are: determination of milestones, improved managerial visibility, better project control, and a successful software development.

  1. Master Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    A basic function of a computational grid such as the NASA Information Power Grid (IPG) is to allow users to execute applications on remote computer systems. The Globus Resource Allocation Manager (GRAM) provides this functionality in the IPG and many other grids at this time. While the functionality provided by GRAM clients is adequate, GRAM does not support useful features such as staging several sets of files, running more than one executable in a single job submission, and maintaining historical information about execution operations. This specification is intended to provide the environmental and software functional requirements for the IPG Job Manager V2.0 being developed by AMTI for NASA.
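
    The GRAM gaps listed above (multiple file-staging sets, more than one executable per submission, execution history) suggest a job description along the following lines. This is a hypothetical sketch of such a request, not the actual IPG Job Manager V2.0 interface or the GRAM API.

        # Hypothetical job description covering the features the specification calls for.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class StagingSet:
            source: str            # e.g. a local path or URL
            destination: str       # path on the remote system

        @dataclass
        class Step:
            executable: str
            arguments: List[str] = field(default_factory=list)

        @dataclass
        class JobRequest:
            stage_in: List[StagingSet] = field(default_factory=list)
            steps: List[Step] = field(default_factory=list)      # more than one executable per job
            stage_out: List[StagingSet] = field(default_factory=list)
            history: List[str] = field(default_factory=list)     # records of execution operations

        job = JobRequest(
            stage_in=[StagingSet("input.dat", "/scratch/input.dat")],
            steps=[Step("preprocess", ["/scratch/input.dat"]), Step("solver", ["-n", "64"])],
            stage_out=[StagingSet("/scratch/result.dat", "result.dat")],
        )
        job.history.append("submitted")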

  2. Interface Specifications for the A-7E Shared Services Module.

    DTIC Science & Technology

    1982-09-08

    To illustrate the principles, the onboard software for the Navy’s A-7E aircraft will be redesigned and rewritten. The Shared Services module provides...purpose of the Shared Services module is to allow the remainder of the software to remain unchanged when the requirements-based rules for these values and...services change. This report describes the modular structure of the Shared Services module, and contains the abstract interface specifications for all

  3. Workflow-Based Software Development Environment

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  4. Requirements Specification Language (RSL) and supporting tools

    NASA Technical Reports Server (NTRS)

    Frincke, Deborah; Wolber, Dave; Fisher, Gene; Cohen, Gerald C.

    1992-01-01

    This document describes a general purpose Requirement Specification Language (RSL). RSL is a hybrid of features found in several popular requirement specification languages. The purpose of RSL is to describe precisely the external structure of a system comprised of hardware, software, and human processing elements. To overcome the deficiencies of informal specification languages, RSL includes facilities for mathematical specification. Two RSL interface tools are described. The Browser view contains a complete document with all details of the objects and operations. The Dataflow view is a specialized, operation-centered depiction of a specification that shows how specified operations relate in terms of inputs and outputs.
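
    The Dataflow view described above relates specified operations through their inputs and outputs. A rough, hypothetical rendering of that idea (not the actual RSL syntax or tooling) is sketched below, with operation records linked wherever an output of one operation is an input of another.

        # Hypothetical operation records in the spirit of an RSL Dataflow view.
        from dataclasses import dataclass
        from typing import FrozenSet

        @dataclass(frozen=True)
        class Operation:
            name: str
            inputs: FrozenSet[str]
            outputs: FrozenSet[str]

        ops = [
            Operation("MeasureAltitude", frozenset({"radar_return"}), frozenset({"altitude"})),
            Operation("ComputeWarning", frozenset({"altitude", "threshold"}), frozenset({"warning"})),
        ]

        def feeds(a: Operation, b: Operation) -> bool:
            """True if some output of a is an input of b (a dataflow edge)."""
            return bool(a.outputs & b.inputs)

        assert feeds(ops[0], ops[1])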

  5. Medicine clerkships and portable computing: a national survey of internal medicine clerkship directors.

    PubMed

    Ferenchick, Gary; Solomon, David; Durning, Steven J

    2010-01-01

    Portable computers are widely used by medical trainees, but there is a lack of data on how these devices are used in clinical education programs. The objective is to define the current use of portable computing in internal medicine clerkships and to determine medicine clerkship directors' perceptions of the current value and future importance of portable computing. A 2006 national survey of institutional members of the Clerkship Directors in Internal Medicine. Eighty-three of 110 (75%) institutional members responded. An institutional requirement for portable computing was reported by 32 schools (39%), whereas only 13 (16%) provided students with a portable computer. Between 10 and 31 institutions (12-37%) reported student use for patient care activities (i.e. order entry, writing patient notes) and only 2 to 4 institutions (2-5%) required such use. The majority of respondents (59-95%) reported portable computer use for educational activities (i.e., tracking patient problems, knowledge resource), however, only in 5 to 19 (6-23%) were such educational uses required. Fifty-six respondents (68%) reported that portable computers "added value" for teaching and 61 (73%) reported that portable computers would be important in meeting clerkship objectives in the next 3 years. Of interest, even among the institutions requiring portable computers, only 50% recommended or required specific software. Portable computing is required at 39% of allopathic medical schools in the United States. However, required portable computing for specific patient care or educational tasks is uncommon. In addition, guidance on specific software exists in only one half of schools requiring portable computers, suggesting informal or unstructured uses of required portable computers in the remaining half. The educational impact of formal institutional requirements for software versus informal "user-defined" applications is unknown.

  6. Modelling and Implementation of Catalogue Cards Using FreeMarker

    ERIC Educational Resources Information Center

    Radjenovic, Jelen; Milosavljevic, Branko; Surla, Dusan

    2009-01-01

    Purpose: The purpose of this paper is to report on a study involving the specification (using Unified Modelling Language (UML) 2.0) of information requirements and implementation of the software components for generating catalogue cards. The implementation in a Java environment is developed using the FreeMarker software.…

  7. Hypoxia, Monitoring, and Mitigation System

    DTIC Science & Technology

    2015-08-01

    Oxygen Saturation Measured via Pulse Oximeter; SRS - Software Requirements Specification; SW - Software; TI - Texas Instruments; uPROC - Micro-Processor; USAARL... Figure 1: Pulse OX custom module... Tasks 3, 4 and 5 have not been exercised. Sensor definition testing continued on the custom pulse-ox design. Additional refinement on the pulse

  8. 7 CFR 1753.6 - Standards, specifications, and general requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE TELECOMMUNICATIONS SYSTEM CONSTRUCTION POLICIES AND PROCEDURES... subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...

  9. 50 CFR 300.45 - Vessel Monitoring System.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... installers may be obtained from the Regional Administrator or the Administrator. (d) Hardware and software specifications. The VMS unit installed and carried on board a vessel to comply with the requirements of this section must consist of hardware and software that is approved by the Administrator and approved by NMFS...

  10. 50 CFR 300.45 - Vessel Monitoring System.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... installers may be obtained from the Regional Administrator or the Administrator. (d) Hardware and software specifications. The VMS unit installed and carried on board a vessel to comply with the requirements of this section must consist of hardware and software that is approved by the Administrator and approved by NMFS...

  11. 50 CFR 300.45 - Vessel Monitoring System.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... installers may be obtained from the Regional Administrator or the Administrator. (d) Hardware and software specifications. The VMS unit installed and carried on board a vessel to comply with the requirements of this section must consist of hardware and software that is approved by the Administrator and approved by NMFS...

  12. An Improved Suite of Object Oriented Software Measures

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.

    1997-01-01

    In the pursuit of ever increasing productivity, the need to be able to measure specific aspects of software is generally agreed upon. As object oriented programming languages are becoming more and more widely used, metrics specifically designed for object oriented software are required. In recent years there has been an explosion of new, object oriented software metrics proposed in the literature. Unfortunately, many or most of these proposed metrics have not been validated to measure what they claim to measure. In fact, an analysis of many of these metrics shows that they do not satisfy basic properties of measurement theory, and thus their application has to be suspect. In this paper ten improved metrics are proposed and are validated using measurement theory.

  13. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application as well as the constraints on the software design are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  14. Generalized Information Architecture for Managing Requirements in IBM's Rational DOORS® Application.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner

    When a requirements engineering effort fails to meet expectations, oftentimes the requirements management tool is blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool. Version 1.0, 4 November 201

  15. The Effective Use of System and Software Architecture Standards for Software Technology Readiness Assessments

    DTIC Science & Technology

    2011-05-01

    IEC 42010 Technology Viewpoint • Case Study - Multimedia Conferencing System - Technology Specification • Risks of Software TRL Determination... fully support the required threshold functionality. • Relevant Environment for Space - A satellite from launch to standard operation in space is... Analytical and experimental critical function and/or characteristic proof of concept (TRL 3, TRL 4)... Technology concept and/or application

  16. Prototyping with Data Dictionaries for Requirements Analysis.

    DTIC Science & Technology

    1985-03-01

    statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in

  17. GEOSAT Follow-On (GFO) Altimeter Document Series. Volume 5; Version 1; GFO Radar Altimeter Processing at Wallops Flight Facility

    NASA Technical Reports Server (NTRS)

    Lockwood, Dennis W.; Conger, A. M.

    2003-01-01

    This document is a compendium of the WFF GFO Software Development Team's knowledge regarding GFO CAL/VAL Data. It includes many elements of a requirements document, a software specification document, a software design document, and a user's guide. In the more technical sections, this document assumes the reader is familiar with GFO and its CAL/VAL Data.

  18. Automatic programming of simulation models

    NASA Technical Reports Server (NTRS)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1990-01-01

    The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler define the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
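
    The "problem specification followed by an automatic code generator" pattern described above can be illustrated with a toy template generator. The specification fields and the emitted text below are hypothetical; the actual AMPS system targets discrete event simulation code for the manufacturing domain.

        # Toy generator: turn a small problem specification into simulation source text.
        # The specification schema and output format are invented for illustration.
        spec = {
            "model": "assembly_line",
            "stations": ["drill", "paint", "inspect"],
            "arrival_rate_per_hour": 12,
        }

        def generate(spec):
            lines = [f"# auto-generated model: {spec['model']}",
                     f"ARRIVALS rate={spec['arrival_rate_per_hour']}/hour"]
            for station in spec["stations"]:
                lines.append(f"STATION {station} queue=FIFO")
            return "\n".join(lines)

        print(generate(spec))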

  19. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g., mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost effective means to assure safe software within a safe system.

  20. 21 CFR 820.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... specified DMR requirements before it is released for distribution. (y) Specification means any requirement... for distribution. (c) Component means any raw material, substance, piece, part, software, firmware... physical and performance requirements of a device that are used as a basis for device design. (g) Design...

  1. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  2. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  3. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

    Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
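
    The plug-in idea described above reduces to a registry that the host display consults at run time. The sketch below is a generic Python illustration of that pattern, not the actual Geotool or .NET interfaces; the plug-in name and filter class are hypothetical.

        # Generic plug-in registry sketch; names are illustrative, not Geotool's API.
        PLUGINS = {}

        def register(name):
            def wrap(cls):
                PLUGINS[name] = cls
                return cls
            return wrap

        @register("bandpass")
        class BandpassFilter:
            def process(self, samples):
                # placeholder for a real filter; here the data is passed through unchanged
                return list(samples)

        def run_plugin(name, samples):
            return PLUGINS[name]().process(samples)

        print(run_plugin("bandpass", [0.1, -0.2, 0.05]))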

  4. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine

    PubMed Central

    2011-01-01

    Background Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation, however smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and complexity of processing procedures involved, especially with the use of high throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample related data within a translational medicine research setting, especially where limited IT support is available. Results We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using Oracle 11 g database and the Python framework, Django is presented. Conclusions By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype. PMID:21569484

  5. LimsPortal and BonsaiLIMS: development of a lab information management system for translational medicine.

    PubMed

    Bath, Timothy G; Bozdag, Selcuk; Afzal, Vackar; Crowther, Daniel

    2011-05-13

    Laboratory Information Management Systems (LIMS) are an increasingly important part of modern laboratory infrastructure. As typically very sophisticated software products, LIMS often require considerable resources to select, deploy and maintain. Larger organisations may have access to specialist IT support to assist with requirements elicitation and software customisation, however smaller groups will often have limited IT support to perform the kind of iterative development that can resolve the difficulties that biologists often have when specifying requirements. Translational medicine aims to accelerate the process of treatment discovery by bringing together multiple disciplines to discover new approaches to treating disease, or novel applications of existing treatments. The diverse set of disciplines and complexity of processing procedures involved, especially with the use of high throughput technologies, bring difficulties in customizing a generic LIMS to provide a single system for managing sample related data within a translational medicine research setting, especially where limited IT support is available. We have designed and developed a LIMS, BonsaiLIMS, around a very simple data model that can be easily implemented using a variety of technologies, and can be easily extended as specific requirements dictate. A reference implementation using Oracle 11 g database and the Python framework, Django is presented. By focusing on a minimal feature set and a modular design we have been able to deploy the BonsaiLIMS system very quickly. The benefits to our institute have been the avoidance of the prolonged implementation timescales, budget overruns, scope creep, off-specifications and user fatigue issues that typify many enterprise software implementations. The transition away from using local, uncontrolled records in spreadsheet and paper formats to a centrally held, secured and backed-up database brings the immediate benefits of improved data visibility, audit and overall data quality. The open-source availability of this software allows others to rapidly implement a LIMS which in itself might sufficiently address user requirements. In situations where this software does not meet requirements, it can serve to elicit more accurate specifications from end-users for a more heavyweight LIMS by acting as a demonstrable prototype.
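
    The "very simple data model" the authors describe can be conveyed with a minimal sketch. The classes below are hypothetical stand-ins (plain Python rather than the Oracle/Django reference implementation) for a sample record with an audit trail of events.

        # Minimal, hypothetical LIMS-style data model: a sample plus its event history.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import List

        @dataclass
        class Event:
            timestamp: datetime
            description: str

        @dataclass
        class Sample:
            sample_id: str
            material: str
            events: List[Event] = field(default_factory=list)

            def log(self, description: str) -> None:
                self.events.append(Event(datetime.utcnow(), description))

        s = Sample("S-0001", "plasma")
        s.log("received")
        s.log("aliquoted into 3 tubes")
        print(s.sample_id, [e.description for e in s.events])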

  6. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  7. SSMNG Software Service Manager: A Scalable Building Blocks Architecture for PUS Services & FDIR Management

    NASA Astrophysics Data System (ADS)

    Lisio, Giovanni; Candia, Sante; Campolo, Giovanni; Pascucci, Dario

    2011-08-01

    Thales Alenia Space Italy has carried out the definition of a configurable (on a per-mission basis) PUS ECSS-E_70-41A (see [3]) Centralised Services Layer, characterised by: a mission-independent set of 'classes' implementing the services logic, and a mission-dependent set of configuration data and selection flags. The software components belonging to this layer implement the PUS standard services of ECSS-E_70-41A and a set of mission-specific services. The design of this layer has been performed by separating the services mechanisms (mission-independent execution logic) from the services configuration information (mission-dependent data). Once instantiated for a specific mission, the PUS Centralised Services Layer offers a large set of capabilities available to the CSCI's Applications Layer. This paper describes the building-blocks PUS architectural solution developed by Thales Alenia Space Italy, emphasizing the mechanisms which allow easy configuration of the Scalable PUS library to fulfill the requirements of different missions. The paper also focuses on the Thales Alenia Space solution for automatically generating the mission-specific "PUS Services" flight software based on mission-specific requirements. Building the PUS services mechanisms, which are configurable on a mission basis, is part of the PRIMA (Multipurpose Spacecraft Bus) 'missionisation' process improvement. The PRIMA Platform Avionics Software (ASW) is continuously evolving to improve modularity and standardization of interfaces and of SW components (see references in [1]).

  8. Software Requirements Analysis as Fault Predictor

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but we do little to correlate this information either to system assurance activities or to long-term reliability projections - both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, requirements quality information can be measured against code complexity and test data for the same system to predict the specific software modules containing high-impact or deeply embedded faults that now escape into operational systems. Such knowledge would lead to more effective and efficient test programs. It may enable insight into whether a program should be maintained or started over.
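
    One simple way to combine requirements-quality measures with code complexity, in the spirit of the idea above, is a per-module risk score used to rank where faults are most likely hiding. The metric names and weights below are purely illustrative assumptions, not results from the ARM work.

        # Illustrative risk ranking: modules with weak requirements and complex code first.
        # Field names and weights are assumptions made for this sketch only.
        modules = [
            {"name": "nav_filter", "ambiguous_reqs": 7, "cyclomatic": 42},
            {"name": "telemetry_fmt", "ambiguous_reqs": 1, "cyclomatic": 55},
            {"name": "ui_panel", "ambiguous_reqs": 2, "cyclomatic": 8},
        ]

        def risk(m, w_req=2.0, w_cc=1.0):
            return w_req * m["ambiguous_reqs"] + w_cc * m["cyclomatic"]

        for m in sorted(modules, key=risk, reverse=True):
            print(f"{m['name']:>14}  risk={risk(m):5.1f}")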

  9. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, the execution performance is an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time consuming and the frustrating aspects of software engineering. The verification cannot be waived for the programs used for critical applications such as, military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. The definition, or what is understood by automatic programming, has been changed with our expectations. At present, the goal of automatic programming is the automation of programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  10. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  11. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RIECK, C.A.

    1999-02-23

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization.

  12. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MD require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer already at the initial stages of the software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards. This implies the performance of highly iterative processes that have to integrate risk management into the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  13. Software Productivity of Field Experiments Using the Mobile Agents Open Architecture with Workflow Interoperability

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Lowry, Michael R.; Nado, Robert Allen; Sierhuis, Maarten

    2011-01-01

    We analyzed a series of ten systematically developed surface exploration systems that integrated a variety of hardware and software components. Design, development, and testing data suggest that incremental buildup of an exploration system for long-duration capabilities is facilitated by an open architecture with appropriate-level APIs, specifically designed to facilitate integration of new components. This improves software productivity by reducing changes required for reconfiguring an existing system.

  14. Shuttle mission simulator requirement report, volume 2, revision A

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    The training requirements of all mission phases for crews and ground support personnel are presented. The specifications are given for the design and development of the simulator, data processing systems, engine control, software, and systems integration.

  15. Using SFOC to fly the Magellan Venus mapping mission

    NASA Technical Reports Server (NTRS)

    Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.

    1993-01-01

    Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.

  16. FermiLib v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCCLEAN, JARROD; HANER, THOMAS; STEIGER, DAMIAN

    FermiLib is an open source software package designed to facilitate the development and testing of algorithms for simulations of fermionic systems on quantum computers. Fermionic simulations represent an important application of early quantum devices with a lot of potential high value targets, such as quantum chemistry for the development of new catalysts. This software strives to provide a link between the required domain expertise in specific fermionic applications and quantum computing to enable more users to directly interface with, and develop for, these applications. It is an extensible Python library designed to interface with the high performance quantum simulator, ProjectQ, as well as application specific software such as PSI4 from the domain of quantum chemistry. Such software is key to enabling effective user facilities in quantum computation research.

  17. AMPHION: Specification-based programming for scientific subroutine libraries

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark

    1994-01-01

    AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive, domain-oriented notation for creating a specification that also facilitates reuse and modification. AMPHION's architecture is domain independent. AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.

  18. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  19. Analysis and specification tools in relation to the APSE

    NASA Technical Reports Server (NTRS)

    Hendricks, John W.

    1986-01-01

    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  20. Software reuse issues affecting AdaNET

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1989-01-01

    The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to developing AdaNet. A three-phased strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based; this would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.

  1. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
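
    The following hypothetical fragment illustrates the general idea of translating a tabular specification into textual controller logic; the table fields, keywords, and output format are invented for illustration and are not the actual LCS DSL or its ladder-logic output.

    ```python
    # Hypothetical illustration of a tabular test-sequence specification being translated
    # into a textual instruction list; names and format are invented, not the KSC/LCS tool.
    tabular_spec = [
        {"step": 1, "command": "OPEN_VALVE",  "end_item": "LOX_FILL_VLV", "verify": "VLV_OPEN_IND"},
        {"step": 2, "command": "CHECK_LIMIT", "end_item": "TANK_PRESS",   "verify": "PRESS_LT_50PSI"},
    ]

    def generate_rungs(spec):
        """Emit one textual 'rung' per specification row (a stand-in for real ladder logic)."""
        for row in spec:
            yield f"RUNG {row['step']:03d}: IF {row['verify']} THEN {row['command']}({row['end_item']})"

    for rung in generate_rungs(tabular_spec):
        print(rung)
    ```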

  2. Neutron imaging data processing using the Mantid framework

    NASA Astrophysics Data System (ADS)

    Pouzols, Federico M.; Draper, Nicholas; Nagella, Sri; Yang, Erica; Sajid, Ahmed; Ross, Derek; Ritchie, Brian; Hill, John; Burca, Genoveva; Minniti, Triestino; Moreton-Smith, Christopher; Kockelmann, Winfried

    2016-09-01

    Several imaging instruments are currently being constructed at neutron sources around the world. The Mantid software project provides an extensible framework that supports high-performance computing for data manipulation, analysis and visualisation of scientific data. At ISIS, IMAT (Imaging and Materials Science & Engineering) will offer unique time-of-flight neutron imaging techniques which impose several software requirements to control the data reduction and analysis. Here we outline the extensions currently being added to Mantid to provide specific support for neutron imaging requirements.

  3. Statewide test of construction quality index for pavement software : final report, October 2008.

    DOT National Transportation Integrated Search

    2008-10-01

    All Florida Department of Transportation (FDOT) pavement projects are accepted in accordance with : one or more construction specifications. The purposes of these specifications are to provide guidance : and establish minimum requirements that enable...

  4. ARIES: Acquisition of Requirements and Incremental Evolution of Specifications

    NASA Technical Reports Server (NTRS)

    Roberts, Nancy A.

    1993-01-01

    This paper describes a requirements/specification environment specifically designed for large-scale software systems. This environment is called ARIES (Acquisition of Requirements and Incremental Evolution of Specifications). ARIES provides assistance to requirements analysts for developing operational specifications of systems. This development begins with the acquisition of informal system requirements. The requirements are then formalized and gradually elaborated (transformed) into formal and complete specifications. ARIES provides guidance to the user in validating formal requirements by translating them into natural language representations and graphical diagrams. ARIES also provides ways of analyzing the specification to ensure that it is correct, e.g., testing the specification against a running simulation of the system to be built. Another important ARIES feature, especially when developing large systems, is the sharing and reuse of requirements knowledge. This leads to much less duplication of effort. ARIES combines all of its features in a single environment that makes the process of capturing a formal specification quicker and easier.

  5. Using Combined SFTA and SFMECA Techniques for Space Critical Software

    NASA Astrophysics Data System (ADS)

    Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.

    2012-01-01

    This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and as future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such approach was conducted on system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to inclusion of new functional and non-functional system software requirements.

  6. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA-center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically has the source code repositories and other vital information and settings included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.

  7. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault-and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output, system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  8. Tools reference manual for a Requirements Specification Language (RSL), version 2.0

    NASA Technical Reports Server (NTRS)

    Fisher, Gene L.; Cohen, Gerald C.

    1993-01-01

    This report describes a general-purpose Requirements Specification Language, RSL. The purpose of RSL is to specify precisely the external structure of a mechanized system and to define requirements that the system must meet. A system can be comprised of a mixture of hardware, software, and human processing elements. RSL is a hybrid of features found in several popular requirements specification languages, such as SADT (Structured Analysis and Design Technique), PSL (Problem Statement Language), and RMF (Requirements Modeling Framework). While languages such as these have useful features for structuring a specification, they generally lack formality. To overcome the deficiencies of informal requirements languages, RSL has constructs for formal mathematical specification. These constructs are similar to those found in formal specification languages such as EHDM (Enhanced Hierarchical Development Methodology), Larch, and OBJ3.

  9. Model-Driven Architecture for Agent-Based Systems

    NASA Technical Reports Server (NTRS)

    Gradanin, Denis; Singh, H. Lally; Bohner, Shawn A.; Hinchey, Michael G.

    2004-01-01

    The Model Driven Architecture (MDA) approach uses a platform-independent model to define system functionality, or requirements, using some specification language. The requirements are then translated to a platform-specific model for implementation. An agent architecture based on the human cognitive model of planning, the Cognitive Agent Architecture (Cougaar) is selected for the implementation platform. The resulting Cougaar MDA prescribes certain kinds of models to be used, how those models may be prepared and the relationships of the different kinds of models. Using the existing Cougaar architecture, the level of application composition is elevated from individual components to domain level model specifications in order to generate software artifacts. The software artifacts generation is based on a metamodel. Each component maps to a UML structured component which is then converted into multiple artifacts: Cougaar/Java code, documentation, and test cases.

  10. Computing and software

    USGS Publications Warehouse

    White, Gary C.; Hines, J.E.

    2004-01-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture–recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease–of–use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user–discussion groups to share knowledge and ideas. (A starting point for wildlife–related applications is http://www.phidot.org.)

  11. HelioScan: a software framework for controlling in vivo microscopy setups with high hardware flexibility, functional diversity and extendibility.

    PubMed

    Langer, Dominik; van 't Hoff, Marcel; Keller, Andreas J; Nagaraja, Chetan; Pfäffli, Oliver A; Göldi, Maurice; Kasper, Hansjörg; Helmchen, Fritjof

    2013-04-30

    Intravital microscopy such as in vivo imaging of brain dynamics is often performed with custom-built microscope setups controlled by custom-written software to meet specific requirements. Continuous technological advancement in the field has created a need for new control software that is flexible enough to support the biological researcher with innovative imaging techniques and provide the developer with a solid platform for quickly and easily implementing new extensions. Here, we introduce HelioScan, a software package written in LabVIEW, as a platform serving this dual role. HelioScan is designed as a collection of components that can be flexibly assembled into microscope control software tailored to the particular hardware and functionality requirements. Moreover, HelioScan provides a software framework, within which new functionality can be implemented in a quick and structured manner. A specific HelioScan application assembles at run-time from individual software components, based on user-definable configuration files. Due to its component-based architecture, HelioScan can exploit synergies of multiple developers working in parallel on different components in a community effort. We exemplify the capabilities and versatility of HelioScan by demonstrating several in vivo brain imaging modes, including camera-based intrinsic optical signal imaging for functional mapping of cortical areas, standard two-photon laser-scanning microscopy using galvanometric mirrors, and high-speed in vivo two-photon calcium imaging using either acousto-optic deflectors or a resonant scanner. We recommend HelioScan as a convenient software framework for the in vivo imaging community. Copyright © 2013 Elsevier B.V. All rights reserved.
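
    A minimal sketch of the run-time, configuration-driven assembly idea described above; HelioScan itself is written in LabVIEW, so the Python registry and component names below are purely illustrative assumptions.

    ```python
    # Illustrative only: assemble an imaging setup from user-definable configuration data,
    # mirroring the component-based idea described in the abstract (not HelioScan code).
    COMPONENT_REGISTRY = {
        "galvo_scanner": lambda: "galvanometric scan engine",
        "resonant_scanner": lambda: "resonant scan engine",
        "camera": lambda: "intrinsic-signal camera acquisition",
    }

    def assemble_microscope(config):
        """Instantiate only the components named in the configuration file."""
        return {name: COMPONENT_REGISTRY[name]() for name in config["components"]}

    setup = assemble_microscope({"components": ["galvo_scanner", "camera"]})
    print(setup)
    ```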

  12. Reducing Lifecycle Sustainment Costs

    DTIC Science & Technology

    2015-05-01

    ...ahead of government systems; specific O&S needs in government: depots, software centers, VAMOSC/ERP interfaces; implications of ERP systems...funding is not allocated for its implementation; Technology Refresh often requires non-recurring engineering investment, but the Working Capital Funds...; VAMOSC systems; Cost and Software Data Reports (CSDRs); Contractor Logistics Support contracts, including subcontractor reporting; effects of...

  13. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases, interchangeability of minicomputers, software, storage, and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  14. 7 CFR 1753.6 - Standards, specifications, and general requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...

  15. 7 CFR 1753.6 - Standards, specifications, and general requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...

  16. 7 CFR 1753.6 - Standards, specifications, and general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...

  17. 7 CFR 1753.6 - Standards, specifications, and general requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... compliant, as defined in 7 CFR 1735.22(e). (d) All materials and equipment financed with loan funds are subject to the “Buy American” provision (7 U.S.C. 901 et seq. as amended in 1938). (e) All software, software systems, and firmware financed with loan funds must be year 2000 compliant, as defined in 7 CFR...

  18. HAL/S language specification

    NASA Technical Reports Server (NTRS)

    Newbold, P. M.

    1974-01-01

    A programming language for the flight software of the NASA space shuttle program was developed and identified as HAL/S. The language is intended to satisfy virtually all of the flight software requirements of the space shuttle. The language incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks.

  19. Self-service for software development projects and HPC activities

    NASA Astrophysics Data System (ADS)

    Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.

    2014-05-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source developments such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.

  20. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  1. Pointo - a Low Cost Solution to Point Cloud Processing

    NASA Astrophysics Data System (ADS)

    Houshiar, H.; Winkler, S.

    2017-11-01

    With advances in technology, access to data, especially 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually consists of very large packages containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use, the difficulty stemming from the complicated user interfaces required to accommodate a large list of features. The aim of these complex packages is to provide a powerful tool for a specific group of specialists; however, they are not necessarily what the majority of up-and-coming average users of point clouds require. In addition to their complexity and high cost, such packages generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts, nor are they willing to pay the high acquisition costs of this software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, it focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at once. This simple, user-oriented design improves the user experience and lets us optimize our methods to create efficient software. We introduce the Pointo family as a series of connected programs that provide easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.

  2. Friendly Extensible Transfer Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.

    2016-04-15

    Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without the need for source code integration) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).

  3. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  4. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 3: Commands specification

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1990-01-01

    This volume (3 of 4) contains the specification for the command language for the AMPS system. The volume contains a requirements specification for the operating system and commands and a design specification for the operating system and commands. The operating system and commands sit on top of the protocol. The commands are an extension of the present set of AMPS commands in that the commands are more compact, allow multiple sub-commands to be bundled into one command, and have provisions for identifying the sender and the intended receiver. The commands make no change to the actual software that implements the commands.
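
    As a rough sketch of what a compact command that bundles sub-commands and identifies sender and receiver could look like, the following Python fragment is illustrative only; the field names and byte layout are assumptions, not the documented AMPS command format.

    ```python
    # Hypothetical command frame: one packet carrying several bundled sub-commands,
    # prefixed by sender and receiver identifiers (not the actual AMPS encoding).
    from dataclasses import dataclass, field

    @dataclass
    class SubCommand:
        opcode: int    # 0-255
        argument: int  # 0-255

    @dataclass
    class CommandFrame:
        sender_id: int
        receiver_id: int
        subcommands: list = field(default_factory=list)

        def encode(self) -> bytes:
            body = b"".join(bytes([sc.opcode, sc.argument]) for sc in self.subcommands)
            return bytes([self.sender_id, self.receiver_id, len(self.subcommands)]) + body

    frame = CommandFrame(sender_id=0x01, receiver_id=0x10,
                         subcommands=[SubCommand(0x21, 5), SubCommand(0x22, 0)])
    packet = frame.encode()  # single packet, two bundled sub-commands
    ```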

  5. 45 CFR 164.312 - Technical safeguards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... that maintain electronic protected health information to allow access only to those persons or software... specifications: (i) Unique user identification (Required). Assign a unique name and/or number for identifying and...

  6. Image manipulation software portable on different hardware platforms: what is the cost?

    NASA Astrophysics Data System (ADS)

    Ligier, Yves; Ratib, Osman M.; Funk, Matthieu; Perrier, Rene; Girard, Christian; Logean, Marianne

    1992-07-01

    A hospital-wide PACS project is currently under development at the University Hospital of Geneva. The visualization and manipulation of images provided by different imaging modalities constitute one of the most challenging components of a PACS. Because requirements differ depending on clinical usage, such visualization software must be provided on different types of workstations in different sectors of the PACS, and the user interface has to be the same regardless of the underlying workstation. Besides a standard set of image manipulation and processing tools, there is a need for more specific clinical tools that can be easily adapted to particular medical requirements. To achieve this, the software targets two operating and windowing systems, the standard Unix/X-11/OSF-Motif based workstations and the Macintosh family, and should be easily ported to other systems. This paper describes the design of such a system and discusses the extra cost and effort involved in developing portable and easily expandable software.

  7. Planning of electroporation-based treatments using Web-based treatment-planning software.

    PubMed

    Pavliha, Denis; Kos, Bor; Marčan, Marija; Zupanič, Anže; Serša, Gregor; Miklavčič, Damijan

    2013-11-01

    Electroporation-based treatment combining high-voltage electric pulses and poorly permeant cytotoxic drugs, i.e., electrochemotherapy (ECT), is currently used for treating superficial tumor nodules by following standard operating procedures. Besides ECT, another electroporation-based treatment, nonthermal irreversible electroporation (N-TIRE), is also efficient at ablating deep-seated tumors. To perform ECT or N-TIRE of deep-seated tumors, following standard operating procedures is not sufficient and patient-specific treatment planning is required for successful treatment. Treatment planning is required because of the use of individual long-needle electrodes and the diverse shape, size, and location of deep-seated tumors. Many institutions that already perform ECT of superficial metastases could benefit from treatment-planning software that would enable the preparation of patient-specific treatment plans. To this end, we have developed Web-based treatment-planning software for planning electroporation-based treatments that does not require prior engineering knowledge from the user (e.g., the clinician). The software includes algorithms for automatic tissue segmentation and, after segmentation, generation of a 3D model of the tissue. The procedure allows the user to define how the electrodes will be inserted. Finally, the electric field distribution is computed, the position of the electrodes and the voltage to be applied are optimized using the 3D model, and a downloadable treatment plan is made available to the user.

  8. Pipe Flow Simulation Software: A Team Approach to Solve an Engineering Education Problem.

    ERIC Educational Resources Information Center

    Engel, Renata S.; And Others

    1996-01-01

    A computer simulation program for use in the study of fluid mechanics is described. The package is an interactive tool to explore the fluid flow characteristics of a pipe system by manipulating the physical construction of the system. The motivation, software design requirements, and specific details on how its objectives were met are presented.…

  9. A Requirement Specification Language for AADL

    DTIC Science & Technology

    2016-06-01


  10. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.
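
    The two processing levels described above can be pictured with the sketch below; the calibration constants, correction terms, and derived quantity are placeholders, not the NOSS algorithms themselves.

    ```python
    # Placeholder two-level pipeline mirroring the level split described in the abstract.
    def level1_engineering_units(raw_counts, gain=0.05, offset=-2.0, instrument_correction=0.01):
        """Level 1: convert telemetry counts to engineering units and correct instrument effects."""
        return [(c * gain + offset) - instrument_correction for c in raw_counts]

    def level2_geophysical(ranges_m, orbit_height_m=800_000.0):
        """Level 2: derive a geophysical quantity (here, a crude sea-surface height) from ranges."""
        return [orbit_height_m - r for r in ranges_m]

    sea_surface_height = level2_geophysical(level1_engineering_units([15_999_840, 15_999_852]))
    ```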

  11. HAL/S-FC compiler system functional specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The functional requirements to be met by the HAL/S-FC compiler, and the hardware and software compatibilities between the compiler system and the environment in which it operates are defined. Associated runtime facilities and the interface with the Software Development Laboratory are specified. The construction of the HAL/S-FC system as functionally separate units and the interfaces between those units is described. An overview of the system's capabilities is presented and the hardware/operating system requirements are specified. The computer-dependent aspects of the HAL/S-FC are also specified. Compiler directives are included.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, J.E.

    Many robotic operations, e.g., mapping, scanning, feature following, etc., require accurate surface following of arbitrary targets. This paper presents a versatile surface following and mapping system designed to promote hardware, software and application independence, modular development, and upward expandability. These goals are met by: a full, a priori specification of the hardware and software interfaces; a modular system architecture; and a hierarchical surface-data analysis method, permitting application-specific tuning at each conceptual level of topological abstraction. This surface following system was fully designed independently of any specific robotic host, then successfully integrated with and demonstrated on a completely a priori unknown, real-time robotic system. 7 refs.

  13. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  14. 45 CFR 164.312 - Technical safeguards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... REQUIREMENTS SECURITY AND PRIVACY Security Standards for the Protection of Electronic Protected Health... persons or software programs that have been granted access rights as specified in § 164.308(a)(4). (2) Implementation specifications: (i) Unique user identification (Required). Assign a unique name and/or number for...

  15. Teaching meta-analysis using MetaLight.

    PubMed

    Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark

    2012-10-18

    Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
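
    For readers new to the computation such tools perform, the standard fixed-effect (inverse-variance) pooling of k study effect sizes is sketched below; this is the generic textbook formula, not a statement of MetaLight's specific implementation.

    ```latex
    % Fixed-effect (inverse-variance) pooling of k effect sizes y_i with within-study variances v_i.
    w_i = \frac{1}{v_i}, \qquad
    \hat{\theta} = \frac{\sum_{i=1}^{k} w_i\, y_i}{\sum_{i=1}^{k} w_i}, \qquad
    \operatorname{SE}(\hat{\theta}) = \sqrt{\frac{1}{\sum_{i=1}^{k} w_i}}
    ```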

  16. Toward domain-specific design environments: Some representation ideas from the telecommunications domain

    NASA Technical Reports Server (NTRS)

    Greenspan, Sol; Feblowitz, Mark

    1992-01-01

    ACME is an experimental environment for investigating new approaches to modeling and analysis of system requirements and designs. ACME is built on and extends object-oriented conceptual modeling techniques and knowledge representation and reasoning (KRR) tools. The most immediate intended use for ACME is to help represent, understand, and communicate system designs during the early stages of system planning and requirements engineering. While our research is ostensibly aimed at software systems in general, we are particularly motivated to make an impact in the telecommunications domain, especially in the area referred to as Intelligent Networks (IN's). IN systems contain the software to provide services to users of a telecommunications network (e.g., call processing services, information services, etc.) as well as the software that provides the internal infrastructure for providing the services (e.g., resource management, billing, etc.). The software includes not only systems developed by the network proprietors but also by a growing group of independent service software providers.

  17. Section 3: Quality and Value-Based Requirements

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Traditionally, research and practice in software engineering has focused its attention on specific software qualities, such as functionality and performance. According to this perspective, a system is deemed to be of good quality if it delivers all required functionality (“fitness-for-purpose”) and its performance is above required thresholds. Increasingly, primarily in research but also in practice, other qualities are attracting attention. To facilitate evolution, maintainability and adaptability are gaining popularity. Usability, universal accessibility, innovativeness, and enjoyability are being studied as novel types of non-functional requirements that we do not know how to define, let alone accommodate, but which we realize are critical under some contingencies. The growing importance of the business context in the design of software-intensive systems has also thrust economic value, legal compliance, and potential social and ethical implications into the forefront of requirements topics. A focus on the broader user environment and experience, as well as the organizational and societal implications of system use, thus has become more central to the requirements discourse. This section includes three contributions to this broad and increasingly important topic.

  18. Software Requirements Engineering Methodology

    DTIC Science & Technology

    1976-09-01

    common speech, so that the specification can be read by managers, systems engineers, and others who are not specially trained in the language. To...of the system and its DPS. They are usually implicit in the wording of the originating specifications, although the new SREM user must train ...to the name of the ENTITY_CLASS, the operation is applicable only to a single instance. This concentration of the requirements for creation and

  19. The Implementation of Satellite Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites: Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission-unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high-level requirements that governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection. The paper addresses the benefits of the OOD versus a conventional procedural design. The final discussion in this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects, saving production time and costs.
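
    As a generic illustration of the object-oriented ideas mentioned above (encapsulation, inheritance, and polymorphism applied to sensor data processing), the Python sketch below is illustrative only; the class names and numbers are invented and do not represent the actual SMEX-Lite AFC library.

    ```python
    # Illustrative only: a polymorphic sensor interface keeps mission-unique details
    # out of the control loop (not the SMEX-Lite flight code).
    from abc import ABC, abstractmethod

    class AttitudeSensor(ABC):
        """Common interface: the control loop never needs to know the concrete sensor type."""

        @abstractmethod
        def read_body_rates(self):
            """Return body rates (rad/s) about the x, y, z axes."""

    class Gyro(AttitudeSensor):
        def read_body_rates(self):
            return (0.001, -0.002, 0.0005)  # stand-in telemetry

    class SunSensorDerivedRates(AttitudeSensor):
        def read_body_rates(self):
            return (0.0, 0.0, 0.0)  # coarse estimate when gyros are unavailable

    def control_step(sensor: AttitudeSensor, gain: float = 0.1):
        """Polymorphic call site: swapping sensor implementations requires no change here."""
        wx, wy, wz = sensor.read_body_rates()
        return (-gain * wx, -gain * wy, -gain * wz)  # simple rate-damping torque command

    torque = control_step(Gyro())
    ```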

  20. Interactive Web Graphs with Fewer Restrictions

    NASA Technical Reports Server (NTRS)

    Fiedler, James

    2012-01-01

    Interactive statistical web graphs, and programs to generate them, are growing in popularity. However, these programs tend to be somewhat restricted in which web browsers and statistical software they support. For example, the software might use SVG (e.g., Protovis, gridSVG) or the HTML canvas, both of which exclude most versions of Internet Explorer, or the software might be made specifically for R (gridSVG, cranvas), thus excluding users of other statistical software. There are more general tools (d3, RaphaëlJS) which are compatible with most browsers, but using one of these to make statistical graphs requires more coding than is probably desired, and requires learning a new tool. This talk will present a method for making interactive web graphs which, by design, attempts to support as many browsers and as many statistical programs as possible, while also aiming to be relatively easy to use and relatively easy to extend.

  1. DigiSeis—A software component for digitizing seismic signals using the PC sound card

    NASA Astrophysics Data System (ADS)

    Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar

    2012-06-01

    An innovative software-based approach to developing an inexpensive experimental seismic recorder is presented. This approach requires no additional hardware, as the built-in PC sound card is used for digitization of seismic signals. DigiSeis, an ActiveX component, is developed to capture the digitized seismic signals from the sound card and deliver them to applications for processing and display. A seismic recorder application, SeisWave, is developed on top of this component; it provides real-time monitoring and display of seismic events picked up by a pair of external geophones. This recorder can be used as an educational aid for conducting seismic experiments. It can also be connected to suitable seismic sensors to record earthquakes. The software application and the ActiveX component are available for download. This component can be used to develop seismic recording applications according to user-specific requirements.
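    For readers unfamiliar with sound-card digitization, the minimal Python sketch below shows the general idea under stated assumptions: it uses the third-party PyAudio package, and the sample rate, channel mapping, and chunk size are placeholders rather than DigiSeis parameters.

      # Minimal sound-card capture sketch, assuming the PyAudio package is installed;
      # rates and channel assignments are hypothetical, not DigiSeis's.
      import array
      import pyaudio

      RATE = 44100        # samples per second per channel
      CHANNELS = 2        # e.g., one geophone per stereo channel
      CHUNK = 1024        # frames per blocking read

      pa = pyaudio.PyAudio()
      stream = pa.open(format=pyaudio.paInt16, channels=CHANNELS,
                       rate=RATE, input=True, frames_per_buffer=CHUNK)

      try:
          for _ in range(10):                      # capture roughly 0.25 s of signal
              raw = stream.read(CHUNK)
              samples = array.array("h", raw)      # 16-bit signed integers
              left, right = samples[0::2], samples[1::2]
              print(max(left), max(right))         # crude per-chunk peak monitor
      finally:
          stream.stop_stream()
          stream.close()
          pa.terminate()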

  2. Non-standard analysis and embedded software

    NASA Technical Reports Server (NTRS)

    Platek, Richard

    1995-01-01

    One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.
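    As a concrete, hypothetical illustration of a requirement "expressed in the real geometry of space", such a property might be written as a separation constraint, for instance:

      % Hypothetical "Real Space" requirement: the controlled effector x(t) must keep
      % at least a minimum clearance d_min from every point of an obstacle region O.
      \forall t \ge 0:\qquad \min_{p \in O} \lVert x(t) - p \rVert \;\ge\; d_{\min}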

  3. 42 CFR 495.106 - Incentive payments to CAHs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... PROGRAM Requirements Specific to the Medicare Program § 495.106 Incentive payments to CAHs. (a... computers and associated hardware and software, necessary to administer certified EHR technology as defined... determining if a CAH is a qualifying CAH under this section; (3) Specification of EHR reporting periods, cost...

  4. 42 CFR 495.106 - Incentive payments to CAHs.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... PROGRAM Requirements Specific to the Medicare Program § 495.106 Incentive payments to CAHs. (a... computers and associated hardware and software, necessary to administer certified EHR technology as defined... determining if a CAH is a qualifying CAH under this section; (3) Specification of EHR reporting periods, cost...

  5. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  6. Specification Technology Guidelines.

    DTIC Science & Technology

    1985-08-01

    Chicago, Ill., October 27-31, 1980. Marca, D. and D. Thornhill, "Modeling Software Configurability Requirements," in Requirements Engineering...Environments, ed. Y. Ohno, pp. 51-58, North-Holland Publishing Company, 1982. Marca, D. and C. McGowan, "Static and Dynamic Data Modeling for

  7. Computer program for design and performance analysis of navigation-aid power systems. Program documentation. Volume 1: Software requirements document

    NASA Technical Reports Server (NTRS)

    Goltz, G.; Kaiser, L. M.; Weiner, H.

    1977-01-01

    A computer program has been developed for designing and analyzing the performance of solar array/battery power systems for the U.S. Coast Guard Navigational Aids. This program is called the Design Synthesis/Performance Analysis (DSPA) Computer Program. The basic function of the Design Synthesis portion of the DSPA program is to evaluate functional and economic criteria to provide specifications for viable solar array/battery power systems. The basic function of the Performance Analysis portion of the DSPA program is to simulate the operation of solar array/battery power systems under specific loads and environmental conditions. This document establishes the software requirements for the DSPA computer program, discusses the processing that occurs within the program, and defines the necessary interfaces for operation.
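    The kind of load and environment simulation performed by the Performance Analysis portion can be pictured, very loosely, with the toy Python loop below; the panel, load, and battery figures are invented placeholders and the day/night model is deliberately crude, so this is a sketch of the idea rather than the DSPA algorithm.

      # Toy performance-analysis loop in the spirit of the DSPA program (not its
      # actual algorithm): track battery state of charge under a fixed load.
      HOURS = 24 * 7                 # one week, hourly steps
      PANEL_W = 40.0                 # hypothetical array output at full sun (watts)
      LOAD_W = 12.0                  # hypothetical constant lamp load (watts)
      CAPACITY_WH = 1200.0           # hypothetical battery capacity (watt-hours)

      soc = CAPACITY_WH              # start fully charged
      for hour in range(HOURS):
          sun = 1.0 if 6 <= hour % 24 < 18 else 0.0     # crude day/night model
          soc += sun * PANEL_W - LOAD_W                 # net energy for this hour
          soc = min(max(soc, 0.0), CAPACITY_WH)         # clamp to physical limits
          if soc == 0.0:
              print(f"hour {hour}: load not supported, battery exhausted")
      print(f"end-of-week state of charge: {soc:.0f} Wh")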

  8. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method

    PubMed Central

    2011-01-01

    Background The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. Results A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. Conclusions The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim. PMID:21586134

  9. A formal MIM specification and tools for the common exchange of MIM diagrams: an XML-Based format, an API, and a validation method.

    PubMed

    Luna, Augustin; Karac, Evrim I; Sunshine, Margot; Chang, Lucas; Nussinov, Ruth; Aladjem, Mirit I; Kohn, Kurt W

    2011-05-17

    The Molecular Interaction Map (MIM) notation offers a standard set of symbols and rules on their usage for the depiction of cellular signaling network diagrams. Such diagrams are essential for disseminating biological information in a concise manner. A lack of software tools for the notation restricts wider usage of the notation. Development of software is facilitated by a more detailed specification regarding software requirements than has previously existed for the MIM notation. A formal implementation of the MIM notation was developed based on a core set of previously defined glyphs. This implementation provides a detailed specification of the properties of the elements of the MIM notation. Building upon this specification, a machine-readable format is provided as a standardized mechanism for the storage and exchange of MIM diagrams. This new format is accompanied by a Java-based application programming interface to help software developers to integrate MIM support into software projects. A validation mechanism is also provided to determine whether MIM datasets are in accordance with syntax rules provided by the new specification. The work presented here provides key foundational components to promote software development for the MIM notation. These components will speed up the development of interoperable tools supporting the MIM notation and will aid in the translation of data stored in MIM diagrams to other standardized formats. Several projects utilizing this implementation of the notation are outlined herein. The MIM specification is available as an additional file to this publication. Source code, libraries, documentation, and examples are available at http://discover.nci.nih.gov/mim.
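    A syntax-validation step of the sort described can be sketched as follows; this Python fragment is only illustrative, and the element and attribute names are hypothetical rather than the published MIM format or its Java API.

      # Sketch of a syntax check over an XML interchange file; the element and
      # attribute names here are hypothetical, not the published MIM schema.
      import xml.etree.ElementTree as ET

      def validate(path: str) -> list[str]:
          errors = []
          root = ET.parse(path).getroot()
          ids = {g.get("id") for g in root.iter("glyph")}
          for arc in root.iter("arc"):
              for end in ("source", "target"):
                  if arc.get(end) not in ids:
                      errors.append(f"arc {arc.get('id')}: unknown {end} {arc.get(end)!r}")
          return errors

      if __name__ == "__main__":
          import sys
          for problem in validate(sys.argv[1]):
              print(problem)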

  10. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal behavioral specifications of components. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communications; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution; and (iii) SEP and health care DSSAs can be integrated into computer aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems from reuse libraries.

  11. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Muery, Kim; Foshee, Mark; Marsh, Angela

    2006-01-01

    International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers' simplified constraints into detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate the manual ISS constraint translation process.
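    The translation step being automated can be pictured as a lookup from simplified constraint names to detailed system constraints. The Python sketch below is a hypothetical illustration; the constraint names and expansions are invented and do not reflect actual URC/URI or CPS data.

      # Hypothetical simplified-to-detailed constraint translation; all names invented.
      SIMPLIFIED_TO_DETAILED = {
          "moderate power": ["power channel load <= 250 W", "no concurrent heater cycling"],
          "downlink required": ["communications coverage window required", "data route reserved"],
      }

      def translate(simplified: list[str]) -> list[str]:
          detailed = []
          for name in simplified:
              # Unknown simplified constraints are flagged for manual review.
              detailed.extend(SIMPLIFIED_TO_DETAILED.get(name, [f"UNMAPPED: {name}"]))
          return detailed

      print(translate(["moderate power", "downlink required"]))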

  12. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  13. Detection of faults and software reliability analysis

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    Multiversion or N-version programming was proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. Specific topics addressed are: failure probabilities in N-version systems, consistent comparison in N-version systems, descriptions of the faults found in the Knight and Leveson experiment, analytic models of comparison testing, characteristics of the input regions that trigger faults, fault tolerance through data diversity, and the relationship between failures caused by automatically seeded faults.
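    The N-version idea itself is easy to illustrate: several independently written versions compute the same result and a voter accepts the majority answer. The Python sketch below shows the general voting mechanism (not the Knight and Leveson experiment code), with one deliberately faulty version.

      # Minimal sketch of N-version majority voting; the "versions" are trivial
      # stand-ins, one of them deliberately faulty.
      from collections import Counter

      def version_a(x: float) -> int: return round(x)
      def version_b(x: float) -> int: return int(x + 0.5)
      def version_c(x: float) -> int: return int(x)          # faulty: truncates

      def vote(x: float) -> int:
          results = Counter(v(x) for v in (version_a, version_b, version_c))
          value, count = results.most_common(1)[0]
          if count < 2:
              raise RuntimeError("no majority: all versions disagree")
          return value

      print(vote(2.7))   # versions return 3, 3, 2 -> the voted result is 3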

  14. Results of a Formal Methods Demonstration Project

    NASA Technical Reports Server (NTRS)

    Kelly, J.; Covington, R.; Hamilton, D.

    1994-01-01

    This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.

  15. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes the high level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modification to produce ACS software for future projects.

  16. Responding to an RFP: A Vendor's Viewpoint.

    ERIC Educational Resources Information Center

    Kington, Robert A.

    1987-01-01

    Outlines factors used by online vendors to decide whether to bid on RFPs (requests for proposals) for library automation systems, including specifications for software, hardware or performance requirements not met by the vendor; specifications based on competitors' systems; the size and complexity of the request itself; and vendors' time…

  17. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time, undertaken merely to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory in terms of the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  18. Software Requirements

    DTIC Science & Technology

    1990-01-01

    Real-Time Systems Specifications 11 Convenient Auto Rental System Marea82 12 Systems Architecture Marca, D. A., and C. L. McGowan. "Static and...Requirements Marca88 Structured Analysis. Orr's work is worthy of study Marca, D. A., and C. L. McGowan. SADT: Struc- by the instructor, since it enjoys

  19. SMS crew station (C and D panels and forward structures). CEI part 1: Detail specification, type 1 data

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Established are the requirements for performance, design, test and qualification of one type of equipment identified as SMS C&D panels and forward structures. This CEI is used to provide all hardware and wiring necessary for the C&D panels to be properly interfaced with the computer complex/signal conversion equipment (SCE), crew station, and software requirements as defined in other CEI specifications.

  20. Are the expected benefits of requirements reuse hampered by distance? An experiment.

    PubMed

    Carrillo de Gea, Juan M; Nicolás, Joaquín; Fernández-Alemán, José L; Toval, Ambrosio; Idri, Ali

    2016-01-01

    Software development processes are often performed by distributed teams which may be separated by great distances. Global software development (GSD) has undergone a significant growth in recent years. The challenges concerning GSD are especially relevant to requirements engineering (RE). Stakeholders need to share a common ground, but there are many difficulties as regards the potentially variable interpretation of the requirements in different contexts. We posit that the application of requirements reuse techniques could alleviate this problem through the diminution of the number of requirements open to misinterpretation. This paper presents a reuse-based approach with which to address RE in GSD, with special emphasis on specification techniques, namely parameterised requirements and traceability relationships. An experiment was carried out with the participation of 29 university students enrolled on a Computer Science and Engineering course. Two main scenarios that represented co-localisation and distribution in software development were portrayed by participants from Spain and Morocco. The global teams achieved a slightly better performance than the co-located teams as regards effectiveness, which could be a result of the worse productivity of the global teams in comparison to the co-located teams. Subjective perceptions were generally more positive in the case of the distributed teams (difficulty, speed and understanding), with the exception of quality. A theoretical model has been proposed as an evaluation framework with which to analyse, from the point of view of the factor of distance, the effect of requirements specification techniques on a set of performance and perception-based variables. The experiment utilised a new internationalisation requirements catalogue. None of the differences found between co-located and distributed teams were significant according to the outcome of our statistical tests. The well-known benefits of requirements reuse in traditional co-located projects could, therefore, also be expected in GSD projects.
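    A parameterised requirement, as used in a reuse catalogue of this kind, can be pictured as a template whose parameters are bound per project. The Python sketch below is an invented example; the requirement text and parameter names are not taken from the paper's catalogue.

      # Invented illustration of expanding a parameterised, reusable requirement.
      from string import Template

      REQ_I18N_01 = Template("The $system shall display all user messages in $language.")

      for language in ("Spanish", "French"):
          print(REQ_I18N_01.substitute(system="patient portal", language=language))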

  1. Generic Safety Requirements for Developing Safe Insulin Pump Software

    PubMed Central

    Zhang, Yi; Jetley, Raoul; Jones, Paul L; Ray, Arnab

    2011-01-01

    Background The authors previously introduced a highly abstract generic insulin infusion pump (GIIP) model that identified common features and hazards shared by most insulin pumps on the market. The aim of this article is to extend our previous work on the GIIP model by articulating safety requirements that address the identified GIIP hazards. These safety requirements can be validated by manufacturers, and may ultimately serve as a safety reference for insulin pump software. Together, these two publications can serve as a basis for discussing insulin pump safety in the diabetes community. Methods In our previous work, we established a generic insulin pump architecture that abstracts functions common to many insulin pumps currently on the market and near-future pump designs. We then carried out a preliminary hazard analysis based on this architecture that included consultations with many domain experts. Further consultation with domain experts resulted in the safety requirements used in the modeling work presented in this article. Results Generic safety requirements for the GIIP model are presented, as appropriate, in parameterized format to accommodate clinical practices or specific insulin pump criteria important to safe device performance. Conclusions We believe that there is considerable value in having the diabetes, academic, and manufacturing communities consider and discuss these generic safety requirements. We hope that the communities will extend and revise them, make them more representative and comprehensive, experiment with them, and use them as a means for assessing the safety of insulin pump software designs. One potential use of these requirements is to integrate them into model-based engineering (MBE) software development methods. We believe, based on our experiences, that implementing safety requirements using MBE methods holds promise in reducing design/implementation flaws in insulin pump development and evolutionary processes, therefore improving overall safety of insulin pump software. PMID:22226258
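    A parameterised safety requirement of this kind can be made executable as a simple limit check. The Python sketch below is illustrative only; the limit values are placeholders to be set per pump design and clinical practice, not values taken from the GIIP model.

      # Sketch of a parameterised safety requirement in executable form; the limits
      # are placeholders, to be configured per pump and per clinical practice.
      MAX_BOLUS_U = 10.0          # placeholder parameter, units of insulin
      MAX_BASAL_U_PER_H = 3.0     # placeholder parameter

      def command_is_safe(bolus_u: float, basal_u_per_h: float) -> bool:
          """Reject any delivery command that exceeds the configured limits."""
          return 0.0 <= bolus_u <= MAX_BOLUS_U and 0.0 <= basal_u_per_h <= MAX_BASAL_U_PER_H

      assert command_is_safe(2.5, 1.0)
      assert not command_is_safe(25.0, 1.0)   # an over-delivery request is refused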

  2. Terrain following of arbitrary surfaces using a high intensity LED proximity sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, J.E.

    1992-01-01

    Many robotic operations, e.g., mapping, scanning, feature following, etc., require accurate surface following of arbitrary targets. This paper presents a versatile surface following and mapping system designed to promote hardware, software and application independence, modular development, and upward expandability. These goals are met by: a full, a priori specification of the hardware and software interfaces; a modular system architecture; and a hierarchical surface-data analysis method, permitting application specific tuning at each conceptual level of topological abstraction. This surface following system was fully designed independently of any specific robotic host, then successfully integrated with and demonstrated on a completely a priori unknown, real-time robotic system. 7 refs.

  3. End effector monitoring system: An illustrated case of operational prototyping

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Land, Sherry A.; Thronesbery, Carroll

    1994-01-01

    Operational prototyping is introduced to help developers apply software innovations to real-world problems, to help users articulate requirements, and to help develop more usable software. Operational prototyping has been applied to an expert system development project. The expert system supports fault detection and management during grappling operations of the Space Shuttle payload bay arm. The dynamic exchanges among operational prototyping team members are illustrated in a specific prototyping session. We discuss the requirements for operational prototyping technology, types of projects for which operational prototyping is best suited and when it should be applied to those projects.

  4. NASA/NBS (National Aeronautics and Space Administration/National Bureau of Standards) standard reference model for telerobot control system architecture (NASREM)

    NASA Technical Reports Server (NTRS)

    Albus, James S.; Mccain, Harry G.; Lumia, Ronald

    1989-01-01

    The document describes the NASA Standard Reference Model (NASREM) Architecture for the Space Station Telerobot Control System. It defines the functional requirements and high level specifications of the control system for the NASA Space Station Flight Telerobot Servicer, serving as the document for the functional specification, and as a guideline for the development of the control system architecture, of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.

  5. NASA TSRV essential flight control system requirements via object oriented analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Keith S.; Hoza, Bradley J.

    1992-01-01

    The objective was to analyze the baseline flight control system of the Transport Systems Research Vehicle (TSRV) and to develop a system specification that offers high visibility of the essential system requirements in order to facilitate the future development of alternate, more advanced software architectures. The flight control system is defined to be the baseline software for the TSRV research flight deck, including all navigation, guidance, and control functions, and the primary pilot displays. The Object Oriented Analysis (OOA) methodology that was developed is used to produce a system requirements definition. The scope of the requirements definition contained herein is limited to a portion of the Flight Management/Flight Control computer functionality. The development of a partial system requirements definition is documented, including a discussion of the tasks required to increase the scope of the requirements definition and recommendations for follow-on research.

  6. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervillé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm because they create a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to keep pace with roadmap speed and requirements. For example: manage defects and CD in a one-step algorithm, customize algorithms and output features for each R&D team environment, and provide software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials like Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements with high-throughput performance in order to be compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms and the capability of the software platform to follow new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  7. Guidance and Control Software Project Data - Volume 2: Development Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  8. Automated personnel data base system specifications, Task V. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartley, H.J.; Bocast, A.K.; Deppner, F.O.

    1978-11-01

    The full title of this study is 'Development of Qualification Requirements, Training Programs, Career Plans, and Methodologies for Effective Management and Training of Inspection and Enforcement Personnel.' Task V required the development of an automated personnel data base system for NRC/IE. This system is identified as the NRC/IE Personnel, Assignment, Qualifications, and Training System (PAQTS). This Task V report provides the documentation for PAQTS including the Functional Requirements Document (FRD), the Data Requirements Document (DRD), the Hardware and Software Capabilities Assessment, and the Detailed Implementation Schedule. Specific recommendations to facilitate implementation of PAQTS are also included.

  9. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  10. Software Innovation in a Mission Critical Environment

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  11. GCS plan for software aspects of certification

    NASA Technical Reports Server (NTRS)

    Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward

    1990-01-01

    As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics document RTCA/DO-178A, "Software Considerations in Airborne Systems and Equipment Certification," were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the plan for software aspects of certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. The report also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.

  12. Space Generic Open Avionics Architecture (SGOAA) standard specification

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1993-01-01

    The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of a specific avionics hardware/software system. This standard defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  13. An introduction to requirements capture using PVS: Specification of a simple autopilot

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1996-01-01

    This paper presents an introduction to capturing software requirements in the PVS formal language. The object of study is a simplified digital autopilot that was motivated in part by the mode control panel of NASA Langley's Boeing 737 research aircraft. The paper first presents the requirements for this autopilot in English and then steps the reader through a translation of these requirements into formal mathematics. Along the way deficiencies in the English specification are noted and repaired. Once completed, the formal PVS requirement is analyzed using the PVS theorem prover and shown to maintain an invariant over its state space.
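    The kind of state-space invariant analyzed in the paper can be illustrated with a toy executable model. The Python sketch below is not the PVS specification; the mode names and the "at most one vertical mode engaged" invariant are hypothetical stand-ins for the autopilot's actual requirements.

      # Toy mode-control-panel sketch with a hypothetical invariant: at most one
      # vertical mode may be engaged at a time.
      VERTICAL_MODES = {"ALT_HOLD", "VS", "GS_TRACK"}

      def invariant(engaged: set[str]) -> bool:
          return len(engaged & VERTICAL_MODES) <= 1

      def engage(engaged: set[str], mode: str) -> set[str]:
          # Engaging a vertical mode first disengages any other vertical mode,
          # which is what keeps the invariant true after every transition.
          new = (engaged - VERTICAL_MODES) if mode in VERTICAL_MODES else set(engaged)
          new.add(mode)
          assert invariant(new)
          return new

      state: set[str] = set()
      for m in ("ALT_HOLD", "HDG", "VS"):
          state = engage(state, m)
      print(state)    # contains HDG and VS: ALT_HOLD was replaced by VS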

  14. Space Generic Open Avionics Architecture (SGOAA) reference model technical guide

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1993-01-01

    This report presents a full description of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing architecture, and a six class model of interfaces in a hardware/software system. The purpose of the SGOAA is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of specific avionics hardware/software systems. The SGOAA defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  15. Developing high-quality educational software.

    PubMed

    Johnson, Lynn A; Schleyer, Titus K L

    2003-11-01

    The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.

  16. Development of an automated asbestos counting software based on fluorescence microscopy.

    PubMed

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.

  17. Requirement analysis for an electronic laboratory notebook for sustainable data management in biomedical research.

    PubMed

    Menzel, Julia; Weil, Philipp; Bittihn, Philip; Hornung, Daniel; Mathieu, Nadine; Demiroglu, Sara Y

    2013-01-01

    Sustainable data management in biomedical research requires documentation of metadata for all experiments and results. Scientists usually document research data and metadata in paper laboratory notebooks. An electronic laboratory notebook (ELN) can keep metadata linked to research data, resulting in a better understanding of the research results and thus a scientific benefit [1]. Besides other challenges [2], the biggest hurdles to introducing an ELN seem to be usability, file formats, and data entry mechanisms [3], and the fact that many ELNs are tailored to specific research fields such as biology, chemistry, or physics [4]. We aimed to identify requirements for the introduction of ELN software in a biomedical collaborative research center [5] consisting of different scientific fields and to find software fulfilling most of these requirements.

  18. Advanced software integration: The case for ITV facilities

    NASA Technical Reports Server (NTRS)

    Garman, John R.

    1990-01-01

    The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers-of-abstraction concepts popular in networking. Typified in the NASA Space Station Freedom will be the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.

  19. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    PubMed

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating-system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
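    As a point of reference for readers, a basic heatmap can be produced in a few lines; the Python/Matplotlib sketch below is a generic illustration with synthetic random data, not HeatmapGenerator's R and C++ pipeline.

      # Generic heatmap illustration, assuming NumPy and Matplotlib are installed.
      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(0)
      expression = rng.normal(size=(20, 6))          # 20 genes x 6 samples (synthetic)

      fig, ax = plt.subplots(figsize=(4, 6))
      im = ax.imshow(expression, aspect="auto", cmap="viridis")
      ax.set_xlabel("sample")
      ax.set_ylabel("gene")
      fig.colorbar(im, ax=ax, label="expression (z-score)")
      fig.savefig("heatmap.png", dpi=150)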

  20. Engineering Software Suite Validates System Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDASTAR-created models. Initial commercialization for EDASTAR included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  1. OntoStudyEdit: a new approach for ontology-based representation and management of metadata in clinical and epidemiological research.

    PubMed

    Uciteli, Alexandr; Herre, Heinrich

    2015-01-01

    The specification of metadata in clinical and epidemiological study projects absorbs significant expense. The validity and quality of the collected data depend heavily on the precise and semantically correct representation of their metadata. In various research organizations, which are planning and coordinating studies, the required metadata are specified differently, depending on many conditions, e.g., on the study management software used. The latter does not always meet the needs of a particular research organization, e.g., with respect to the relevant metadata attributes and structuring possibilities. The objective of the research set forth in this paper is the development of a new approach for ontology-based representation and management of metadata. The basic features of this approach are demonstrated by the software tool OntoStudyEdit (OSE). The OSE is designed and developed according to the three-ontology method. This method for developing software is based on the interactions of three different kinds of ontologies: a task ontology, a domain ontology and a top-level ontology. The OSE can be easily adapted to different requirements, and it supports an ontologically founded representation and efficient management of metadata. The metadata specifications can be imported from various sources; they can be edited with the OSE, and they can be exported in several formats, which are used, e.g., by different study management software. Advantages of this approach are the adaptability of the OSE by integrating suitable domain ontologies, the ontological specification of mappings between the import/export formats and the domain ontology, the specification of the study metadata in a uniform manner and its reuse in different research projects, and an intuitive data entry for non-expert users.

  2. Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.

    ERIC Educational Resources Information Center

    Jackson, Robert B.; And Others

    1995-01-01

    Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…

  3. Using the SCR Specification Technique in a High School Programming Course.

    ERIC Educational Resources Information Center

    Rosen, Edward; McKim, James C., Jr.

    1992-01-01

    Presents the underlying ideas of the Software Cost Reduction (SCR) approach to requirements specifications. Results of applying this approach to the teaching of programming to high school students indicate that students perform better in writing programs. An appendix provides two examples of how the method is applied to problem solving. (MDH)

  4. Space Station Freedom biomedical monitoring and countermeasures: Biomedical facility hardware catalog

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This hardware catalog covers the hardware proposed under the Biomedical Monitoring and Countermeasures Development Program supported by the Johnson Space Center. The hardware items are listed separately by item, in alphabetical order. Each hardware item specification consists of four pages. The first page provides background information with an illustration, a definition, and a history/design status. The second page identifies the general specifications, performance, rack interface requirements, problems, issues, concerns, physical description, and functional description. The level of hardware design reliability is also identified under the maintainability and reliability category. The third page specifies the mechanical design guidelines and assumptions. Described are the material types and weights, modules, and construction methods. Also described are an estimate of the percentage of construction that utilizes a particular method and the percentage of new mechanical design required. The fourth page analyzes the electronics, the scope of the design effort, and the software requirements. Electronics are described by percentages of component types and new design. The design effort, as well as the software requirements, are identified and categorized.

  5. Development and implementation of software systems for imaging spectroscopy

    USGS Publications Warehouse

    Boardman, J.W.; Clark, R.N.; Mazer, A.S.; Biehl, L.L.; Kruse, F.A.; Torson, J.; Staenz, K.

    2006-01-01

    Specialized software systems have played a crucial role throughout the twenty-five year course of the development of the new technology of imaging spectroscopy, or hyperspectral remote sensing. By their very nature, hyperspectral data place unique and demanding requirements on the computer software used to visualize, analyze, process and interpret them. Often described as a marriage of the two technologies of reflectance spectroscopy and airborne/spaceborne remote sensing, imaging spectroscopy, in fact, produces data sets with unique qualities, unlike previous remote sensing or spectrometer data. Because of these unique spatial and spectral properties hyperspectral data are not readily processed or exploited with legacy software systems inherited from either of the two parent fields of study. This paper provides brief reviews of seven important software systems developed specifically for imaging spectroscopy.

  6. Software design by reusing architectures

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

    Abstraction fosters reuse by providing a class of artifacts that can be instantiated or customized to produce a set of artifacts meeting different specific requirements. It is proposed that significant leverage can be obtained by abstracting software system designs and the design process. The result of such an abstraction is a generic architecture and a set of knowledge-based customization tools that can be used to instantiate the generic architecture. An approach for designing software systems based on the above idea is described. The approach is illustrated through an implemented example, and the advantages and limitations of the approach are discussed.

  7. Environmental databases and other computerized information tools

    NASA Technical Reports Server (NTRS)

    Clark-Ingram, Marceia

    1995-01-01

    Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.

  8. Final report bridge design system analysis and modernization.

    DOT National Transportation Integrated Search

    2016-09-27

    The Bridge Design System (BDS) is an in-house software program developed by the Michigan Department of Transportation's (MDOT) Bridge Design Unit. The BDS designs bridges according to the required specifications and outputs the corresponding design ...

  9. ENCOMPASS: A SAGA based environment for the composition of programs and specifications, appendix A

    NASA Technical Reports Server (NTRS)

    Terwilliger, Robert B.; Campbell, Roy H.

    1985-01-01

    ENCOMPASS is an example integrated software engineering environment being constructed by the SAGA project. ENCOMPASS supports the specification, design, construction and maintenance of efficient, validated, and verified programs in a modular programming language. The life cycle paradigm, schema of software configurations, and hierarchical library structure used by ENCOMPASS are presented. In ENCOMPASS, the software life cycle is viewed as a sequence of developments, each of which reuses components from the previous ones. Each development proceeds through the phases of planning, requirements definition, validation, design, implementation, and system integration. The components in a software system are modeled as entities which have relationships between them. An entity may have different versions, and different views of the same project are allowed. The simple entities supported by ENCOMPASS may be combined into modules which may be collected into projects. ENCOMPASS supports multiple programmers and projects using a hierarchical library system containing a workspace for each programmer, a project library for each project, and a global library common to all projects.

  10. User-friendly tools on handheld devices for observer performance study

    NASA Astrophysics Data System (ADS)

    Matsumoto, Takuya; Hara, Takeshi; Shiraishi, Junji; Fukuoka, Daisuke; Abe, Hiroyuki; Matsusako, Masaki; Yamada, Akira; Zhou, Xiangrong; Fujita, Hiroshi

    2012-02-01

    ROC studies require complex procedures to select cases from many data samples and to set confidence levels for each selected case in order to generate ROC curves. In some observer performance studies, researchers have to develop software with a specific graphical user interface (GUI) to obtain confidence levels from readers. Because ROC studies can be designed for various clinical situations, it is difficult to prepare software for every such study. In this work, we have developed software for recording confidence levels during observer studies on personal handheld devices such as the iPhone, iPod touch, and iPad. To confirm the functions of our software, three radiologists performed observer studies to detect lung nodules using a public database of chest radiographs published by the Japan Society of Radiological Technology. The output in text format conformed to the format of the well-known ROC kit from the University of Chicago. The time required to read each case was recorded precisely.
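
    As a rough illustration of the kind of per-case logging such a tool performs (reader, case, confidence rating, reading time), a minimal sketch is shown below; the CSV layout and all names used here are placeholders, not the actual University of Chicago ROC kit format or the authors' implementation.

    ```python
    # Hypothetical sketch of an observer-study logger: one row per reading,
    # recording reader id, case id, confidence rating, and elapsed time.
    import csv
    import time

    class ObserverStudyLogger:
        def __init__(self, path):
            self.path = path
            self._start = None

        def start_case(self):
            """Call when a case is displayed to the reader."""
            self._start = time.monotonic()

        def record(self, reader_id, case_id, confidence):
            """Append one reading (rating plus reading time in seconds)."""
            elapsed = time.monotonic() - self._start
            with open(self.path, "a", newline="") as f:
                csv.writer(f).writerow([reader_id, case_id, confidence,
                                        f"{elapsed:.2f}"])

    log = ObserverStudyLogger("readings.csv")
    log.start_case()
    log.record(reader_id="R1", case_id="case_017", confidence=4)
    ```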

  11. AdaNET Dynamic Software Inventory (DSI) prototype component acquisition plan

    NASA Technical Reports Server (NTRS)

    Hanley, Lionel

    1989-01-01

    A component acquisition plan contains the information needed to evaluate, select, and acquire the software and hardware components necessary for successful completion of the AdaNET Dynamic Software Inventory (DSI) Management System Prototype. This plan will evolve and be applicable to all phases of the DSI prototype development. Resources, budgets, schedules, and organizations related to component acquisition activities are provided. The purpose and description of a software or hardware component to be acquired would normally be presented here; since this is a plan for acquisition of all components, that section is not applicable. The procurement activities and events conducted by the acquirer are described, identifying who is responsible, where each activity will be performed, and when it will occur for each planned procurement. Acquisition requirements describe the specific requirements and standards to be followed during component acquisition. The activities which will take place during component acquisition are described. A list of abbreviations and acronyms and a glossary are included.

  12. Specification for Visual Requirements of Work-Centered Software Systems

    DTIC Science & Technology

    2006-10-01

    ... work-aiding systems. Based on the design concept for a work-centered support system (WCSS), these software systems support user tasks and goals through both direct and indirect aiding methods within the interface client. In order to ensure the coherent development and delivery of work-centered ...

  13. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments, in so-called cyber-physical systems. When such systems are safety- or mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  14. Development of the ITER magnetic diagnostic set and specification.

    PubMed

    Vayakis, G; Arshad, S; Delhom, D; Encheva, A; Giacomin, T; Jones, L; Patel, K M; Pérez-Lasala, M; Portales, M; Prieto, D; Sartori, F; Simrock, S; Snipes, J A; Udintsev, V S; Watts, C; Winter, A; Zabeo, L

    2012-10-01

    ITER magnetic diagnostics are now in their detailed design and R&D phase. They have passed their conceptual design reviews and a working diagnostic specification has been prepared aimed at the ITER project requirements. This paper highlights specific design progress, in particular, for the in-vessel coils, steady state sensors, saddle loops and divertor sensors. Key changes in the measurement specifications, and a working concept of software and electronics are also outlined.

  15. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  16. Software System Safety and the NASA Aeronautics Blueprint

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael; Hayhurst, Kelly J.

    2002-01-01

    NASA's Aeronautics Blueprint lays out a research agenda for the Agency's aeronautics program. The word software appears only four times in this Blueprint, but the critical importance of safe and correct software to the fulfillment of the proposed research is evident on almost every page. Most of the technology solutions proposed to address challenges in aviation are software-dependent technologies. Of the fifty-two specific technology solutions described in the Blueprint, forty-one depend, at least in part, on software for success. For thirty-five of these forty-one, software is critical not only to success but also to human safety. That is, implementing the technology solutions will require using software in such a way that it may, if not specified, designed, and implemented properly, lead to fatal accidents. These results have at least two implications for the research based on the Blueprint: (1) knowledge about the current state-of-the-art and state-of-the-practice in software engineering and software system safety is essential, and (2) research into current unsolved problems in these software disciplines is also essential.

  17. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  18. DNA Commission of the International Society for Forensic Genetics: Recommendations on the validation of software programs performing biostatistical calculations for forensic genetics applications.

    PubMed

    Coble, M D; Buckleton, J; Butler, J M; Egeland, T; Fimmers, R; Gill, P; Gusmão, L; Guttman, B; Krawczak, M; Morling, N; Parson, W; Pinto, N; Schneider, P M; Sherry, S T; Willuweit, S; Prinz, M

    2016-11-01

    The use of biostatistical software programs to assist in data interpretation and calculate likelihood ratios is essential to forensic geneticists and part of the daily casework flow for both kinship and DNA identification laboratories. Previous recommendations issued by the DNA Commission of the International Society for Forensic Genetics (ISFG) covered the application of biostatistical evaluations for STR typing results in identification and kinship cases, and this is now being expanded to provide best practices regarding validation and verification of the software required for these calculations. With larger multiplexes, more complex mixtures, and increasing requests for extended family testing, laboratories are relying more than ever on specific software solutions, and sufficient validation, training, and extensive documentation are of utmost importance. Here, we present recommendations for the minimum requirements to validate biostatistical software to be used in forensic genetics. We distinguish between developmental validation and the responsibilities of the software developer or provider, and the internal validation studies to be performed by the end user. Recommendations for the software provider address, for example, the documentation of the underlying models used by the software, validation data expectations, version control, implementation and training support, as well as continuity and user notifications. For the internal validations, the recommendations include: creating a validation plan, requirements for the range of samples to be tested, Standard Operating Procedure development, and internal laboratory training and education. To ensure that all laboratories have access to a wide range of samples for validation and training purposes, the ISFG DNA Commission encourages collaborative studies and public repositories of STR typing results. Published by Elsevier Ireland Ltd.

  19. 40 CFR 1033.625 - Special certification provisions for non-locomotive-specific engines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and engine family identifier for the engines. (ii) A brief engineering analysis describing how the... engine software. Note that this allowance to separately submit some of the information required by § 1033...

  20. 40 CFR 1033.625 - Special certification provisions for non-locomotive-specific engines.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... and engine family identifier for the engines. (ii) A brief engineering analysis describing how the... engine software. Note that this allowance to separately submit some of the information required by § 1033...

  1. 40 CFR 1033.625 - Special certification provisions for non-locomotive-specific engines.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... name of the engine manufacturer and engine family identifier for the engines. (ii) A brief engineering... proprietary engine software. Note that this allowance to separately submit some of the information required by...

  2. 40 CFR 1033.625 - Special certification provisions for non-locomotive-specific engines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... name of the engine manufacturer and engine family identifier for the engines. (ii) A brief engineering... proprietary engine software. Note that this allowance to separately submit some of the information required by...

  3. 40 CFR 1033.625 - Special certification provisions for non-locomotive-specific engines.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... name of the engine manufacturer and engine family identifier for the engines. (ii) A brief engineering... proprietary engine software. Note that this allowance to separately submit some of the information required by...

  4. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semiconductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  5. Generating target system specifications from a domain model using CLIPS

    NASA Technical Reports Server (NTRS)

    Sugumaran, Vijayan; Gomaa, Hassan; Kerschberg, Larry

    1991-01-01

    The quest for reuse in software engineering is still being pursued and researchers are actively investigating the domain modeling approach to software construction. There are several domain modeling efforts reported in the literature and they all agree that the components that are generated from domain modeling are more conducive to reuse. Once a domain model is created, several target systems can be generated by tailoring the domain model or by evolving the domain model and then tailoring it according to the specified requirements. This paper presents the Evolutionary Domain Life Cycle (EDLC) paradigm in which a domain model is created using multiple views, namely, aggregation hierarchy, generalization/specialization hierarchies, object communication diagrams and state transition diagrams. The architecture of the Knowledge Based Requirements Elicitation Tool (KBRET) which is used to generate target system specifications is also presented. The preliminary version of KBRET is implemented in the C Language Integrated Production System (CLIPS).

  6. Research on performance requirements of turbofan engine used on carrier-based UAV

    NASA Astrophysics Data System (ADS)

    Zhao, Shufan; Li, Benwei; Zhang, Wenlong; Wu, Heng; Feng, Tang

    2017-05-01

    According to the mission requirements of the carrier-based unmanned aerial vehicle (UAV), a level-flight model was established to calculate the thrust requirements at altitudes from 9 km to 13 km. A flight-profile estimation method was then used to calculate the weight of the UAV in each mission stage and derive the specific fuel consumption requirement for the standby stage. The turbofan engine of a carrier-based UAV should meet both the thrust and the specific fuel consumption requirements. Finally, the GSP software was used to verify these results by simulating a small high-bypass turbofan engine. The conclusions are useful for turbofan engine selection for carrier-based UAVs.
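
    For readers unfamiliar with the underlying relation, the level-flight thrust requirement follows from lift equaling weight and thrust equaling drag; the sketch below illustrates that calculation with a standard-atmosphere density model and entirely hypothetical vehicle parameters (it is not the model or the data used in the paper).

    ```python
    # Sketch of a level-flight thrust-requirement estimate. In steady level
    # flight lift equals weight and thrust equals drag, so with a parabolic
    # drag polar:  C_L = W / (q S),  T = q S (C_D0 + k C_L^2),  q = 0.5 rho V^2.
    # All vehicle numbers below are hypothetical placeholders.
    import math

    def isa_density(alt_m: float) -> float:
        """International Standard Atmosphere density, valid up to ~20 km."""
        T0, p0, L, R, g = 288.15, 101325.0, 0.0065, 287.05, 9.80665
        if alt_m <= 11000.0:
            T = T0 - L * alt_m
            p = p0 * (T / T0) ** (g / (R * L))
        else:                              # isothermal layer above 11 km
            T = T0 - L * 11000.0
            p11 = p0 * (T / T0) ** (g / (R * L))
            p = p11 * math.exp(-g * (alt_m - 11000.0) / (R * T))
        return p / (R * T)

    def thrust_required(weight_n, tas_ms, wing_area_m2, alt_m, cd0, k):
        q = 0.5 * isa_density(alt_m) * tas_ms ** 2     # dynamic pressure
        cl = weight_n / (q * wing_area_m2)             # lift coefficient (L = W)
        return q * wing_area_m2 * (cd0 + k * cl ** 2)  # thrust = drag

    for alt_km in (9, 10, 11, 12, 13):
        t_req = thrust_required(weight_n=40000.0, tas_ms=150.0, wing_area_m2=25.0,
                                alt_m=alt_km * 1000.0, cd0=0.02, k=0.05)
        print(f"{alt_km} km: thrust required ~ {t_req / 1000.0:.1f} kN")
    ```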

  7. Technical Specifications for Hardware and Software, and Maintenance in Support of Computer Literacy Program. Volume II.

    ERIC Educational Resources Information Center

    District of Columbia Public Schools, Washington, DC.

    Designed for use by vendors, this guide provides an overview of the objectives for the 5-year computer literacy program to be implemented in the District of Columbia Public Schools; outlines requirements which are mandatory elements of vendors' bids unless explicitly designated "desirable"; and details specifications for computing…

  8. Point source detection in infrared astronomical surveys

    NASA Technical Reports Server (NTRS)

    Pelzmann, R. F., Jr.

    1977-01-01

    Data processing techniques useful for infrared astronomy data analysis systems are reported. This investigation is restricted to consideration of data from space-based telescope systems operating as survey instruments. In this report the theoretical background for specific point-source detection schemes is completed, and the development of specific algorithms and software for the broad range of requirements is begun.

  9. Requirements and specifications of the space telescope for scientific operations

    NASA Technical Reports Server (NTRS)

    West, D. K.

    1976-01-01

    Requirements for the scientific operations of the Space Telescope and the Science Institute are used to develop operational interfaces between user scientists and the NASA ground system. General data systems are defined for observatory scheduling, daily science planning, and science data management. Hardware, software, manpower, and space are specified for several science institute locations and support options.

  10. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion. PMID:21799545

  11. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested, almost a decade ago, that they are not suitable for safety-critical systems; we present our experiences as a case study for renewing the discussion.

  12. Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM

    NASA Astrophysics Data System (ADS)

    Kaiya, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji

    Quality requirements are scattered over a requirements specification; thus, it is hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called “spectrum analysis for quality requirements” which enabled analysts to sort a requirements specification to measure and track quality requirements in the specification. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature of the specification with respect to quality. Therefore, we can compare a specification of a system to another one with respect to quality. As a result, we can validate such a specification because we can check whether the specification has common quality features and know its specific features against specifications of existing similar systems. However, our first spectrum analysis for quality requirements required a lot of effort and knowledge of a problem domain, and it was hard to reuse such knowledge to reduce the effort. We thus introduce domain knowledge called a term-characteristic map (TCM) to reuse the knowledge for our quality spectrum analysis. Through several experiments, we evaluate our spectrum analysis, and the main findings are as follows. First, we confirmed that specifications of similar systems have similar quality spectra. Second, the results of spectrum analysis using TCM are objective, i.e., different analysts can generate almost the same spectra when they analyze the same specification.
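
    A minimal sketch of the term-counting idea behind such a quality spectrum is shown below; the term-characteristic map, the quality characteristics, and the example sentences are invented for illustration and are not taken from the paper.

    ```python
    # Toy "quality spectrum" from a term-characteristic map (TCM): each
    # domain term is mapped to the quality characteristics it indicates,
    # and a spectrum is the count of hits per characteristic.
    import re
    from collections import Counter

    TCM = {  # made-up mapping for illustration only
        "password": ["security"], "encrypt": ["security"],
        "response": ["efficiency"], "throughput": ["efficiency"],
        "backup": ["reliability"], "failover": ["reliability"],
        "help": ["usability"], "screen": ["usability"],
    }

    def quality_spectrum(spec_text: str) -> Counter:
        """Count how often each quality characteristic is touched by the spec."""
        spectrum = Counter()
        for word in re.findall(r"[a-z]+", spec_text.lower()):
            for characteristic in TCM.get(word, []):
                spectrum[characteristic] += 1
        return spectrum

    def compare(spec_a: str, spec_b: str):
        """Place two specs' spectra side by side for inspection."""
        sa, sb = quality_spectrum(spec_a), quality_spectrum(spec_b)
        return {k: (sa.get(k, 0), sb.get(k, 0)) for k in sorted(set(sa) | set(sb))}

    a = "The system shall encrypt the password and keep response time low."
    b = "A nightly backup and a failover server shall be provided."
    print(compare(a, b))
    ```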

  13. New software for statistical analysis of Cambridge Structural Database data

    PubMed Central

    Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.

    2011-01-01

    A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784

  14. Space Station: NASA's software development approach increases safety and cost risks. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station and asked how well NASA has implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) if independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) if NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) if software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet the objectives, NASA proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V-and-V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed in accordance with generally accepted government auditing standards, between April 1991 and May 1992.

  15. Failure detection and recovery in the assembly/contingency subsystem

    NASA Technical Reports Server (NTRS)

    Gantenbein, Rex E.

    1993-01-01

    The Assembly/Contingency Subsystem (ACS) is the primary communications link on board the Space Station. Any failure in a component of this system or in the external devices through which it communicates with ground-based systems will isolate the Station. The ACS software design includes a failure management capability (ACFM) that provides protocols for failure detection, isolation, and recovery (FDIR). The ACFM design requirements, as outlined in the current ACS software requirements specification document, are reviewed. The activities carried out in this review include: (1) an informal, but thorough, end-to-end failure mode and effects analysis of the proposed software architecture for the ACFM; and (2) a prototype of the ACFM software, implemented as a C program under the UNIX operating system. The purpose of this review is to evaluate the FDIR protocols specified in the ACS design, and the specifications themselves, in light of their use in implementing the ACFM. The basis of failure detection in the ACFM is the loss of signal between the ground and the Station, which (under the appropriate circumstances) will initiate recovery to restore communications. This recovery involves the reconfiguration of the ACS to either a backup set of components or to a degraded communications mode. The initiation of recovery depends largely on the criticality of the failure mode, which is defined by tables in the ACFM and can be modified to provide a measure of flexibility in recovery procedures.
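
    The loss-of-signal detection and stepwise reconfiguration described above can be illustrated with a small sketch; the threshold, configuration names, and class interface below are hypothetical and are not taken from the ACS specification or the cited C prototype.

    ```python
    # Illustrative loss-of-signal FDIR monitor: if no valid ground-link frame
    # arrives within a threshold, step to a backup or degraded configuration.
    import time

    CONFIGS = ["primary", "backup", "degraded"]   # placeholder configurations

    class FdirMonitor:
        def __init__(self, loss_threshold_s: float = 5.0):
            self.loss_threshold_s = loss_threshold_s
            self.last_signal = time.monotonic()
            self.config_index = 0                 # start on primary

        def signal_received(self):
            """Called whenever a valid ground-link frame arrives."""
            self.last_signal = time.monotonic()

        def check(self) -> str:
            """Periodic FDIR check: detect loss of signal, then reconfigure."""
            silent_for = time.monotonic() - self.last_signal
            if silent_for >= self.loss_threshold_s:
                if self.config_index < len(CONFIGS) - 1:
                    self.config_index += 1              # recover to next config
                    self.last_signal = time.monotonic() # give it time to lock
            return CONFIGS[self.config_index]
    ```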

  16. The A-7E Software Requirements Document: Three Years of Change Data.

    DTIC Science & Technology

    1982-11-08

    Washington DC: Naval Research Laboratory. 1982. Interface Specifications for the A-7E Shared Services Module NRL Memorandum Report. Forthcoming...function driver module (Clements 1981), specifications for the extended computer module (Britton et al. 1982), and specifications for the shared ... services module (Clements 1982). The projected completion date for the SCR project is September 1985. As of the end of 1981, approximately 10 man-years of

  17. Improving the Agency's Software Acquisition Capability

    NASA Technical Reports Server (NTRS)

    Hankinson, Allen

    2003-01-01

    External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods/tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).

  18. Recognition and categorization considerations for information assurance requirements development and specification

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; May, Michael; Banks, Sheila B.

    2009-04-01

    Department of Defense (DoD) Information Technology (IT) systems operate in an environment different from the commercial world; the differences arise from the types of attacks, the interdependencies between DoD software systems, and the reliance upon commercial software to provide basic capabilities. The challenge that we face is determining how to specify the information assurance requirements for a system without requiring changes to the commercial software and in light of the interdependencies between systems. As a result of the interdependencies and interconnections between systems introduced by the global information grid (GIG), an assessment of the IA requirements for a system must consider three facets of a system's IA capabilities: 1) the IA vulnerabilities of the system, 2) the ability of a system to repel IA attacks, and 3) the ability of a system to ensure that any IA attack that penetrates the system is contained within the system and does not spread. Each facet should be assessed independently and the requirements should be derived independently from the assessments. In addition to the desired IA technology capabilities of the system, a complete assessment of the system's overall IA security technology readiness level cannot be accomplished without an assessment of its capability to recover from and remediate IA vulnerabilities and compromises. To allow us to accomplish these three formidable tasks, we propose a general system architecture designed to separate the system's IA capabilities from its other capability requirements, thereby allowing the IA capabilities to be developed and assessed separately from the other system capabilities. The architecture also enables independent requirements specification, implementation, assessment, measurement, and improvement of a system's IA capabilities without requiring modification of the underlying application software.

  19. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    NASA Astrophysics Data System (ADS)

    Snider, E. L.; Petrillo, G.

    2017-10-01

    LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions, and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and the interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  20. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    NASA Technical Reports Server (NTRS)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
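
    A toy sketch of a table-driven mode sequencer of the general kind described above is shown below; the mission segments, mode names, and parameters are invented placeholders and do not reflect the Orion GN&C data schema.

    ```python
    # Illustrative data-driven sequencer: behavior comes from a table rather
    # than from hard-coded logic, so the mission plan can change without
    # touching the sequencing code. All entries are hypothetical.
    SEQUENCE_TABLE = {
        "ascent": {"next": "orbit", "gnc_mode": "ascent_guidance",
                   "params": {"throttle_limit": 1.0}},
        "orbit":  {"next": "entry", "gnc_mode": "orbit_hold",
                   "params": {"attitude_deadband_deg": 5.0}},
        "entry":  {"next": None,    "gnc_mode": "entry_guidance",
                   "params": {"bank_angle_limit_deg": 70.0}},
    }

    class ModeSequencer:
        def __init__(self, table, start_segment):
            self.table = table
            self.segment = start_segment

        def current_configuration(self):
            """Return the GN&C mode and parameters for the active segment."""
            row = self.table[self.segment]
            return row["gnc_mode"], row["params"]

        def advance(self, operator_override=None):
            """Move to the next segment, or to an operator-commanded one."""
            target = operator_override or self.table[self.segment]["next"]
            if target is not None:
                self.segment = target
            return self.segment

    seq = ModeSequencer(SEQUENCE_TABLE, "ascent")
    print(seq.current_configuration())
    print(seq.advance(), seq.current_configuration())
    ```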

  1. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
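
    For readers new to DES, the sketch below shows the essential ingredients (a time-ordered event list, a simulation clock, and a single resource) for a one-scanner queue; the arrival and scan-time figures are arbitrary, and the article itself relies on commercial DES packages rather than hand-written code like this.

    ```python
    # Minimal discrete-event simulation of a one-scanner queue, using a
    # priority queue of (time, kind, patient_id) events.
    import heapq
    import random

    def simulate(n_patients=200, mean_interarrival=12.0, mean_scan=10.0, seed=1):
        random.seed(seed)
        events = []
        t = 0.0
        for pid in range(n_patients):          # schedule all arrivals up front
            t += random.expovariate(1.0 / mean_interarrival)
            heapq.heappush(events, (t, "arrival", pid))

        scanner_free_at = 0.0
        waits, completed = [], 0
        while events:
            now, kind, pid = heapq.heappop(events)
            if kind == "arrival":
                start = max(now, scanner_free_at)      # wait if scanner is busy
                waits.append(start - now)
                scanner_free_at = start + random.expovariate(1.0 / mean_scan)
                heapq.heappush(events, (scanner_free_at, "done", pid))
            else:                                      # scan completed
                completed += 1
        return sum(waits) / len(waits), completed

    mean_wait, n_done = simulate()
    print(f"{n_done} patients scanned, mean wait ~ {mean_wait:.1f} minutes")
    ```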

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; Clemencic, M.; Dykstra, D.

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  3. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  4. Software package for modeling spin-orbit motion in storage rings

    NASA Astrophysics Data System (ADS)

    Zyuzin, D. V.

    2015-12-01

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6-10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12-10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin-orbit dynamics.

  5. SandiaMRCR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-01-05

    SandiaMCR was developed to identify pure components and their concentrations from spectral data. This software efficiently implements multivariate curve resolution-alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD). Version 3.37 also includes the PARAFAC-ALS and Tucker-1 (for trilinear analysis) algorithms. The alternating least squares methods can be used to determine the composition without, or with incomplete, prior information on the constituents and their concentrations. The software allows the specification of numerous preprocessing, initialization, data selection, and compression options for the efficient processing of large data sets. It includes numerous options, including the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. The software has been designed to provide a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
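
    The alternating-least-squares idea at the core of MCR-ALS can be sketched in a few lines; the toy factorization below (with non-negativity enforced by clipping) is only an illustration of the general technique and is not the SandiaMCR implementation.

    ```python
    # Toy MCR-ALS sketch: factor a data matrix D (mixtures x channels) into
    # concentrations C and pure-component spectra S, alternating least-squares
    # solves with negatives clipped to zero after each pass.
    import numpy as np

    def mcr_als(D, n_components, n_iter=200, seed=0):
        rng = np.random.default_rng(seed)
        C = np.abs(rng.standard_normal((D.shape[0], n_components)))
        for _ in range(n_iter):
            S = np.linalg.lstsq(C, D, rcond=None)[0]        # solve D ~ C S for S
            S = np.clip(S, 0.0, None)
            C = np.linalg.lstsq(S.T, D.T, rcond=None)[0].T  # solve D ~ C S for C
            C = np.clip(C, 0.0, None)
        return C, S

    # Tiny synthetic example: two overlapping "spectra" mixed in five samples.
    x = np.linspace(0.0, 1.0, 50)
    pure = np.vstack([np.exp(-((x - 0.3) / 0.1) ** 2),
                      np.exp(-((x - 0.7) / 0.1) ** 2)])
    conc = np.abs(np.random.default_rng(1).random((5, 2)))
    D = conc @ pure
    C_est, S_est = mcr_als(D, n_components=2)
    print("reconstruction error:", np.linalg.norm(D - C_est @ S_est))
    ```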

  6. Generic Kalman Filter Software

    NASA Technical Reports Server (NTRS)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from months to weeks. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation- specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
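
    As an illustration of the propagate-and-update cycle such a generic filter implements, a minimal linear Kalman filter sketch is shown below; the matrix names follow textbook convention and the toy example is not taken from the GKF code, which is written in ANSI C.

    ```python
    # Minimal linear Kalman filter: propagate the state/covariance through the
    # dynamics, then fold in a measurement. A 1-D constant-velocity example
    # with made-up numbers follows.
    import numpy as np

    def propagate(x, P, F, Q):
        """Propagate state estimate x and covariance P through dynamics F."""
        return F @ x, F @ P @ F.T + Q

    def update(x, P, z, H, R):
        """Incorporate measurement z with model H and noise covariance R."""
        y = z - H @ x                              # innovation
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])                     # position measured only
    Q = 0.01 * np.eye(2)
    R = np.array([[0.5]])
    x, P = np.zeros(2), np.eye(2)
    for z in [1.1, 2.0, 2.9, 4.2]:
        x, P = propagate(x, P, F, Q)
        x, P = update(x, P, np.array([z]), H, R)
    print("estimated position/velocity:", x)
    ```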

  7. The software analysis project for the Office of Human Resources

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for the retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil service employee with several years until retirement. The employee enters current salary and savings information as well as goals concerning salary at retirement, assumptions on inflation, and the return on investments. The program produces a picture of the employee's retirement income from all sources based on the assumptions entered. A session showing features of the program was conducted for key personnel at the Center. After analysis, it was decided to offer the program through the Learning Center starting in August 1994.

  8. HAL/S language specification. Version IR-542

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The formal HAL/S language specification is documented with particular referral to the essentials of HAL/S syntax and semantics. The language is intended to satisfy virtually all of the flight software requirements of NASA programs. To achieve this, HAL/S incorporates a wide range of features, including applications oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks.

  9. Software for visualization, analysis, and manipulation of laser scan images

    NASA Astrophysics Data System (ADS)

    Burnsides, Dennis B.

    1997-03-01

    The recent introduction of laser surface scanning to scientific applications presents a challenge to computer scientists and engineers. Full utilization of this two- dimensional (2-D) and three-dimensional (3-D) data requires advances in techniques and methods for data processing and visualization. This paper explores the development of software to support the visualization, analysis and manipulation of laser scan images. Specific examples presented are from on-going efforts at the Air Force Computerized Anthropometric Research and Design (CARD) Laboratory.

  10. Software Product Lines: Report of the 2009 U.S. Army Software Product Line Workshop

    DTIC Science & Technology

    2009-04-01

    record system was fielded in 2008. One early challenge for Overwatch was coming up with a funding model that would support core asset development (a...match the organizational model to the funding model . Product line architecture is essential. Address product line requirements up front. Put processes...when trying to move from a customer-driven, product-specific funding model to one in which at least some of the funds are allocated to the creation and

  11. IAC level "O" program development

    NASA Technical Reports Server (NTRS)

    Vos, R. G.

    1982-01-01

    The current status of the IAC development activity is summarized. The listed prototype software and documentation was delivered, and details were planned for development of the level 1 operational system. The planned end product IAC is required to support LSST design analysis and performance evaluation, with emphasis on the coupling of required technical disciplines. The long term IAC effectively provides two distinct features: a specific set of analysis modules (thermal, structural, controls, antenna radiation performance and instrument optical performance) that will function together with the IAC supporting software in an integrated and user friendly manner; and a general framework whereby new analysis modules can readily be incorporated into IAC or be allowed to communicate with it.

  12. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW. ...

  13. Why and how Mastering an Incremental and Iterative Software Development Process

    NASA Astrophysics Data System (ADS)

    Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe

    2004-06-01

    One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are going to be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages:
    - It permits systematic management and incorporation of requirements changes over the development cycle with a minimal cost. As far as possible the most dimensioning requirements are analyzed and developed in priority for validating very early the architecture concept without the details.
    - A software prototype is very quickly available. It improves the communication between system and software teams, as it enables to check very early and efficiently the common understanding of the system requirements.
    - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. Anyhow, it improves a lot the learning curve of the software team.
    These advantages seem very attractive, but mastering efficiently an iterative development process is not so easy and induces a lot of difficulties such as:
    - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable?
    - How to distinguish stable/unstable and dimensioning/standard requirements?
    - How to plan the development of each increment?
    - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc.
    Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and technological point of view:
    - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way.
    - How the CMM approach can help by better formalizing Requirements Management and Planning processes.
    - How the Automatic Code Generation with "certified" tools (SCADE) can still dramatically shorten the development cycle.
    Then the presentation will conclude by showing an evaluation of the cost and planning reduction based on a pilot application by comparing figures on two similar projects: one with the classical waterfall process, the other one with an iterative and incremental approach.

  14. Develop a Model Component

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose. The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.
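
    The switch model itself was built as a Simulink library component, so the fragment below is only a minimal sketch of the behavior described above, written in Python; the function name, cutoff value, and units are assumptions rather than details from the source.

    ```python
    def discrete_pressure_switch(pressure_psi, cutoff_psi):
        """Return True while flow is allowed; False once the cutoff is exceeded.

        Illustrative only: the real component was modeled as a Simulink library
        block, and the cutoff value and units here are assumptions.
        """
        return pressure_psi <= cutoff_psi

    # With an assumed 100 psi cutoff, 80 psi passes flow and 120 psi stops it.
    assert discrete_pressure_switch(80.0, 100.0) is True
    assert discrete_pressure_switch(120.0, 100.0) is False
    ```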

  15. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  16. A Methodology for Writing High Quality Requirement Specifications and for Evaluating Existing Ones

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hammer, Theodore

    1999-01-01

    Requirements development and management have always been critical in the implementation of software systems; engineers are unable to build what analysts cannot define. It is generally accepted that the earlier in the life cycle potential risks are identified the easier it is to eliminate or manage the conditions that introduce that risk. Problems that are not found until testing are approximately 14 times more costly to fix than if the problem was found in the requirement phase. The requirements specification, as the first tangible representation of the capability to be produced, establishes the basis for all of the project's engineering management and assurance functions. If the quality of the requirements specification is poor it can give rise to risks in all areas of the project. Recently, automated tools have become available to support requirements management. The use of these tools not only provides support in the definition and tracing of requirements, but it also opens the door to effective use of metrics in characterizing and assessing the quality of the requirement specifications.

  17. A Methodology for Writing High Quality Requirements Specification and Evaluating Existing Ones

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hammer, Theodore

    1999-01-01

    Requirements development and management have always been critical in the implementation of software systems; engineers are unable to build what analysts can't define. It is generally accepted that the earlier in the life cycle potential risks are identified the easier it is to eliminate or manage the conditions that introduce that risk. Problems that are not found until testing are approximately 14 times more costly to fix than if the problem was found in the requirement phase. The requirements specification, as the first tangible representation of the capability to be produced, establishes the basis for all of the project's engineering management and assurance functions. If the quality of the requirements specification is poor it can give rise to risks in all areas of the project. Recently, automated tools have become available to support requirements management. The use of these tools not only provides support in the definition and tracing of requirements, but it also opens the door to effective use of metrics in characterizing and assessing the quality of the requirement specifications.

  18. Computer programming for generating visual stimuli.

    PubMed

    Bukhari, Farhan; Kurylo, Daniel D

    2008-02-01

    Critical to vision research is the generation of visual displays with precise control over stimulus metrics. Generating stimuli often requires adapting commercial software or developing specialized software for specific research applications. In order to facilitate this process, we give here an overview that allows nonexpert users to generate and customize stimuli for vision research. We first give a review of relevant hardware and software considerations, to allow the selection of display hardware, operating system, programming language, and graphics packages most appropriate for specific research applications. We then describe the framework of a generic computer program that can be adapted for use with a broad range of experimental applications. Stimuli are generated in the context of trial events, allowing the display of text messages, the monitoring of subject responses and reaction times, and the inclusion of contingency algorithms. This approach allows direct control and management of computer-generated visual stimuli while utilizing the full capabilities of modern hardware and software systems. The flowchart and source code for the stimulus-generating program may be downloaded from www.psychonomic.org/archive.
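
    The authors' own program is available from the archive cited above; the fragment below is not taken from it. It is a hedged Python sketch of the trial-event structure the abstract outlines (present a stimulus, monitor the response, record the reaction time), with all function names assumed for illustration.

    ```python
    import time

    def run_trial(present_stimulus, poll_response, timeout_s=2.0):
        """One trial: show a stimulus, poll for a response, return (key, reaction time).

        present_stimulus and poll_response are hypothetical callables supplied by
        the experiment; poll_response returns None until the subject responds.
        """
        present_stimulus()
        start = time.monotonic()
        while time.monotonic() - start < timeout_s:
            key = poll_response()
            if key is not None:
                return key, time.monotonic() - start
        return None, None                # no response within the timeout window

    # A contingency algorithm can then branch on the returned key, for example to
    # adjust stimulus contrast on the next trial.
    ```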

  19. Distributed Visualization Project

    NASA Technical Reports Server (NTRS)

    Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca

    2016-01-01

    Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.

  20. IGDS/TRAP Interface Program (ITIP). Software User Manual (SUM). [network flow diagrams for coal gasification studies

    NASA Technical Reports Server (NTRS)

    Jefferys, S.; Johnson, W.; Lewis, R.; Rich, R.

    1981-01-01

    This specification establishes the requirements, concepts, and preliminary design for a set of software known as the IGDS/TRAP Interface Program (ITIP). This software provides the capability to develop at an Interactive Graphics Design System (IGDS) design station process flow diagrams for use by the NASA Coal Gasification Task Team. In addition, ITIP will use the Data Management and Retrieval System (DMRS) to maintain a data base from which a properly formatted input file to the Time-Line and Resources Analysis Program (TRAP) can be extracted. This set of software will reside on the PDP-11/70 and will become the primary interface between the Coal Gasification Task Team and IGDS, DMRS, and TRAP. The user manual for the computer program is presented.

  1. Design and Testing of Space Telemetry SCA Waveform

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Handler, Louis M.; Quinn, Todd M.

    2006-01-01

    A Software Communications Architecture (SCA) Waveform for space telemetry is being developed at the NASA Glenn Research Center (GRC). The space telemetry waveform is implemented in a laboratory testbed consisting of general purpose processors, field programmable gate arrays (FPGAs), analog-to-digital converters (ADCs), and digital-to-analog converters (DACs). The radio hardware is integrated with an SCA Core Framework and other software development tools. The waveform design is described from both the bottom-up signal processing and top-down software component perspectives. Simulations and model-based design techniques used for signal processing subsystems are presented. Testing with legacy hardware-based modems verifies proper design implementation and dynamic waveform operations. The waveform development is part of an effort by NASA to define an open architecture for space-based reconfigurable transceivers. Use of the SCA as a reference has increased understanding of software defined radio architectures. However, since space requirements put a premium on size, mass, and power, the SCA may be impractical for today's space-ready technology. Specific requirements for an SCA waveform and other lessons learned from this development are discussed.

  2. Using virtual reality for science mission planning: A Mars Pathfinder case

    NASA Technical Reports Server (NTRS)

    Kim, Jacqueline H.; Weidner, Richard J.; Sacks, Allan L.

    1994-01-01

    NASA's Mars Pathfinder Project requires a Ground Data System (GDS) that supports both engineering and scientific payloads with reduced mission operations staffing and short planning schedules. Also, successful surface operation of the lander camera requires efficient mission planning and accurate pointing of the camera. To meet these challenges, a new software strategy was developed that integrates virtual reality technology with existing navigational ancillary information and image processing capabilities. The result is an interactive, workstation-based application software that provides a high-resolution, 3-dimensional, stereo display of Mars as if it were viewed through the lander camera. The design, implementation strategy, and parametric specification phases for the development of this software were completed, and the prototype tested. When completed, the software will allow scientists and mission planners to access simulated and actual scenes of Mars' surface. The perspective from the lander camera will enable scientists to plan activities more accurately and completely. The application will also support the sequence and command generation process and will allow testing and verification of camera pointing commands via simulation.

  3. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  4. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

    The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post processing are also presented. Several challenges of extremely-fast, high volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real-time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.

  5. Plant Habitat Telemetry / Command Interface and E-MIST

    NASA Technical Reports Server (NTRS)

    Walker, Uriae M.

    2013-01-01

    Plant Habitat (PH) is an experiment to be taken to the International Space Station (ISS) in 2016. It is critical that ground support computers have the ability to uplink commands to control PH, and that ISS computers have the ability to downlink PH telemetry data to ground support. This necessitates communication software that can send, receive, and process PH-specific commands and telemetry. The objective of the Plant Habitat Telemetry/Command Interface is to provide this communication software, and to couple it with an intuitive Graphical User Interface (GUI). Initial investigation of the project objective led to the decision that code be written in C++ because of its compatibility with existing source code infrastructures and robustness. Further investigation led to a determination that multiple Ethernet packet structures would need to be created to effectively transmit data. Setting a standard for packet structures would allow us to distinguish these packets, which would range from command-type packets to sub-categories of telemetry packets. In order to handle this range of packet types, the conclusion was made to take an object-oriented programming approach, which complemented our decision to use the C++ programming language. In addition, extensive utilization of port programming concepts was required to implement the core functionality of the communication software. Also, a concrete understanding of packet processing software was required in order to put all the components of ISS-to-Ground Support Equipment (GSE) communication together and complete the objective. A second project discussed in this paper is Exposing Microbes to the Stratosphere (E-MIST). This project exposes microbes to the stratosphere to observe how they are impacted by atmospheric effects. This paper focuses on the electrical and software expectations of the project, specifically drafting the printed circuit board and programming the on-board sensors. The Eagle Computer-Aided Drafting (CAD) software was used to draft the E-MIST circuit. This required several component libraries to be created. Coding the sensors and obtaining sensor data involved using the Arduino Uno development board and coding language, and properly wiring peripheral sensors to the microcontroller (the central control unit of the experiment).
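
    The flight and ground code discussed here was written in C++, and the actual packet layouts are not given in the abstract. The sketch below is therefore only an illustration of the general idea of distinguishing command and telemetry packets by a typed header; the field sizes and type codes are assumptions.

    ```python
    import struct

    # Hypothetical header: type (1 byte), payload length (2 bytes), sequence (4 bytes).
    HEADER_FORMAT = ">BHI"
    HEADER_SIZE = struct.calcsize(HEADER_FORMAT)
    PACKET_TYPE_COMMAND = 0x01
    PACKET_TYPE_TELEMETRY = 0x02

    def build_packet(packet_type, sequence, payload):
        """Prefix a payload with the assumed header."""
        return struct.pack(HEADER_FORMAT, packet_type, len(payload), sequence) + payload

    def parse_packet(data):
        """Split a packet back into (type, sequence, payload)."""
        packet_type, length, sequence = struct.unpack_from(HEADER_FORMAT, data)
        return packet_type, sequence, data[HEADER_SIZE:HEADER_SIZE + length]

    pkt = build_packet(PACKET_TYPE_TELEMETRY, 42, b"\x01\x02")
    assert parse_packet(pkt) == (PACKET_TYPE_TELEMETRY, 42, b"\x01\x02")
    ```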

  6. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  7. MultiLIS: A Description of the System Design and Operational Features.

    ERIC Educational Resources Information Center

    Kelly, Glen J.; And Others

    1988-01-01

    Describes development, hardware requirements, and features of the MultiLIS integrated library software package. A system profile provides pricing information, operational characteristics, and technical specifications. Sidebars discuss MultiLIS integration structure, incremental architecture, and NCR Tower Computers. (4 references) (MES)

  8. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to assure continued...

  9. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  10. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  11. Requirements: Towards an understanding on why software projects fail

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

    Requirement engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky and could also be life threatening. Projects that undermine requirements engineering suffer or are likely to suffer from failures, challenges and other attending risks. The estimated cost of project failures and overruns is very large. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment. They affect the company's image, goodwill, and revenue drive and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering was discussed and its role in software project success was elaborated. The place of the software requirements process in relation to software project failure was explored and examined. Project success and failure factors were also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes and failures. The paper relied on secondary data and empirical statistics to explore and examine factors responsible for the successes, challenges and failures of software projects in large, medium and small-scale software companies.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guyer, H.B.; McChesney, C.A.

    The overall primary objective of HDAR is to create a repository of historical personnel security documents and provide the functionality needed for archival and retrieval use by other software modules and application users of the DISS/ET system. The software product to be produced from this specification is the Historical Document Archival and Retrieval Subsystem. The product will provide the functionality to capture, retrieve and manage documents currently contained in the personnel security folders in DOE Operations Offices vaults at various locations across the United States. The long-term plan for DISS/ET includes the requirement to allow for capture and storage of arbitrary, currently undefined, clearance-related documents that fall outside the scope of the "cradle-to-grave" electronic processing provided by DISS/ET. However, this requirement is not within the scope of the requirements specified in this document.

  13. Repository-based software engineering program: Concept document

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  14. Development of an expert system prototype for determining software functional requirements for command management activities at NASA Goddard

    NASA Technical Reports Server (NTRS)

    Liebowitz, J.

    1986-01-01

    The development of an expert system prototype for software functional requirement determination for NASA Goddard's Command Management System, as part of its process of transforming general requests into specific near-earth satellite commands, is described. The present knowledge base was formulated through interactions with domain experts, and was then linked to the existing Knowledge Engineering Systems (KES) expert system application generator. Steps in the knowledge-base development include problem-oriented attribute hierarchy development, knowledge management approach determination, and knowledge base encoding. The KES Parser and Inspector, in addition to backcasting and analogical mapping, were used to validate the expert-system-derived requirements for one of the major functions of a spacecraft, the Solar Maximum Mission. Knowledge refinement, evaluation, and implementation procedures of the expert system were then accomplished.

  15. Energy Emergency Management Information System (EEMIS): Functional requirements

    NASA Astrophysics Data System (ADS)

    1980-10-01

    These guidelines state that in order to create the widest practicable competition, the system's requirements, with few exceptions, must be expressed in functional terms without reference to specific hardware or software products, and that wherever exceptions are made a statement of justification must be provided. In addition, these guidelines set forth a recommended maximum threshold limit of annual contract value for schedule contract procurements.

  16. Precise Documentation: The Key to Better Software

    NASA Astrophysics Data System (ADS)

    Parnas, David Lorge

    The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, Engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of Engineering much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  17. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  18. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  19. Needs challenge software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-07-01

    New hardware and software tools build on existing platforms and add performance and ease-of-use benefits as the struggle to find and produce hydrocarbons at the lowest cost becomes more and more competitive. Software tools now provide geoscientists and petroleum engineers with a better understanding of reservoirs, from the shape and makeup of formations to behavior projections as hydrocarbons are extracted. Petroleum software tools allow scientists to simulate oil flow, predict the life expectancy of a reservoir, and even help determine how to extend the life and economic viability of the reservoir. The requirement of the petroleum industry to find and extract petroleum more efficiently drives the solutions provided by software and service companies. To one extent or another, most of the petroleum software products available today have achieved an acceptable level of competency. Innovative, high-impact products from small, focused companies often were bought out by larger companies with deeper pockets if their developers couldn't fund their expansion. Other products disappeared from the scene because they were unable to evolve fast enough to compete. There are still enough small companies around producing excellent products to prevent the marketplace from feeling too narrow and lacking in choice. Oil companies requiring specific solutions to their problems have helped fund product development within the commercial sector. As the industry has matured, strategic alliances between vendors, both hardware and software, have provided market advantages, often combining strengths to enter new and undeveloped areas for technology. The pace of technological development has been fast and constant.

  20. Are You Listening to Your Computer?

    ERIC Educational Resources Information Center

    Shugg, Alan

    1992-01-01

    Accepting the great motivational value of computers in second-language learning, this article describes ways to use authentic language recorded on a computer with HyperCard. Graphics, sound, and hardware/software requirements are noted, along with brief descriptions of programming with sound and specific programs. (LB)

  1. The Future of Library Automation in Schools.

    ERIC Educational Resources Information Center

    Anderson, Elaine

    2000-01-01

    Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…

  2. Psychosocial Risks Generated By Assets Specific Design Software

    NASA Astrophysics Data System (ADS)

    Remus, Furtună; Angela, Domnariu; Petru, Lazăr

    2015-07-01

    The human activity concerning an occupation results from the interaction between psycho-biological, socio-cultural and organizational-occupational factors. Technological development, automation and computerization, which are to be found in all branches of activity, the speed at which things develop, as well as their growing complexity, require fewer and fewer physical aptitudes and more cognitive qualifications. The person included in the work process is bound in most cases to adapt to the organizational-occupational situations that are specific to the demands of the job. The role of the programmer is essential in the process of executing commissioned software, and truly brilliant ideas can only come from well-rested minds concentrated on their tasks. The actual requirements of such jobs, besides the high number of benefits and opportunities, also create a series of psycho-social risks, which can increase the level of stress during work activity, especially for those who work under pressure.

  3. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
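
    The paper does not include source code, so the Python sketch below only illustrates the general factory-pattern idea of selecting a map product from an input design specification; the map types and specification fields shown are assumptions, not MapFactory's actual interface.

    ```python
    class ChoroplethMap:
        def render(self, features):
            return f"choropleth map of {len(features)} features"

    class HeatMap:
        def render(self, features):
            return f"heat map of {len(features)} features"

    # Registry of available map products; adding a new map type means adding an entry.
    MAP_TYPES = {"choropleth": ChoroplethMap, "heat": HeatMap}

    def map_factory(design_spec):
        """Create a map object from an (input) design specification, here a dict."""
        return MAP_TYPES[design_spec["map_type"]]()

    print(map_factory({"map_type": "heat"}).render([1, 2, 3]))
    ```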

  4. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
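
    The internal representation used by DFPN is not published in this notice, so the fragment below is only a hedged sketch of the basic Petri-net firing rule that makes such execution tracing possible: a transition is enabled when every input place holds a token, and firing it moves tokens from inputs to outputs.

    ```python
    # Minimal Petri-net sketch (not the DFPN tool itself): places hold token counts,
    # and a transition may fire only when every one of its input places has a token.
    def enabled(marking, transition):
        return all(marking[p] > 0 for p in transition["inputs"])

    def fire(marking, transition):
        if not enabled(marking, transition):
            raise ValueError("transition is not enabled")
        new_marking = dict(marking)
        for p in transition["inputs"]:
            new_marking[p] -= 1
        for p in transition["outputs"]:
            new_marking[p] += 1
        return new_marking

    marking = {"data_read": 1, "data_validated": 0}
    t_validate = {"inputs": ["data_read"], "outputs": ["data_validated"]}
    marking = fire(marking, t_validate)   # -> {"data_read": 0, "data_validated": 1}
    ```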

  5. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    NASA Astrophysics Data System (ADS)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginnings of man's space flight activities there was the belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a big influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on the future development of the GUI framework, where the potential for advancements will be shown.

  6. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  7. Formal specification and mechanical verification of SIFT - A fault-tolerant flight control system

    NASA Technical Reports Server (NTRS)

    Melliar-Smith, P. M.; Schwartz, R. L.

    1982-01-01

    The paper describes the methodology being employed to demonstrate rigorously that the SIFT (software-implemented fault-tolerant) computer meets its requirements. The methodology uses a hierarchy of design specifications, expressed in the mathematical domain of multisorted first-order predicate calculus. The most abstract of these, from which almost all details of mechanization have been removed, represents the requirements on the system for reliability and intended functionality. Successive specifications in the hierarchy add design and implementation detail until the PASCAL programs implementing the SIFT executive are reached. A formal proof that a SIFT system in a 'safe' state operates correctly despite the presence of arbitrary faults has been completed all the way from the most abstract specifications to the PASCAL program.

  8. A CMMI-based approach for medical software project life cycle study.

    PubMed

    Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi

    2013-01-01

    In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. This application was intended to integrate user requirements, system design and testing of software development processes into a three-layer (Domain, Concept and Instance) model, expressed in structural Systems Modeling Language (SysML) diagrams, and to convert part of the manual effort necessary for project management maintenance into computational effort, for example (semi-)automatic delivery of traceability management. In this application, it supports establishing artifacts of "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype of a lightweight project management tool on the nuclear medicine software project. The results of this application can be a reference for other medical institutions in developing medical information systems and support of project management to achieve the aim of patient safety.

  9. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (Very High Speed Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant to NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform-specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc., and the interconnections among different radio components.
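
    STRS-compliant waveforms on the SCaN Testbed are implemented in languages such as C/C++ and VHDL against interfaces defined by the STRS standard; the sketch below is not that API. It is only a Python illustration of the underlying idea of hiding platform-specific hardware behind an abstract interface so the same waveform logic can run on different radios; all class and method names are assumptions.

    ```python
    from abc import ABC, abstractmethod

    class PlatformHAL(ABC):
        """Abstract layer standing in for platform-specific radio hardware."""
        @abstractmethod
        def set_carrier_frequency(self, hz):
            ...

        @abstractmethod
        def transmit(self, samples):
            ...

    class LabTestbedHAL(PlatformHAL):
        """Hypothetical concrete platform; a flight radio would supply another."""
        def set_carrier_frequency(self, hz):
            print(f"tuning to {hz} Hz")

        def transmit(self, samples):
            print(f"sending {len(samples)} bytes")

    def run_waveform(hal):
        """Waveform logic written only against the abstract interface."""
        hal.set_carrier_frequency(2.2e9)   # illustrative S-band value, not from the source
        hal.transmit(b"\x00\x01\x02")

    run_waveform(LabTestbedHAL())
    ```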

  10. A 'user friendly' geographic information system in a color interactive digital image processing system environment

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Goldberg, M.

    1982-01-01

    NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.

  11. A study of interactive control scheduling and economic assessment for robotic systems

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A class of interactive control systems is derived by generalizing interactive manipulator control systems. Tasks of interactive control systems can be represented as a network of a finite set of actions which have specific operational characteristics and specific resource requirements, and which are of limited duration. This has enabled the decomposition of the overall control algorithm into parts that execute simultaneously and asynchronously. The performance benefits of sensor-referenced and computer-aided control of manipulators in a complex environment are evaluated. The first phase of the CURV arm control system software development and the basic features of the control algorithms and their software implementation are presented. An optimal solution for a production scheduling problem that will be easy to implement in practical situations is investigated.

  12. Software to Control and Monitor Gas Streams

    NASA Technical Reports Server (NTRS)

    Arkin, C.; Curley, Charles; Gore, Eric; Floyd, David; Lucas, Damion

    2012-01-01

    This software package interfaces with various gas stream devices such as pressure transducers, flow meters, flow controllers, valves, and analyzers such as a mass spectrometer. The software provides excellent user interfacing with various windows that provide time-domain graphs, valve state buttons, priority-colored messages, and warning icons. The user can configure the software to save as much or as little data as needed to a comma-delimited file. The software also includes an intuitive scripting language for automated processing. The configuration allows for the assignment of measured values or calibration so that raw signals can be viewed as usable pressures, flows, or concentrations in real time. The software is based on those used in two safety systems for shuttle processing and one volcanic gas analysis system. Mass analyzers typically have very unique applications and vary from job to job. As such, software available on the market is usually inadequate or targeted at a specific application (such as EPA methods). The goal was to develop powerful software that could be used with prototype systems. The key problem was to generalize the software to be easily and quickly reconfigurable. At Kennedy Space Center (KSC), the prior art consists of two primary methods. The first method was to utilize LabVIEW and a commercial data acquisition system. This method required rewriting code for each different application and only provided raw data. To obtain data in engineering units, manual calculations were required. The second method was to utilize one of the embedded computer systems developed for another system. This second method had the benefit of providing data in engineering units, but was limited in the number of control parameters.
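
    The abstract notes that calibrations let raw signals be viewed as usable pressures, flows, or concentrations in real time. The exact calibration scheme is not described, so the sketch below assumes a simple two-point linear calibration with illustrative values.

    ```python
    def make_linear_calibration(raw_lo, raw_hi, eng_lo, eng_hi):
        """Return a function mapping a raw reading to engineering units (two-point fit)."""
        slope = (eng_hi - eng_lo) / (raw_hi - raw_lo)
        return lambda raw: eng_lo + slope * (raw - raw_lo)

    # Illustrative values: a 4-20 mA pressure transducer spanning 0-500 psig.
    to_psig = make_linear_calibration(4.0, 20.0, 0.0, 500.0)
    print(to_psig(12.0))   # mid-scale reading -> 250.0 psig
    ```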

  13. Bonsai: an event-based framework for processing and controlling data streams

    PubMed Central

    Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.

    2015-01-01

    The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861

  14. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically, the use of questionnaire research instruments was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected and this can be the cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations or the required process improvement techniques; as such their claimed standards are not aligned to those of accredited bodies, and are only limited to their local experience and knowledge, which makes it questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using the Cronbach's alpha, and it was proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because its effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.

  15. 21 CFR 801.50 - Labeling requirements for stand-alone software.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Labeling requirements for stand-alone software....50 Labeling requirements for stand-alone software. (a) Stand-alone software that is not distributed... in packaged form, stand-alone software regulated as a medical device must provide its unique device...

  16. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process; this is the goal of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  17. Space shuttle orbiter guidance, navigation and control software functional requirements: Horizontal flight operations

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The shuttle GN&C software functions for horizontal flight operations are defined. Software functional requirements are grouped into two categories: first horizontal flight requirements and full mission horizontal flight requirements. The document provides the initial step in the shuttle GN&C software design process. It also serves as a management tool to identify analyses which are required to define requirements.

  18. Ergonomic Considerations for the Human Environment: Color Treatment, Lighting, and Furniture Selection.

    ERIC Educational Resources Information Center

    Robertson, Michelle M.

    1992-01-01

    Discusses ergonomic design considerations for library media centers. Specific design variables, including temperature and humidity, noise, illumination, color, and windows are discussed; and computer workstation design requirements are presented that address furniture and keyboard design, monitor and display features, software issues, and…

  19. Pulling the Internet Together with Mosaic.

    ERIC Educational Resources Information Center

    Sheehan, Mark

    1995-01-01

    Presents the history of the Internet with specific emphasis on Mosaic; discusses hypertext and hypermedia information; and describes software and hardware requirements. Sidebars include information on the National Center for Supercomputing Applications (NCSA); World Wide Web browsers for use in Windows, Macintosh, and X-Windows (UNIX); and…

  20. HAL/S-FC compiler system functional specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. The run-time software support package and the restrictions and dependencies of the HAL/S-FC system are also considered.

  1. Software Requirements Specification of A Proposed Plant Property Management Information System for the Naval Postgraduate School.

    DTIC Science & Technology

    1982-12-01

    information system (MIS) to support Plant Account equipment-related functions could eliminate data handling redundancy and improve Plant Account...system users and accountability for more than 2000 individual equipment items worth over seven million dollars. Implementation of a management

  2. Design of Inhouse Automated Library Systems.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1984-01-01

    Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…

  3. Advanced information processing system: Inter-computer communication services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.

    1991-01-01

    The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.

  4. System requirements specification for SMART structures mode

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Specified here are the functional and informational requirements for software modules which address the geometric and data modeling needs of the aerospace structural engineer. The modules are to be included as part of the Solid Modeling Aerospace Research Tool (SMART) package developed for the Vehicle Analysis Branch (VAB) at the NASA Langley Research Center (LaRC). The purpose is to precisely state what the SMART Structures modules will do, without consideration of how it will be done. Each requirement is numbered for reference in development and testing.

  5. Space station operating system study

    NASA Technical Reports Server (NTRS)

    Horn, Albert E.; Harwell, Morris C.

    1988-01-01

    The current phase of the Space Station Operating System study is based on the analysis, evaluation, and comparison of the operating systems implemented on the computer systems and workstations in the software development laboratory. Primary emphasis has been placed on the DEC MicroVMS operating system as implemented on the MicroVax II computer, with comparative analysis of the SUN UNIX system on the SUN 3/260 workstation computer, and to a limited extent, the IBM PC/AT microcomputer running PC-DOS. Some benchmark development and testing was also done for the Motorola MC68010 (VM03 system) before the system was taken from the laboratory. These systems were studied with the objective of determining their capability to support Space Station software development requirements, specifically for multi-tasking and real-time applications. The methodology utilized consisted of development, execution, and analysis of benchmark programs and test software, and the experimentation and analysis of specific features of the system or compilers in the study.

  6. Software package for modeling spin–orbit motion in storage rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zyuzin, D. V., E-mail: d.zyuzin@fz-juelich.de

    2015-12-15

    A software package providing a graphical user interface for computer experiments on the motion of charged particle beams in accelerators, as well as analysis of obtained data, is presented. The software package was tested in the framework of the international project on electric dipole moment measurement JEDI (Jülich Electric Dipole moment Investigations). The specific features of particle spin motion imply the requirement to use a cyclic accelerator (storage ring) consisting of electrostatic elements, which makes it possible to preserve horizontal polarization for a long time. Computer experiments study the dynamics of 10^6–10^9 particles in a beam during 10^9 turns in an accelerator (about 10^12–10^15 integration steps for the equations of motion). For designing an optimal accelerator structure, a large number of computer experiments on polarized beam dynamics are required. The numerical core of the package is COSY Infinity, a program for modeling spin–orbit dynamics.

  7. Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)

    NASA Technical Reports Server (NTRS)

    Niewoehner, Kevin R.; Carter, John (Technical Monitor)

    2001-01-01

    The research accomplishments for the cooperative agreement 'Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)' include the following: (1) previous IFC program data collection and analysis; (2) IFC program support site (configured IFC systems support network, configured Tornado/VxWorks OS development system, made Configuration and Documentation Management Systems Internet accessible); (3) Airborne Research Test Systems (ARTS) II Hardware (developed hardware requirements specification, developing environmental testing requirements, hardware design, and hardware design development); (4) ARTS II software development laboratory unit (procurement of lab style hardware, configured lab style hardware, and designed interface module equivalent to ARTS II faceplate); (5) program support documentation (developed software development plan, configuration management plan, and software verification and validation plan); (6) LWR algorithm analysis (performed timing and profiling on algorithm); (7) pre-trained neural network analysis; (8) Dynamic Cell Structures (DCS) Neural Network Analysis (performing timing and profiling on algorithm); and (9) conducted technical interchange and quarterly meetings to define IFC research goals.

  8. Multiplex primer prediction software for divergent targets

    PubMed Central

    Gardner, Shea N.; Hiddessen, Amy L.; Williams, Peter L.; Hara, Christine; Wagner, Mark C.; Colston, Bill W.

    2009-01-01

    We describe a Multiplex Primer Prediction (MPP) algorithm to build multiplex compatible primer sets to amplify all members of large, diverse and unalignable sets of target sequences. The MPP algorithm is scalable to larger target sets than other available software, and it does not require a multiple sequence alignment. We applied it to questions in viral detection, and demonstrated that there are no universally conserved priming sequences among viruses and that it could require an unfeasibly large number of primers (∼3700 18-mers or ∼2000 10-mers) to generate amplicons from all sequenced viruses. We then designed primer sets separately for each viral family, and for several diverse species such as foot-and-mouth disease virus (FMDV), hemagglutinin (HA) and neuraminidase (NA) segments of influenza A virus, Norwalk virus, and HIV-1. We empirically demonstrated the application of the software with a multiplex set of 16 short (10 nt) primers designed to amplify the Poxviridae family to produce a specific amplicon from vaccinia virus. PMID:19759213
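
    The published MPP algorithm is not reproduced in this record; purely as an illustration of the underlying idea (greedily choosing a small set of short primers that together cover every target sequence, with no multiple sequence alignment), a rough sketch follows. The target sequences, primer length, and coverage test are invented placeholders, not the authors' method.

      # Illustrative greedy primer selection (hypothetical; not the published MPP
      # algorithm). A primer "covers" the targets in which it occurs as an exact
      # substring; primers are added greedily until every target is covered.
      def candidate_primers(targets, k=10):
          """Enumerate every k-mer that occurs in at least one target sequence."""
          kmers = set()
          for seq in targets:
              for i in range(len(seq) - k + 1):
                  kmers.add(seq[i:i + k])
          return kmers

      def greedy_primer_set(targets, k=10):
          uncovered = set(range(len(targets)))
          chosen = []
          candidates = candidate_primers(targets, k)
          while uncovered:
              # Pick the primer that hits the most still-uncovered targets.
              best = max(candidates,
                         key=lambda p: sum(1 for t in uncovered if p in targets[t]))
              hit = {t for t in uncovered if best in targets[t]}
              if not hit:
                  break  # remaining targets share no k-mer with any candidate
              chosen.append(best)
              uncovered -= hit
          return chosen, uncovered

      if __name__ == "__main__":
          demo = ["ACGTACGTTAGCTAGCTA", "TTAGCTAGCTAACGGTCA", "GGGTACGTACGTTAGCAA"]
          primers, missed = greedy_primer_set(demo, k=6)
          print("primers:", primers, "uncovered:", missed)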

  9. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is shared commonly between the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group requires that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  10. Digital Partnerships for Health: Steps to develop a community-specific health portal aimed at promoting health and well-being

    PubMed Central

    Kukafka, Rita; Khan, Sharib A.; Hutchinson, Carly; McFarlane, Delano J.; Li, Jianhua; Ancker, Jessica S.; Cohall, Alwyn

    2007-01-01

    We describe the steps taken by the Harlem Health Promotion Center to develop a community-specific health web portal aimed at promoting health and well-being in Harlem. Methods and results that begin with data collection and move onto elucidating requirements for the web portal are discussed. Sentiments of distrust in medical institutions, and the desire for community specific content and resources were among the needs emanating from our data analysis. These findings guided our decision to customize social software designed to foster connections, collaborations, flexibility, and interactivity; an “architecture of participation”. While we maintain that the leveraging of social software may indeed be the way to build healthy communities and support learning and engagement in underserved communities, our conclusion calls for careful thinking, testing and evaluation research to establish best practice models for leveraging these emerging technologies to support health improvements in the community. PMID:18693872

  11. 75 FR 16817 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...

  12. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  13. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    NASA Astrophysics Data System (ADS)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  14. An Open-Source Hardware and Software System for Acquisition and Real-Time Processing of Electrophysiology during High Field MRI

    PubMed Central

    Purdon, Patrick L.; Millan, Hernan; Fuller, Peter L.; Bonmassar, Giorgio

    2008-01-01

    Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open source system for simultaneous electrophysiology and fMRI featuring low-noise (< 0.6 uV p-p input noise), electromagnetic compatibility for MRI (tested up to 7 Tesla), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7 Tesla examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3 Tesla fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level. PMID:18761038

  15. An open-source hardware and software system for acquisition and real-time processing of electrophysiology during high field MRI.

    PubMed

    Purdon, Patrick L; Millan, Hernan; Fuller, Peter L; Bonmassar, Giorgio

    2008-11-15

    Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open-source system for simultaneous electrophysiology and fMRI featuring low-noise (<0.6microV p-p input noise), electromagnetic compatibility for MRI (tested up to 7T), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7T examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3T fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level.

  16. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  17. Self-assembling software generator

    DOEpatents

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
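
    The patent abstract above states the three inspection steps only at a high level. A minimal sketch of the idea, in which a task-specification data structure drives the generation of callable entities, their linking, and the logic they execute, might look like the following; the specification format and entity model are invented for illustration and are not taken from the patent.

      # Minimal sketch: a task specification drives generation of callable "entities",
      # the links between them, and the logic each one runs (illustrative only).
      task_spec = {
          "entities": {
              "read":   {"logic": lambda _: [3, 1, 2]},        # produce some data
              "sort":   {"logic": lambda xs: sorted(xs)},      # transform it
              "report": {"logic": lambda xs: print("result:", xs)},
          },
          "links": [("read", "sort"), ("sort", "report")],     # how entities are wired
      }

      def generate_task(spec):
          """Inspect the specification and assemble an executable pipeline."""
          entities = {name: cfg["logic"] for name, cfg in spec["entities"].items()}

          def run():
              # Follow the declared links, passing each entity's output to the next.
              order = [spec["links"][0][0]] + [dst for _, dst in spec["links"]]
              value = None
              for name in order:
                  value = entities[name](value)
              return value

          return run

      if __name__ == "__main__":
          task = generate_task(task_spec)
          task()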

  18. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for ascertaining formally, a software safety risk assessment, that provides measurements for software safety for legacy systems which may or may not have a suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  19. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database, but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The software resides on the server and does not require specific software on the user's machine other than an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. By means of this addition of "child processes" of nodes, a network that reflects the development of a project is generated.

  20. Cardiovascular imaging environment: will the future be cloud-based?

    PubMed

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Beyond basic assessment of volume and function in cardiac magnetic resonance imaging, for example, more sophisticated quantitative analysis is requested, requiring specific software. Several institutions cannot afford the various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist related to data storage and analysis, specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and offering sophisticated cardiovascular image analysis to institutions of all sizes.

  1. Software Analyzes Complex Systems in Real Time

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts, related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real time and in limited hardware environments, and to be utilized by non-expert systems applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.
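
    SHINE itself is not described in enough detail here to reproduce; as a toy illustration of the kind of knowledge-based inference the abstract refers to (rules firing over telemetry facts to produce diagnoses), a minimal forward-chaining loop is sketched below. The facts, thresholds, and rules are invented.

      # Toy forward-chaining inference over spacecraft-style telemetry (illustrative
      # only; not the SHINE engine). Each rule fires when its condition holds over
      # the known facts and asserts a new fact, until no rule adds anything new.
      facts = {"bus_voltage": 24.0, "battery_temp": 61.0, "heater_on": True}

      rules = [
          ("battery_overtemp", lambda f: f.get("battery_temp", 0) > 60.0),
          ("heater_fault",     lambda f: f.get("battery_overtemp") and f.get("heater_on")),
          ("safe_mode_advice", lambda f: f.get("heater_fault")),
      ]

      def forward_chain(facts, rules):
          changed = True
          while changed:
              changed = False
              for name, condition in rules:
                  if name not in facts and condition(facts):
                      facts[name] = True
                      changed = True
          return facts

      if __name__ == "__main__":
          result = forward_chain(dict(facts), rules)
          print([k for k, v in result.items() if v is True])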

  2. iSDS: a self-configurable software-defined storage system for enterprise

    NASA Astrophysics Data System (ADS)

    Chen, Wen-Shyen Eric; Huang, Chun-Fang; Huang, Ming-Jen

    2018-01-01

    Storage is one of the most important aspects of IT infrastructure for various enterprises. But enterprises are interested in more than just data storage; they are interested in such things as more reliable data protection, higher performance and reduced resource consumption. Traditional enterprise-grade storage satisfies these requirements at high cost, because it is usually designed and constructed with customised field-programmable gate arrays to achieve high-end functionality. However, in this ever-changing environment, enterprises request storage with more flexible deployment and at lower cost. Moreover, the rise of new application fields, such as social media, big data, video streaming services etc., makes operational tasks for administrators more complex. In this article, a new storage system called intelligent software-defined storage (iSDS), based on software-defined storage, is described. More specifically, this approach advocates using software to replace features provided by traditional customised chips. To alleviate the management burden, it also advocates applying machine learning to automatically configure storage to meet the dynamic requirements of workloads running on storage. This article focuses on the analysis feature of the iSDS cluster by detailing its architecture and design.

  3. High-performance software-only H.261 video compression on PC

    NASA Astrophysics Data System (ADS)

    Kasperovich, Leonid

    1996-03-01

    This paper describes an implementation of a software H.261 codec for the PC that takes advantage of the fast computational algorithms for DCT-based video compression presented by the author at the February 1995 SPIE/IS&T meeting. The motivation for developing the H.261 prototype system is to demonstrate the feasibility of a real-time software-only videoconferencing solution operating across a wide range of network bandwidths, frame rates, and resolutions of the input video. As the bandwidth of current network technology increases, video of higher frame rate and resolution may be transmitted, which in turn requires a software codec able to compress pictures of CIF (352 X 288) resolution at up to 30 frames/sec. Running on a 133 MHz Pentium PC, the codec presented is capable of compressing video in CIF format at 21 - 23 frames/sec. This result is comparable to the known hardware-based H.261 solutions, but it doesn't require any specific hardware. The methods to achieve high performance and the program optimization technique for the Pentium microprocessor, along with the performance profile showing the actual contribution of the different encoding/decoding stages to the overall computational process, are presented.
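
    The fast DCT algorithms referenced in the abstract are not reproduced here; for orientation only, a plain (unoptimized) 8x8 two-dimensional DCT-II of the kind such codecs accelerate is sketched below.

      # Reference (unoptimized) 8x8 two-dimensional DCT-II, the transform that fast
      # algorithms such as those cited in the abstract are designed to speed up.
      import math

      N = 8

      def dct_2d(block):
          """block: 8x8 list of pixel values; returns an 8x8 list of DCT coefficients."""
          def c(k):
              return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
          out = [[0.0] * N for _ in range(N)]
          for u in range(N):
              for v in range(N):
                  s = 0.0
                  for x in range(N):
                      for y in range(N):
                          s += (block[x][y]
                                * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                                * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
                  out[u][v] = c(u) * c(v) * s
          return out

      if __name__ == "__main__":
          flat = [[128] * N for _ in range(N)]      # a constant block
          coeffs = dct_2d(flat)
          print(round(coeffs[0][0], 1))              # only the DC term is non-zero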

  4. Integration of an expert system into a user interface language demonstration

    NASA Technical Reports Server (NTRS)

    Stclair, D. C.

    1986-01-01

    The need for a User Interface Language (UIL) has been recognized by the Space Station Program Office as a necessary tool to aid in minimizing the cost of software generation by multiple users. Previous history in the Space Shuttle Program has shown that many different areas of software generation, such as operations, integration, testing, etc., have each used a different user command language although the types of operations being performed were similar in many respects. Since the Space Station represents a much more complex software task, a common user command language--a user interface language--is required to support the large spectrum of space station software developers and users. To assist in the selection of an appropriate set of definitions for a UIL, a series of demonstration programs was generated with which to test UIL concepts against specific Space Station scenarios using operators for the astronaut and scientific community. Because of the importance of expert systems in the space station, it was decided that an expert system should be embedded in the UIL. This would not only provide insight into the UIL components required but would also indicate the effectiveness with which an expert system could function in such an environment.

  5. Software Requirements for the Move to Unix

    NASA Astrophysics Data System (ADS)

    Rees, Paul

    This document provides information concerning the software requirements of each STARLINK site to move entirely to UNIX. It provides a list of proposed UNIX migration deadlines for all sites and lists of software requirements, both STARLINK and non-STARLINK software, which must be met before the existing VMS hardware can be switched off. The information presented in this document is used for the planning of software porting and distribution activities and also for setting realistic migration deadlines for STARLINK sites. The information on software requirements has been provided by STARLINK Site Managers.

  6. Changes and challenges in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose

    1994-01-01

    Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD), develops, maintains, and manages complex flight dynamics systems. The SEL is composed of three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation. During the past 18 years, the SEL's overall goal has remained the same: to improve the FDD's software products and processes in a measured manner. This requires that each development and maintenance effort be viewed, in part, as a SEL experiment which examines a specific technology or builds a model of interest for use on subsequent efforts. The SEL has undertaken many technology studies while developing operational support systems for numerous NASA spacecraft missions.

  7. LCG Persistency Framework (CORAL, COOL, POOL): Status and Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valassi, A.; /CERN; Clemencic, M.

    2012-04-19

    The Persistency Framework consists of three software packages (CORAL, COOL and POOL) addressing the data access requirements of the LHC experiments in different areas. It is the result of the collaboration between the CERN IT Department and the three experiments (ATLAS, CMS and LHCb) that use this software to access their data. POOL is a hybrid technology store for C++ objects, metadata catalogs and collections. CORAL is a relational database abstraction layer with an SQL-free API. COOL provides specific software tools and components for the handling of conditions data. This paper reports on the status and outlook of the project and reviews in detail the usage of each package in the three experiments.

  8. Practical Application of Model-based Programming and State-based Architecture to Space Missions

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Ingham, Michel D.; Chung, Seung; Martin, Oliver; Williams, Brian

    2006-01-01

    Innovative systems and software engineering solutions are required to meet the increasingly challenging demands of deep-space robotic missions. While recent advances in the development of an integrated systems and software engineering approach have begun to address some of these issues, they are still at the core highly manual and, therefore, error-prone. This paper describes a task aimed at infusing MIT's model-based executive, Titan, into JPL's Mission Data System (MDS), a unified state-based architecture, systems engineering process, and supporting software framework. Results of the task are presented, including a discussion of the benefits and challenges associated with integrating mature model-based programming techniques and technologies into a rigorously-defined domain specific architecture.

  9. 34 CFR 648.62 - How can the institutional payment be used?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., or supplies required of students in the same course of study. (2) Costs of computer hardware, project specific software, and other equipment prorated by the length of the student's fellowship over the reasonable life of the equipment. (3) Membership fees of professional associations. (4) Travel and per diem...
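
    The proration rule in paragraph (2) reduces to simple arithmetic; a worked illustration follows, with all dollar amounts and durations invented for the example.

      # Worked illustration of the proration rule in paragraph (2): equipment cost is
      # charged in proportion to the fellowship length over the equipment's useful
      # life. The figures below are invented for the example.
      equipment_cost = 1800.00      # project-specific hardware purchased for the fellow
      useful_life_months = 36       # "reasonable life of the equipment"
      fellowship_months = 12        # length of the student's fellowship

      allowable = equipment_cost * fellowship_months / useful_life_months
      print(f"allowable charge: ${allowable:.2f}")   # $600.00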

  10. The Co-Creation of Information Systems

    ERIC Educational Resources Information Center

    Gomillion, David

    2013-01-01

    In information systems development, end-users have shifted in their role: from consumers of information to informants for requirements to developers of systems. This shift in the role of users has also changed how information systems are developed. Instead of systems developers creating specifications for software or end-users creating small…

  11. The Use of Microcomputers in the Treatment of Cognitive-Communicative Impairments.

    ERIC Educational Resources Information Center

    Story, Tamara B.; Sbordone, Robert J.

    1988-01-01

    The use of microcomputer-assisted therapy as part of the total rehabilitation plan for brain-injured individuals with cognitive-communicative impairments is addressed. Design of effective computer-assisted remediation requires a careful decision-making process. Specific types of software are suggested for dealing with deficits in organization,…

  12. 10 CFR 719.44 - What categories of costs require advance approval?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... application software, or non-routine computerized databases, if they are specifically created for a particular matter. For costs associated with the creation and use of computerized databases, contractors and retained legal counsel must ensure that the creation and use of computerized databases is necessary and...

  13. Cargo Movement Operations System (CMOS). Final Software Requirements Specification, (Applications CSCI), Increment II

    DTIC Science & Technology

    1991-01-29

  14. [Multimag-M magnetotherapy system of the new generation].

    PubMed

    Borisov, A G; Grigor'ev, E M; Gurzhin, S G; Zhulev, V I; Kriakov, V G; Proshin, E M

    2007-01-01

    The Multimag-M microprocessor chronomagnetotherapy system of the new generation is described. The system provides on-line diagnosis of the pulse parameters and the breathing rate during a biotechnical feedback session. The requirements to the system software, as well as its specific features and design principles, are considered.

  15. Objective Criteria for the Selection of Software.

    ERIC Educational Resources Information Center

    Burk, Laurena

    The seven stages in the system development process are discussed in the context of implementing basic arithmetic drill and practice exercises on a computer-based system: (1) feasibility study; (2) requirements definition; (3) alternative specifications; (4) evaluation and selection of an alternative; (5) system design; (6) development and testing;…

  16. A statistical method for determining the dimensions, tolerances and specification of optics for the Laser Megajoule facility (LMJ)

    NASA Astrophysics Data System (ADS)

    Denis, Vincent

    2008-09-01

    This paper presents a statistical method for determining the dimensions, tolerances and specifications of components for the Laser MegaJoule (LMJ). Numerous constraints inherent to a large facility require specific tolerances: the huge number of optical components; the interdependence of these components between the beams of the same bundle; angular multiplexing for the amplifier section; distinct operating modes between the alignment and firing phases; and the definition and use of alignment software in place of classic optimization. This method provides greater flexibility in determining the positioning and manufacturing specifications of the optical components. Given the enormous power of the Laser MegaJoule (over 18 kJ in the infrared and 9 kJ in the ultraviolet), one of the major risks is damage to the optical mounts and pollution of the installation by mechanical ablation. This method enables estimation of the beam occultation probabilities and quantification of the risks for the facility. All the simulations were run using the ZEMAX-EE optical design software.
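
    The statistical method itself is specific to the LMJ optical chain and is not given in this record; the general idea, propagating random positioning errors through a chain of optics and estimating the probability that the beam reaches a mount, can be sketched as follows. The number of optics, tolerances, and aperture figures are invented placeholders.

      # Generic Monte Carlo tolerancing sketch (not the LMJ method itself): sample
      # random decentering errors for a chain of optics, accumulate the beam offset,
      # and estimate the probability that the beam edge reaches the mount.
      import random

      N_TRIALS = 100_000
      N_OPTICS = 10
      SIGMA_DECENTER_MM = 0.5      # assumed 1-sigma decentering per component
      BEAM_RADIUS_MM = 180.0       # assumed beam half-size
      CLEAR_APERTURE_MM = 183.0    # assumed clear half-aperture of the mounts

      def trial():
          offset = sum(random.gauss(0.0, SIGMA_DECENTER_MM) for _ in range(N_OPTICS))
          return BEAM_RADIUS_MM + abs(offset) > CLEAR_APERTURE_MM

      if __name__ == "__main__":
          clipped = sum(trial() for _ in range(N_TRIALS))
          print(f"estimated occultation probability: {clipped / N_TRIALS:.4%}")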

  17. Design ATE systems for complex assemblies

    NASA Astrophysics Data System (ADS)

    Napier, R. S.; Flammer, G. H.; Moser, S. A.

    1983-06-01

    The use of ATE systems in radio specification testing can reduce the test time by approximately 90 to 95 percent. What is more, the test station does not require a highly trained operator. Since the system controller has full power over all the measurements, human errors are not introduced into the readings. The controller is immune to any need to increase output by allowing marginal units to pass through the system. In addition, the software compensates for predictable, repeatable system errors, for example, cabling losses, which are an inherent part of the test setup. With no variation in test procedures from unit to unit, there is a constant repeatability factor. Preparing the software, however, usually entails considerable expense. It is pointed out that many of the problems associated with ATE system software can be avoided with the use of a software-intensive, or computer-intensive, system organization. Its goal is to minimize the user's need for software development, thereby saving time and money.

  18. A preliminary architecture for building communication software from traffic captures

    NASA Astrophysics Data System (ADS)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to other complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual representation of the packet data. Then we extract field data and types, along with inter- and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
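
    As a sketch of the first pipeline stage described above (capture to PDML to extracted fields), the fragment below shells out to TShark and walks the resulting XML. It assumes tshark is on the PATH, the capture filename is a placeholder, and the later stages (dependency inference, the state-machine XML, code generation) are not shown; the PDML attribute names reflect my reading of TShark's output and should be checked against a real capture.

      # Sketch: dissect a capture into PDML with TShark, then pull out per-packet
      # field names and displayed values (first stage of the pipeline only).
      import subprocess
      import xml.etree.ElementTree as ET

      def pcap_to_pdml(pcap_path):
          """Invoke tshark to dissect a capture into PDML text."""
          return subprocess.run(
              ["tshark", "-r", pcap_path, "-T", "pdml"],
              capture_output=True, text=True, check=True,
          ).stdout

      def extract_fields(pdml_text):
          """Yield (protocol, field name, displayed value) triples for each packet."""
          root = ET.fromstring(pdml_text)
          for packet in root.iter("packet"):
              for proto in packet.iter("proto"):
                  for field in proto.iter("field"):
                      yield proto.get("name"), field.get("name"), field.get("show")

      if __name__ == "__main__":
          pdml = pcap_to_pdml("sample_icmp.pcap")    # placeholder capture file
          for proto, name, value in extract_fields(pdml):
              if proto == "icmp":
                  print(name, "=", value)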

  19. Strengthening Software Authentication with the ROSE Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, G

    2006-06-15

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  20. Real-time optimizations for integrated smart network camera

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois

    2005-02-01

    We present an integrated real-time smart network camera. This system is composed of an image sensor, an embedded PC-based electronic card for image processing and some network capabilities. The application detects events of interest in visual scenes, highlights alarms and computes statistics. The system also produces meta-data information that could be shared between other cameras in a network. We describe the requirements of such a system and then show how the design of the system is optimized to process and compress video in real time. Indeed, typical video-surveillance algorithms such as background differencing, tracking and event detection must be highly optimized and simplified to be used in this hardware. To achieve a good match between hardware and software in this lightweight embedded system, the software management is written on top of the Java-based middleware specification established by the OSGi Alliance. We can easily integrate software and hardware in complex environments thanks to the Java real-time specification for the virtual machine and some network- and service-oriented Java specifications (like RMI and Jini). Finally, we report some outcomes and typical case studies of such a camera, like counter-flow detection.
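
    Among the algorithms the abstract says must be optimized for this hardware, background differencing is the simplest to show in outline. The sketch below is a generic running-average background subtractor, written in Python purely for brevity (the camera described is Java/OSGi based); the learning rate and threshold are invented.

      # Minimal running-average background differencing (illustrative only; the camera
      # described above runs optimized embedded code under OSGi).
      def update_background(background, frame, alpha=0.05):
          """Exponential running average of past frames."""
          return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
                  for brow, frow in zip(background, frame)]

      def foreground_mask(background, frame, threshold=25):
          """Pixels that differ from the background model by more than the threshold."""
          return [[abs(f - b) > threshold for b, f in zip(brow, frow)]
                  for brow, frow in zip(background, frame)]

      if __name__ == "__main__":
          bg = [[10.0] * 4 for _ in range(4)]        # toy 4x4 grey-level frames
          frame = [[10, 10, 200, 10],
                   [10, 10, 200, 10],
                   [10, 10, 10, 10],
                   [10, 10, 10, 10]]
          mask = foreground_mask(bg, frame)
          bg = update_background(bg, frame)
          print(sum(cell for row in mask for cell in row), "foreground pixels")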

  1. Autoplot and the HAPI Server

    NASA Astrophysics Data System (ADS)

    Faden, J.; Vandegriff, J. D.; Weigel, R. S.

    2016-12-01

    Autoplot was introduced in 2008 as an easy-to-use plotting tool for the space physics community. It reads data from a variety of file resources, such as CDF and HDF files, and a number of specialized data servers, such as the PDS/PPI's DIT-DOS, CDAWeb, and the University of Iowa's RPWG Das2Server. Each of these servers has optimized methods for transmitting data to display in Autoplot, but they require coordination and specialized software to work, limiting Autoplot's ability to access new servers and datasets. Likewise, groups who would like software to access their APIs must either write their own clients or publish a specification document in hopes that people will write clients. The HAPI specification was written so that a simple, standard API could be used by both Autoplot and server implementations, to remove these barriers to the free flow of time series data. Autoplot's software for communicating with HAPI servers is presented, showing the user interface scientists will use and how data servers might implement the HAPI specification to provide access to their data. This will also include instructions on how Autoplot is installed and used on desktop computers, and used to view data from the RBSP, Juno, and other missions.
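
    A flavor of what a HAPI-style request looks like from the client side is sketched below. The server URL and dataset id are placeholders, and the endpoint and query-parameter names follow my reading of the HAPI specification (they differ between specification versions), so they are assumptions rather than a definitive client.

      # Sketch of a client talking to a HAPI-style server: list the catalog, fetch
      # dataset metadata, then request a time-bounded slice of data as CSV text.
      import json
      import urllib.parse
      import urllib.request

      SERVER = "https://example.org/hapi"          # placeholder HAPI server

      def hapi_get(endpoint, **params):
          query = ("?" + urllib.parse.urlencode(params)) if params else ""
          with urllib.request.urlopen(f"{SERVER}/{endpoint}{query}") as resp:
              return resp.read().decode()

      if __name__ == "__main__":
          catalog = json.loads(hapi_get("catalog"))            # available datasets
          info = json.loads(hapi_get("info", id="DATASET_1"))  # metadata for one dataset
          csv_rows = hapi_get("data", **{                      # time-bounded data request
              "id": "DATASET_1",
              "time.min": "2016-01-01T00:00:00Z",
              "time.max": "2016-01-02T00:00:00Z",
          })
          print(len(csv_rows.splitlines()), "rows")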

  2. Formalizing structured file services for the data storage and retrieval subsystem of the data management system for Spacestation Freedom

    NASA Technical Reports Server (NTRS)

    Jamsek, Damir A.

    1993-01-01

    A brief example of the use of formal methods techniques in the specification of a software system is presented. The report is part of a larger effort targeted at defining a formal methods pilot project for NASA. One possible application domain that may be used to demonstrate the effective use of formal methods techniques within the NASA environment is presented. It is not intended to provide a tutorial on either formal methods techniques or the application being addressed. It should, however, provide an indication that the application being considered is suitable for a formal methods approach by showing how such a task may be started. The particular system being addressed is the Structured File Services (SFS), which is a part of the Data Storage and Retrieval Subsystem (DSAR), which in turn is part of the Data Management System (DMS) onboard Space Station Freedom. This is a software system that is currently under development for NASA. An informal mathematical development is presented. Section 3 contains the same development using Penelope (23), an Ada specification and verification system. The complete text of the English-version Software Requirements Specification (SRS) is reproduced in Appendix A.

  3. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    ERIC Educational Resources Information Center

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  4. Time assignment system and its performance aboard the Hitomi satellite

    NASA Astrophysics Data System (ADS)

    Terada, Yukikatsu; Yamaguchi, Sunao; Sugimoto, Shigenobu; Inoue, Taku; Nakaya, Souhei; Murakami, Maika; Yabe, Seiya; Oshimizu, Kenya; Ogawa, Mina; Dotani, Tadayasu; Ishisaki, Yoshitaka; Mizushima, Kazuyo; Kominato, Takashi; Mine, Hiroaki; Hihara, Hiroki; Iwase, Kaori; Kouzu, Tomomi; Tashiro, Makoto S.; Natsukari, Chikara; Ozaki, Masanobu; Kokubun, Motohide; Takahashi, Tadayuki; Kawakami, Satoko; Kasahara, Masaru; Kumagai, Susumu; Angelini, Lorella; Witthoeft, Michael

    2018-01-01

    Fast timing capability in x-ray observation of astrophysical objects is one of the key properties of the ASTRO-H (Hitomi) mission. Absolute timing accuracies of 350 or 35 μs are required to achieve nominal scientific goals or to study fast variabilities of specific sources. The satellite carries a GPS receiver to obtain accurate time information, which is distributed from the central onboard computer through the large and complex SpaceWire network. The details of the timing system hardware and software design are described. In the distribution of the time information, the propagation delays and jitters affect the timing accuracy. Six other items identified within the timing system will also contribute to the absolute time error. These error items have been measured and checked on the ground to ensure the time error budgets meet the mission requirements. The overall timing performance in combination with hardware performance, software algorithm, and the orbital determination accuracies, etc. under nominal conditions satisfies the mission requirement of 35 μs. This work demonstrates key points for space-use instruments in hardware and software designs and calibration measurements for fine timing accuracy on the order of microseconds for midsized satellites using the SpaceWire (IEEE1355) network.

  5. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  6. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
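
    The AutSEC papers above name two data structures, identification trees and mitigation trees, without giving their format in the abstract. A purely illustrative (invented) shape for each, and how a design element might be matched against them, is sketched below.

      # Invented, illustrative shapes for the two data structures named in the
      # abstract: an identification tree matching design elements to threats, and a
      # mitigation tree mapping each threat to candidate countermeasures with costs.
      identification_tree = {
          "data_flow": {
              "crosses_trust_boundary": {
                  "unencrypted": ["information_disclosure", "tampering"],
                  "encrypted": ["tampering"],
              },
          },
      }

      mitigation_tree = {
          "information_disclosure": [("enable TLS on the channel", 2)],
          "tampering": [("sign messages", 3), ("validate inputs at the receiver", 1)],
      }

      def threats_for(element_kind, crosses_boundary, encrypted):
          branch = identification_tree.get(element_kind, {})
          branch = branch.get("crosses_trust_boundary", {}) if crosses_boundary else {}
          return branch.get("encrypted" if encrypted else "unencrypted", [])

      if __name__ == "__main__":
          for threat in threats_for("data_flow", crosses_boundary=True, encrypted=False):
              print(threat, "->", mitigation_tree.get(threat, []))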

  7. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  8. Space Generic Open Avionics Architecture (SGOAA) standard specification

    NASA Technical Reports Server (NTRS)

    Wray, Richard B.; Stovall, John R.

    1994-01-01

    This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low level detailed implementation standards to those interfaces points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.

  9. Healthcare software assurance.

    PubMed

    Cooper, Jason G; Pauley, Keith A

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted.

  10. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted. PMID:17238324

  11. Network Security Validation Using Game Theory

    NASA Astrophysics Data System (ADS)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validate these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented along with an example that demonstrates the application of the approach.
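
    The abstract does not include the example game; as a purely illustrative stand-in for the style of analysis (not the paper's model), the sketch below enumerates pure-strategy Nash equilibria of a tiny attacker/defender game with invented payoffs.

      # Tiny attacker/defender game (payoffs invented): enumerate pure-strategy Nash
      # equilibria by checking that neither side can improve by deviating unilaterally.
      defender = ["patch", "monitor"]
      attacker = ["exploit", "probe"]

      # payoff[d][a] = (defender utility, attacker utility)
      payoff = {
          "patch":   {"exploit": (2, -2), "probe": (1, 1)},
          "monitor": {"exploit": (1, -1), "probe": (2, 0)},
      }

      def pure_nash():
          equilibria = []
          for d in defender:
              for a in attacker:
                  best_d = all(payoff[d][a][0] >= payoff[d2][a][0] for d2 in defender)
                  best_a = all(payoff[d][a][1] >= payoff[d][a2][1] for a2 in attacker)
                  if best_d and best_a:
                      equilibria.append((d, a))
          return equilibria

      if __name__ == "__main__":
          print(pure_nash())    # [('monitor', 'probe')] for these payoffs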

  12. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  13. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  14. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  15. Automated Acquisition of Proximal Femur Morphological Characteristics

    NASA Astrophysics Data System (ADS)

    Tabakovic, Slobodan; Zeljkovic, Milan; Milojevic, Zoran

    2014-10-01

    The success of hip arthroplasty surgery largely depends on how well the endoprosthesis is adjusted to the patient's femur. This implies that the position of the femoral bone in relation to the pelvis is preserved and that the endoprosthesis position ensures its longevity. The dimensions and body shape of the hip joint endoprosthesis, and its position after the surgery, depend on a number of geometrical parameters of the patient's femur. One of the most suitable methods for determining these parameters involves 3D reconstruction of the femur, based on diagnostic images, and subsequent determination of the required geometric parameters. In this paper, software for automated determination of geometric parameters of the femur is presented. A detailed software development procedure, aimed at faster and more efficient design of hip endoprostheses that meet patient-specific requirements, is also offered.

  16. A Software Safety Risk Taxonomy for Use in Retrospective Safety Cases

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. The best time to include these requirements is early in the development lifecycle of the system. When software safety requirements are levied on a legacy system after the fact, a retrospective safety case will need to be constructed for the software in the system. This can be a difficult task because there may be few to no artifacts available to show compliance to the software safety requirements. The risks associated with not meeting safety requirements in a legacy safety-critical computer system must be addressed to give confidence for reuse. This paper introduces a proposal for a software safety risk taxonomy for legacy safety-critical computer systems, by specializing the Software Engineering Institute's 'Software Development Risk Taxonomy' with safety elements and attributes.

  17. Section 508 Electronic Information Accessibility Requirements for Software Development

    NASA Technical Reports Server (NTRS)

    Ellis, Rebecca

    2014-01-01

    Section 508 Subpart B 1194.21 outlines requirements for operating system and software development in order to create a product that is accessible to users with various disabilities. This portion of Section 508 contains a variety of standards to enable those using assistive technology and with visual, hearing, cognitive and motor difficulties to access all information provided in software. The focus on requirements was limited to the Microsoft Windows® operating system as it is the predominant operating system used at this center. Compliance with this portion of the requirements can be obtained by integrating the requirements into the software development cycle early and by remediating issues in legacy software if possible. There are certain circumstances with software that may arise necessitating an exemption from these requirements, such as design or engineering software using dynamically changing graphics or numbers to convey information. These exceptions can be discussed with the Section 508 Coordinator and another method of accommodation used.

  18. Use of Soft Computing Technologies For Rocket Engine Control

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this will be presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies, and sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly the FASTRAC engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory.
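
    The abstract names fuzzy logic among the soft computing technologies considered for engine control. The sketch below is a minimal, assumed example of that idea: a three-rule fuzzy controller mapping chamber-pressure error to a valve trim command. The membership functions, rule base, and variable names are illustrative and do not represent the MC-1 control design.

```python
# Illustrative sketch only: a tiny fuzzy-logic rule base of the kind grouped here
# under soft computing technologies. Membership functions, rule weights, and the
# control variable (valve trim) are assumptions, not the MC-1 design.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_valve_trim(pressure_error):
    """Map chamber-pressure error (% of setpoint) to a valve trim command."""
    # Fuzzify the error into three linguistic terms.
    low  = tri(pressure_error, -10.0, -5.0, 0.0)
    ok   = tri(pressure_error,  -5.0,  0.0, 5.0)
    high = tri(pressure_error,   0.0,  5.0, 10.0)
    # Rules: open the valve if pressure is low, close it if high, hold if ok.
    # Defuzzify with a weighted average of singleton outputs.
    weight = low + ok + high
    if weight == 0.0:
        return 0.0
    return (low * (+1.0) + ok * 0.0 + high * (-1.0)) / weight

print(fuzzy_valve_trim(-3.0))  # small positive trim: open the valve slightly
```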

  19. Ground and Space Radar Volume Matching and Comparison Software

    NASA Technical Reports Server (NTRS)

    Morris, Kenneth; Schwaller, Mathew

    2010-01-01

    This software enables easy comparison of ground- and space-based radar observations. The software was initially designed to compare ground radar reflectivity from operational, ground-based S- and C-band meteorological radars with comparable measurements from the Tropical Rainfall Measuring Mission (TRMM) satellite's Precipitation Radar (PR) instrument. The software is also applicable to other ground-based and space-based radars. The ground and space radar volume matching and comparison software was developed in response to requirements defined by the Ground Validation System (GVS) of Goddard's Global Precipitation Mission (GPM) project. This software innovation is specifically concerned with simplifying the comparison of ground- and space-based radar measurements for the purpose of GPM algorithm and data product validation. This software is unique in that it provides an operational environment to routinely create comparison products, and uses a direct geometric approach to derive common volumes of space- and ground-based radar data. In this approach, spatially coincident volumes are defined by the intersection of individual space-based Precipitation Radar rays with each of the conical elevation sweeps of the ground radar. Thus, the resampled volume elements of the space and ground radar reflectivity can be directly compared to one another.
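
    A much-simplified sketch of the geometric matching idea is given below, assuming reflectivities are averaged in linear units within a common volume defined by the ground radar sweep that a PR ray crosses. The tolerance, bin structure, and sample values are invented for illustration; this is not the GVS implementation.

```python
# Simplified sketch of the volume-matching idea (assumed, not the GVS code):
# average the ground-radar bins belonging to the sweep a space-radar ray crosses,
# so the two reflectivities describe one common volume.
import math

def linear(dbz):
    """Convert reflectivity from dBZ to linear units for averaging."""
    return 10.0 ** (dbz / 10.0)

def to_dbz(z):
    """Convert linear reflectivity back to dBZ."""
    return 10.0 * math.log10(z)

def match_volume(pr_dbz, ground_bins, sweep_elev_deg, tol=0.5):
    """Average the ground-radar bins within `tol` degrees of the crossed sweep;
    return the matched (space, ground) reflectivity pair in dBZ."""
    selected = [linear(dbz) for elev, dbz in ground_bins
                if abs(elev - sweep_elev_deg) <= tol]
    if not selected:
        return pr_dbz, None
    return pr_dbz, to_dbz(sum(selected) / len(selected))

# Hypothetical ground-radar bins near one PR ray: (elevation_deg, reflectivity_dBZ)
ground = [(1.4, 32.0), (1.5, 35.0), (1.6, 33.5), (3.1, 20.0)]
print(match_volume(pr_dbz=34.0, ground_bins=ground, sweep_elev_deg=1.5))
```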

  20. Generic health/safety/environment cases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelland, A.N.; Primrose, M.; Pickles, J.C.

    1996-12-31

    A desire to implement HSE Management Systems including HSE Cases in all Shell companies' operations prompted the development of a relational database software package (THESIS) to provide a structured way of preparing an HSE Case. The software includes features which facilitate the management of "Keeping the Case Alive", enabling the dissemination of tasks and hazard information to the workplace. During the software development it was recognized that a significant reduction could be made in the resources required to prepare an HSE Case for each and every operation by building "Generic HSE Cases" addressing specific activities which were repeated across the Company's operations. This was recognized to be particularly valid for the smaller Single String Venture type of operations. The activities selected for the initial Generic HSE Case development include Land Drilling Operations, Land Seismic Acquisition, and Land Transport. To establish the Generic HSE Case, the THESIS database is populated with data for a generic operation, identifying all the hazards and activities associated with that operation including all the associated controls, with established formats for the textual sections. In effect, the Generic Case defines the standards required for that type of operation. To generate an operation-specific HSE Case, the Generic Case thereafter requires modification/adaptation so that it represents the actual situation in the operation which it defines. This process includes itemization of all the operation-specific details, and may involve the inclusion/deletion of any additional/existing activities or hazards together with their associated controls.

  1. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
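
    The sketch below illustrates, on synthetic data with an assumed sampling rate, two of the routines the abstract lists (adaptive-threshold peak detection and an FFT power spectrum) using NumPy/SciPy; it is not the PostProc/Dadisp code itself.

```python
# Minimal sketch (not PostProc/Dadisp) of two routines mentioned above:
# peak detection with an adaptive threshold and an FFT power spectrum.
# The sampling rate and the synthetic ECG-like signal are assumptions.
import numpy as np
from scipy.signal import find_peaks

fs = 250.0                                  # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 31     # crude ECG-like spikes at ~1.2 Hz
ecg += 0.05 * np.random.randn(t.size)       # measurement noise

# Peak detection with a data-driven (adaptive) threshold.
threshold = ecg.mean() + 2 * ecg.std()
peaks, _ = find_peaks(ecg, height=threshold, distance=int(0.4 * fs))
heart_rate = 60.0 * fs / np.mean(np.diff(peaks))   # beats per minute
print(f"detected {peaks.size} beats, ~{heart_rate:.0f} bpm")

# Power spectrum via FFT for frequency-domain analysis.
spectrum = np.abs(np.fft.rfft(ecg - ecg.mean())) ** 2
freqs = np.fft.rfftfreq(ecg.size, d=1 / fs)
print("dominant frequency: %.2f Hz" % freqs[spectrum.argmax()])
```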

  2. Autonomy Software: V&V Challenges and Characteristics

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Visser, Willem

    2006-01-01

    The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission and safety critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint-solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we will present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We will discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with their own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts from supported simulators into a cohesive model language, allowing someone to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.

  4. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time, but more importantly facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which are currently dominating life cycle costs.

  5. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
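
    The following minimal Python sketch shows two elementary checks that primer design tools of this kind perform, GC content and a Wallace-rule melting-temperature estimate. It is an illustration only and is not part of the PD5 C++ API.

```python
# Not part of the PD5 API: a tiny illustration of two basic primer checks,
# GC content and the Wallace-rule Tm estimate for short oligos.

def gc_content(seq):
    """Fraction of G and C bases in the primer."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Rough melting temperature (deg C) for short oligos: 2*(A+T) + 4*(G+C)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

primer = "ATGCGTACCTGAGGATCC"   # hypothetical primer sequence
print(f"GC = {gc_content(primer):.0%}, Tm ~ {wallace_tm(primer)} C")
```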

  6. Flight Guidance System Validation Using SPIN

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Nowakowski, John

    1998-01-01

    To verify the requirements for the mode control logic of a Flight Guidance System (FGS) we applied SPIN, a widely used software package that supports the formal verification of distributed systems. These requirements, collectively called the FGS specification, were developed at Rockwell Avionics & Communications and expressed in terms of the Consortium Requirements Engineering (CoRE) method. The properties to be verified are the invariants formulated in the FGS specification, along with the standard properties of consistency and completeness. The project had two stages. First, the FGS specification and the properties to be verified were reformulated in PROMELA, the input language of SPIN. This involved a semantics issue, as some constructs of the FGS specification do not have well-defined semantics in CoRE. Then we attempted to verify the requirements' properties using the automatic model checking facilities of SPIN. Due to the large size of the state space of the FGS specification an exhaustive state space analysis with SPIN turned out to be impossible. So we used the supertrace model checking procedure of SPIN that provides for a partial analysis of the state space. During this process, we found some subtle errors in the FGS specification.
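
    The core of the verification step described above is explicit-state exploration of the specification's reachable states. The sketch below illustrates that idea on a toy lateral-mode state machine with an invariant ("at most one lateral mode active"), using a plain breadth-first search in Python rather than SPIN/PROMELA; the modes, events, and invariant are invented and are not the Rockwell FGS specification.

```python
# Hedged illustration of what an explicit-state model checker does: exhaustive
# breadth-first exploration of a small mode-logic state machine, checking an
# invariant in every reachable state. Modes, events, and the invariant are
# invented for illustration only.
from collections import deque

EVENTS = ["press_HDG", "press_NAV", "capture_NAV"]

def step(state, event):
    """Return the next mode configuration for a given pilot/system event."""
    lateral = dict(state)
    if event == "press_HDG":
        lateral = {"HDG": "active", "NAV": "cleared"}
    elif event == "press_NAV":
        lateral = {"HDG": lateral["HDG"], "NAV": "armed"}
    elif event == "capture_NAV" and lateral["NAV"] == "armed":
        lateral = {"HDG": "cleared", "NAV": "active"}
    return tuple(sorted(lateral.items()))

def invariant(state):
    """At most one lateral mode may be active at any time."""
    return sum(1 for _, v in state if v == "active") <= 1

initial = tuple(sorted({"HDG": "active", "NAV": "cleared"}.items()))
seen, frontier = {initial}, deque([initial])
while frontier:                      # exhaustive state-space search
    current = frontier.popleft()
    assert invariant(current), f"invariant violated in {current}"
    for event in EVENTS:
        nxt = step(current, event)
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)
print(f"{len(seen)} reachable states, invariant holds in all of them")
```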

  7. 77 FR 60146 - Biweekly Notice: Applications and Amendments to Facility Operating Licenses and Combined Licenses...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-02

    ... support unlisted software, and the NRC Meta System Help Desk will not be able to offer assistance in using... supported Technical Specification (TS) systems inoperable when the associated snubber(s) cannot perform its... allowed before declaring a TS supported system inoperable and taking its Conditions and Required Actions...

  8. 76 FR 16386 - NOAA Policy on Prohibited and Approved Uses of the Asset Forfeiture Fund

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-23

    ..., including hardware, software and maintenance, required specifically for NOAA's enforcement and legal systems... applicable legal authorities. The AFF will also be used for compliance assistance activities, consistent with legal authorities, to better serve the needs of our stakeholders and improve the way NOAA engages and...

  9. Computer software to estimate timber harvesting system production, cost, and revenue

    Treesearch

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  10. The MAP program: building the digital terrain model.

    Treesearch

    R.H. Twito; R.W. Mifflin; R.J. McGaughey

    1987-01-01

    PLANS, a software package for integrated timber-harvest planning, uses digital terrain models to provide the topographic data needed to fit harvest and transportation designs to specific terrain. MAP, an integral program in the PLANS package, is used to construct the digital terrain models required by PLANS. MAP establishes digital terrain models using digitizer-traced...

  11. Atropos: specific, sensitive, and speedy trimming of sequencing reads.

    PubMed

    Didion, John P; Martin, Marcel; Collins, Francis S

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.
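
    To make the trimming operations concrete, the sketch below shows a generic 3' adapter trim by suffix overlap and a quality-cutoff trim in Python. It illustrates what read trimmers do in general; it is not the Atropos API, and the read, adapter, and cutoff values are invented.

```python
# Generic illustration of read trimming (this is NOT the Atropos API): remove a
# 3' adapter by suffix/prefix overlap, then clip the low-quality tail.

def trim_adapter(read, adapter, min_overlap=3):
    """Clip `read` where its suffix matches a prefix of `adapter`."""
    for i in range(len(read)):
        overlap = read[i:]
        if len(overlap) >= min_overlap and adapter.startswith(overlap):
            return read[:i]
        if read[i:i + len(adapter)] == adapter:   # full adapter inside the read
            return read[:i]
    return read

def trim_quality(read, quals, cutoff=20):
    """Drop trailing bases whose Phred quality is below `cutoff`."""
    end = len(read)
    while end > 0 and quals[end - 1] < cutoff:
        end -= 1
    return read[:end], quals[:end]

read  = "ACGTACGTTTAGGCAGATC"              # 3' end carries a partial adapter
quals = [38] * 13 + [12, 10] + [30] * 4    # example Phred qualities
adapter = "AGATCGGAAGAGC"                  # common Illumina adapter prefix

trimmed = trim_adapter(read, adapter)                              # -> "ACGTACGTTTAGGC"
trimmed, _ = trim_quality(trimmed, quals[:len(trimmed)], cutoff=20)
print(trimmed)
```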

  12. Protecting Against Faults in JPL Spacecraft

    NASA Technical Reports Server (NTRS)

    Morgan, Paula

    2007-01-01

    A paper discusses techniques for protecting against faults in spacecraft designed and operated by NASA's Jet Propulsion Laboratory (JPL). The paper addresses, more specifically, fault-protection requirements and techniques common to most JPL spacecraft (in contradistinction to unique, mission-specific techniques), standard practices in the implementation of these techniques, and fault-protection software architectures. Common requirements include those to protect onboard command, data-processing, and control computers; protect against loss of Earth/spacecraft radio communication; maintain safe temperatures; and recover from power overloads. The paper describes fault-protection techniques as part of a fault-management strategy that also includes functional redundancy, redundant hardware, and autonomous monitoring of (1) the operational and health statuses of spacecraft components, (2) temperatures inside and outside the spacecraft, and (3) allocation of power. The strategy also provides for preprogrammed automated responses to anomalous conditions. In addition, the software running in almost every JPL spacecraft incorporates a general-purpose "Safe Mode" response algorithm that configures the spacecraft in a lower-power state that is safe and predictable, thereby facilitating diagnosis of more complex faults by a team of human experts on Earth.
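
    The following sketch illustrates the monitoring-plus-preprogrammed-response pattern described above, including a catch-all Safe Mode configuration. The telemetry names, thresholds, and responses are assumptions for illustration and are not JPL flight software.

```python
# Hedged sketch of the fault-monitoring pattern described above: health monitors
# feed preprogrammed responses, with a catch-all Safe Mode. Thresholds, telemetry
# names, and the response table are illustrative assumptions, not flight code.

SAFE_MODE_CONFIG = {"attitude": "sun_pointed", "loads": "non_essential_off",
                    "rate": "low_power", "comm": "emergency_beacon"}

def check_health(telemetry):
    """Return a list of fault identifiers raised by onboard monitors."""
    faults = []
    if telemetry["battery_soc"] < 0.30:
        faults.append("POWER_OVERLOAD")
    if not (-20.0 <= telemetry["prop_tank_temp_c"] <= 50.0):
        faults.append("THERMAL_LIMIT")
    if telemetry["hours_since_uplink"] > 72:
        faults.append("COMM_LOSS")
    return faults

RESPONSES = {  # preprogrammed automated responses to specific anomalies
    "POWER_OVERLOAD": lambda: print("shedding non-essential loads"),
    "THERMAL_LIMIT":  lambda: print("enabling survival heaters"),
    "COMM_LOSS":      lambda: print("switching to backup antenna sequence"),
}

def fault_protection(telemetry):
    faults = check_health(telemetry)
    for fault in faults:
        RESPONSES[fault]()
    if len(faults) > 1:                      # compound fault: fall back to a
        print("entering Safe Mode:", SAFE_MODE_CONFIG)   # safe, predictable state

fault_protection({"battery_soc": 0.25, "prop_tank_temp_c": 61.0,
                  "hours_since_uplink": 5})
```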

  13. Atropos: specific, sensitive, and speedy trimming of sequencing reads

    PubMed Central

    Collins, Francis S.

    2017-01-01

    A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos. PMID:28875074

  14. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda items that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
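
    As a minimal illustration of the quantitative side (performability evaluation with SPNs), the sketch below simulates a two-place fail/repair net with exponentially timed transitions by Monte Carlo and estimates availability. The rates are assumed values and the model is far simpler than those discussed in the research.

```python
# Minimal sketch of a stochastic Petri net evaluation: a two-place net (Up/Down)
# with exponentially timed fail/repair transitions, simulated to estimate
# availability. Rates are assumed, illustrative values.
import random

FAIL_RATE, REPAIR_RATE = 1 / 1000.0, 1 / 10.0   # transitions per hour

def simulate(hours=10_000.0, seed=42):
    random.seed(seed)
    marking, t, up_time = "Up", 0.0, 0.0
    while t < hours:
        rate = FAIL_RATE if marking == "Up" else REPAIR_RATE
        dwell = min(random.expovariate(rate), hours - t)  # exponential firing delay
        if marking == "Up":
            up_time += dwell
        t += dwell
        marking = "Down" if marking == "Up" else "Up"     # fire the enabled transition
    return up_time / hours

print(f"estimated availability: {simulate():.4f}")
# Analytic check: REPAIR_RATE / (FAIL_RATE + REPAIR_RATE) is about 0.9901
```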

  15. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Group). The principal work this summer has been to review and refine the agenda items that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  16. Tandem mass spectrometry for the detection of plant pathogenic fungi and the effects of database composition on protein inferences.

    PubMed

    Padliya, Neerav D; Garrett, Wesley M; Campbell, Kimberly B; Tabb, David L; Cooper, Bret

    2007-11-01

    LC-MS/MS has demonstrated potential for detecting plant pathogens. Unlike PCR or ELISA, LC-MS/MS does not require pathogen-specific reagents for the detection of pathogen-specific proteins and peptides. However, the MS/MS approach we and others have explored does require a protein sequence reference database and database-search software to interpret tandem mass spectra. To evaluate the limitations of database composition on pathogen identification, we analyzed proteins from cultured Ustilago maydis, Phytophthora sojae, Fusarium graminearum, and Rhizoctonia solani by LC-MS/MS. When the search database did not contain sequences for a target pathogen, or contained sequences to related pathogens, target pathogen spectra were reliably matched to protein sequences from nontarget organisms, giving an illusion that proteins from nontarget organisms were identified. Our analysis demonstrates that when database-search software is used as part of the identification process, a paradox exists whereby additional sequences needed to detect a wide variety of possible organisms may lead to more cross-species protein matches and misidentification of pathogens.

  17. 76 FR 16785 - Meeting for Software Developers on the Technical Specifications for Common Formats for Patient...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-25

    ... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...

  18. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  19. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    USGS Publications Warehouse

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.
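
    The sketch below reproduces, in plain Python on made-up data, two of the calculations the abstract mentions (flow volume and seasonal statistics); it does not use the TSPROC scripting language or generate PEST input files.

```python
# Not the TSPROC scripting language: a small Python illustration of the same ideas,
# computing flow volume and seasonal mean statistics from a daily discharge series.
# Dates and flows are made-up example data.
import datetime as dt
from collections import defaultdict

# (date, discharge in m^3/s): hypothetical daily record for one year
series = [(dt.date(2011, 1, 1) + dt.timedelta(days=i), 5.0 + (i % 90) / 30.0)
          for i in range(365)]

# Flow volume over the period: sum of Q times seconds per day.
volume_m3 = sum(q * 86_400 for _, q in series)

# Seasonal (here, monthly) mean flows, a typical calibration target.
monthly = defaultdict(list)
for date, q in series:
    monthly[date.month].append(q)
monthly_means = {m: sum(v) / len(v) for m, v in sorted(monthly.items())}

print(f"annual volume: {volume_m3:.3e} m^3")
print("January mean flow:", round(monthly_means[1], 2), "m^3/s")
```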

  20. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  1. Flight dynamics software in a distributed network environment

    NASA Technical Reports Server (NTRS)

    Jeletic, J.; Weidow, D.; Boland, D.

    1995-01-01

    As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.

  2. Using Penelope to assess the correctness of NASA Ada software: A demonstration of formal methods as a counterpart to testing

    NASA Technical Reports Server (NTRS)

    Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey

    1993-01-01

    Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.

  3. Workstation Segment Specification for the World-Wide Military Command and Control (WWMCCS) Information System (WIS)

    DTIC Science & Technology

    1989-07-01

    software eventually, and which indicate the general thrust of evolution and growth capabilities required of the workstation segment. These GRAY sections...configuration control over the evolution of the system. The transition from the WWMCCS Standard ADP system to WIS will be gradual and will be...has been designed to support the workstation needs required for most WIS users, and is the workstation that will support the long-term evolution of

  4. Systems, methods and apparatus for modeling, specifying and deploying policies in autonomous and autonomic systems using agent-oriented software engineering

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specifications modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems, from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher quality development and maintenance of autonomic systems based on user formulation of policies.

  5. Requirements Engineering in Building Climate Science Software

    ERIC Educational Resources Information Center

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  6. Web-based interactive 2D/3D medical image processing and visualization software.

    PubMed

    Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid

    2010-05-01

    There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To compete with extendibility of the current local medical image processing software, each layer is highly independent of other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources in the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.

  7. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systemic and integrated process of eliciting, elaborating, negotiating, validating and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality but time- and cost-saving software products by implementing cutting-edge, state-of-the-art requirements engineering practices. Also, the study contributes to UUM by identifying the activities needed for software development so that the management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirement Description, Requirements Development (comprising: Requirements Elicitation, Requirements Analysis and Negotiation, Requirements Validation), and Requirement Management. The results from the study showed that the current practice of requirements engineering in UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.

  8. CellAnimation: an open source MATLAB framework for microscopy assays.

    PubMed

    Georgescu, Walter; Wikswo, John P; Quaranta, Vito

    2012-01-01

    Advances in microscopy technology have led to the creation of high-throughput microscopes that are capable of generating several hundred gigabytes of images in a few days. Analyzing such wealth of data manually is nearly impossible and requires an automated approach. There are at present a number of open-source and commercial software packages that allow the user to apply algorithms of different degrees of sophistication to the images and extract desired metrics. However, the types of metrics that can be extracted are severely limited by the specific image processing algorithms that the application implements, and by the expertise of the user. In most commercial software, code unavailability prevents implementation by the end user of newly developed algorithms better suited for a particular type of imaging assay. While it is possible to implement new algorithms in open-source software, rewiring an image processing application requires a high degree of expertise. To obviate these limitations, we have developed an open-source high-throughput application that allows implementation of different biological assays such as cell tracking or ancestry recording, through the use of small, relatively simple image processing modules connected into sophisticated imaging pipelines. By connecting modules, non-expert users can apply the particular combination of well-established and novel algorithms developed by us and others that are best suited for each individual assay type. In addition, our data exploration and visualization modules make it easy to discover or select specific cell phenotypes from a heterogeneous population. CellAnimation is distributed under the Creative Commons Attribution-NonCommercial 3.0 Unported license (http://creativecommons.org/licenses/by-nc/3.0/). CellAnimation source code and documentation may be downloaded from www.vanderbilt.edu/viibre/software/documents/CellAnimation.zip. Sample data are available at www.vanderbilt.edu/viibre/software/documents/movies.zip. walter.georgescu@vanderbilt.edu. Supplementary data available at Bioinformatics online.

  9. Imaging Sensor Flight and Test Equipment Software

    NASA Technical Reports Server (NTRS)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes at user-selected locations.
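
    A minimal, hypothetical example of the kind of fixed-layout record packing and unpacking the STE software must perform is sketched below using Python's struct module; the field layout shown is invented and is not one of the actual LIS data formats.

```python
# Illustrative only (the real LIS formats are instrument-specific): packing and
# unpacking a small fixed-layout telemetry record with the struct module, the
# kind of format translation test-equipment software performs for an operator.
import struct

# Hypothetical record: 32-bit seconds, 16-bit x/y pixel, 16-bit event amplitude.
RECORD = struct.Struct(">IHHH")          # big-endian, as downlinked data often is

def pack_event(seconds, x, y, amplitude):
    """Build one binary event frame from its fields."""
    return RECORD.pack(seconds, x, y, amplitude)

def unpack_event(frame):
    """Translate one binary event frame back into named fields."""
    seconds, x, y, amplitude = RECORD.unpack(frame)
    return {"t": seconds, "pixel": (x, y), "amplitude": amplitude}

frame = pack_event(seconds=874_321, x=64, y=112, amplitude=3021)
print(len(frame), "bytes:", unpack_event(frame))
```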

  10. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  11. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course away from a hypothetical-project, homework-and-exam format based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the necessary student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project courses at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of pass/fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  12. Qualitative Analysis for Maintenance Process Assessment

    NASA Technical Reports Server (NTRS)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  13. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  14. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  15. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  16. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  17. UTM TCL2 Software Requirements

    NASA Technical Reports Server (NTRS)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

    The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond visual line-of-sight, tracking operations, and operations flying over sparsely populated areas.

  18. Assessing students' performance in software requirements engineering education using scoring rubrics

    NASA Astrophysics Data System (ADS)

    Mkpojiogu, Emmanuel O. C.; Hussain, Azham

    2017-10-01

    The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students, and whether its use can lead to improvement in students' performance in the development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that the use of scoring rubrics is very helpful in objectively assessing the performance of software requirements or software engineering students. Furthermore, the results revealed that the use of scoring rubrics can also give a clear direction of achievement, showing whether a student is improving or not across repeated or iterative assessments. In a nutshell, its use leads to the performance improvement of students. The results provided some insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.

  19. Electronic nose for space program applications

    NASA Technical Reports Server (NTRS)

    Young, Rebecca C.; Buttner, William J.; Linnell, Bruce R.; Ramesham, Rajeshuni

    2003-01-01

    The ability to monitor air contaminants in the shuttle and the International Space Station is important to ensure the health and safety of astronauts, and equipment integrity. Three specific space applications have been identified that would benefit from a chemical monitor: (a) organic contaminants in space cabin air; (b) hypergolic propellant contaminants in the shuttle airlock; (c) pre-combustion signature vapors from electrical fires. NASA at Kennedy Space Center (KSC) is assessing several commercial and developing electronic noses (E-noses) for these applications. A short series of tests identified those E-noses that exhibited sufficient sensitivity to the vapors of interest. Only two E-noses exhibited sufficient sensitivity for hypergolic fuels at the required levels, while several commercial E-noses showed sufficient sensitivity to common organic vapors. These E-noses were subjected to further tests to assess their ability to identify vapors. Development and testing of E-nose models using vendor-supplied software packages correctly identified vapors with an accuracy of 70-90%. In-house software improvements increased the identification rates to between 90 and 100%. Further software enhancements are under development. Details on the experimental setup, test protocols, and results on E-nose performance are presented in this paper, along with special emphasis on specific software enhancements. ©2003 Elsevier Science B.V. All rights reserved.
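
    The vapor-identification step can be illustrated with a small supervised classifier over sensor-array responses, as sketched below; the sensor values, vapor classes, and choice of a k-nearest-neighbor model are assumptions for illustration, not the vendor or in-house KSC software.

```python
# Hedged sketch of the vapor-identification step: a simple classifier trained on
# sensor-array responses. The sensor values, vapor labels, and classifier choice
# are illustrative assumptions.
from sklearn.neighbors import KNeighborsClassifier

# Each row: normalized responses of a 4-sensor array to one vapor exposure.
X_train = [[0.9, 0.1, 0.2, 0.0], [0.8, 0.2, 0.1, 0.1],   # hydrazine-like
           [0.1, 0.9, 0.7, 0.2], [0.2, 0.8, 0.8, 0.1],   # isopropanol-like
           [0.0, 0.1, 0.2, 0.9], [0.1, 0.0, 0.3, 0.8]]   # pre-combustion signature
y_train = ["hydrazine", "hydrazine", "isopropanol", "isopropanol", "burn", "burn"]

model = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(model.predict([[0.15, 0.85, 0.75, 0.15]]))   # expected: ['isopropanol']
```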

  20. Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2006-01-01

    Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software-related accidents in which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.

  1. Java Mission Evaluation Workstation System

    NASA Technical Reports Server (NTRS)

    Pettinger, Ross; Watlington, Tim; Ryley, Richard; Harbour, Jeff

    2006-01-01

    The Java Mission Evaluation Workstation System (JMEWS) is a collection of applications designed to retrieve, display, and analyze both real-time and recorded telemetry data. This software is currently being used by both the Space Shuttle Program (SSP) and the International Space Station (ISS) program. JMEWS was written in the Java programming language to satisfy the requirement of platform independence. An object-oriented design was used to satisfy additional requirements and to make the software easily extendable. By virtue of its platform independence, JMEWS can be used on the UNIX workstations in the Mission Control Center (MCC) and on office computers. JMEWS includes an interactive editor that allows users to easily develop displays that meet their specific needs. The displays can be developed and modified while viewing data. By simply selecting a data source, the user can view real-time, recorded, or test data.

  2. LHCb migration from Subversion to Git

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Couturier, B.; Closier, J.; Cattaneo, M.

    2017-10-01

    Due to user demand and to support new development workflows based on code review and multiple development streams, LHCb decided to port its source code management from Subversion to Git, using the CERN GitLab hosting service. Although tools exist for this kind of migration, LHCb specificities and development models required careful planning of the migration, development of migration tools, changes to the development model, and redefinition of the release procedures. Moreover, we had to support a hybrid situation with some software projects hosted in Git and others still in Subversion, or even branches of one project hosted in different systems. We present the way we addressed the special LHCb requirements, the technical details of migrating large non-standard Subversion repositories, and how we managed to smoothly migrate the software projects following the schedule of each project manager.

  3. Visualization of unsteady computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Haimes, Robert

    1994-11-01

    A brief summary of the computer environment used for calculating three-dimensional unsteady Computational Fluid Dynamics (CFD) results is presented. This environment requires a supercomputer; massively parallel processors (MPPs) and clusters of workstations acting as a single MPP (by concurrently working on the same task) provide the required computational bandwidth for CFD calculations of transient problems. The cluster of reduced instruction set computer (RISC) workstations is a recent development made possible by the low cost and high performance that workstation vendors provide. The cluster, with the proper software, can act as a multiple instruction/multiple data (MIMD) machine. A new set of software tools is being designed specifically to address the visualization of 3D unsteady CFD results in these environments. Three user's manuals for the parallel version of Visual3, pV3, revision 1.00 make up the bulk of this report.

  4. Continuous Risk Management: A NASA Program Initiative

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.; Rosenberg, Linda

    1999-01-01

    NPG 7120.5A, "NASA Program and Project Management Processes and Requirements," enacted in April 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems-level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation briefly discusses the six functions of risk management: (1) identify the risks in a specific format; (2) analyze the risk probability, impact/severity, and timeframe; (3) plan the approach; (4) track the risk through data compilation and analysis; (5) control and monitor the risk; (6) communicate and document the process and decisions.
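
    As a sketch of how these six functions might map onto a simple risk-tracking record (purely illustrative, not part of the SATC course material), consider:

        # Sketch of a risk record covering the six CRM functions (illustrative only).
        from dataclasses import dataclass, field

        @dataclass
        class Risk:
            statement: str                                  # 1. Identify: risk stated in a specific format
            probability: float                              # 2. Analyze: likelihood (0-1)
            impact: int                                     # 2. Analyze: severity (e.g., 1-5)
            timeframe: str                                  # 2. Analyze: near/mid/far term
            mitigation_plan: str = ""                       # 3. Plan the approach
            metrics: list = field(default_factory=list)     # 4. Track via data compilation and analysis
            status: str = "open"                            # 5. Control and monitor
            decisions: list = field(default_factory=list)   # 6. Communicate and document

            def exposure(self) -> float:
                """Simple score (probability x impact) used to rank risks."""
                return self.probability * self.impact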

  5. Modeling and Composing Scenario-Based Requirements with Aspects

    NASA Technical Reports Server (NTRS)

    Araujo, Joao; Whittle, Jon; Ki, Dae-Kyoo

    2004-01-01

    There has been significant recent interest, within the Aspect-Oriented Software Development (AOSD) community, in representing crosscutting concerns at various stages of the software lifecycle. However, most of these efforts have concentrated on the design and implementation phases. We focus in this paper on representing aspects during use case modeling. In particular, we focus on scenario-based requirements and show how to compose aspectual and non-aspectual scenarios so that they can be simulated as a whole. Non-aspectual scenarios are modeled as UML sequence diagrams. Aspectual scenarios are modeled as Interaction Pattern Specifications (IPS). In order to simulate them, the scenarios are transformed into a set of executable state machines using an existing state machine synthesis algorithm. Previous work composed aspectual and non-aspectual scenarios at the sequence diagram level. In this paper, the composition is done at the state machine level.

  6. Visualization of unsteady computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1994-01-01

    A brief summary of the computer environment used for calculating three-dimensional unsteady Computational Fluid Dynamics (CFD) results is presented. This environment requires a supercomputer; massively parallel processors (MPPs) and clusters of workstations acting as a single MPP (by concurrently working on the same task) provide the required computational bandwidth for CFD calculations of transient problems. The cluster of reduced instruction set computer (RISC) workstations is a recent development made possible by the low cost and high performance that workstation vendors provide. The cluster, with the proper software, can act as a multiple instruction/multiple data (MIMD) machine. A new set of software tools is being designed specifically to address the visualization of 3D unsteady CFD results in these environments. Three user's manuals for the parallel version of Visual3, pV3, revision 1.00 make up the bulk of this report.

  7. Do you also have problems with the file format syndrome?

    PubMed

    De Cuyper, B; Nyssen, E; Christophe, Y; Cornelis, J

    1991-11-01

    In a biomedical data processing environment, an essential requirement is the ability to integrate a large class of standard modules for the acquisition, processing and display of the (image) data. Our approach to the management and manipulation of the different data formats is based on the specification of a common standard for the representation of data formats, called 'data nature descriptions' to emphasise that this representation not only specifies the structure but also the contents of data objects (files). The idea behind this concept is to associate each hardware and software component that produces or uses medical data with a description of the data objects manipulated by that component. In our approach a special software module (a format convertor generator) takes care of the appropriate data format conversions, required when two or more components of the system exchange data.
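
    A minimal sketch of the idea, assuming each component registers the 'data nature description' of the objects it produces or consumes and a generator returns the appropriate conversion; the names and formats below are hypothetical, not the authors' implementation:

        # Illustrative registry of conversions between 'data nature descriptions'
        # (all format names and the sample converter are hypothetical).
        CONVERTERS = {}   # (source description, target description) -> conversion function

        def register(src: str, dst: str):
            def wrap(fn):
                CONVERTERS[(src, dst)] = fn
                return fn
            return wrap

        @register("raw_image_v1", "dicom_like")
        def raw_to_dicom_like(obj):
            # Reshape one hypothetical in-memory representation into another.
            return {"pixels": obj["data"], "meta": obj.get("header", {})}

        def generate_converter(src_desc: str, dst_desc: str):
            """Return a converter for the two data nature descriptions, if one is known."""
            try:
                return CONVERTERS[(src_desc, dst_desc)]
            except KeyError:
                raise ValueError(f"no conversion from {src_desc} to {dst_desc}")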

  8. Simulating the behavior of volatiles belonging to the C-O-H-S system in silicate melts under magmatic conditions with the software D-Compress

    NASA Astrophysics Data System (ADS)

    Burgisser, Alain; Alletti, Marina; Scaillet, Bruno

    2015-06-01

    Modeling magmatic degassing, or how the volatile distribution between gas and melt changes as pressure varies, is a complex task that involves a large number of thermodynamic relationships and requires dedicated software. This article presents the software D-Compress, which computes the gas and melt volatile composition of five element sets in magmatic systems (O-H, S-O-H, C-S-O-H, C-S-O-H-Fe, and C-O-H). It has been calibrated to simulate the volatiles coexisting with three common types of silicate melts (basalt, phonolite, and rhyolite). Operational temperatures depend on melt composition and range from 790 to 1400 °C. A specificity of D-Compress is the calculation of volatile composition as pressure varies along a (de)compression path between atmospheric pressure and 3000 bars. The software was designed to maximize versatility by offering different sets of input parameters. In particular, whenever new solubility laws for specific melt compositions become available, the model parameters can easily be tuned to run the code on those compositions. Parameter gaps were minimized by including sets of chemical species for which calibration data were available over a wide range of pressure, temperature, and melt composition. A brief description of the model rationale is followed by a presentation of the software capabilities. Examples of use are then presented, with output comparisons between D-Compress and other currently available thermodynamic models. The compiled software and the source code are available as electronic supplementary materials.
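
    For orientation, the sketch below shows the general shape of such a decompression path: solubility is re-evaluated at each pressure step and the excess volatile is assigned to the gas phase. The power-law solubility used here is a placeholder, not the thermodynamic model implemented in D-Compress:

        # Toy decompression path: placeholder solubility law, not the D-Compress model.
        def dissolved_h2o_wt_percent(p_bar: float) -> float:
            """Placeholder Henry-like law for water solubility in a silicate melt."""
            return 0.1 * p_bar ** 0.5           # ~5.5 wt% at 3000 bar, 0 at the surface

        total_h2o = 4.0                          # assumed total water in the system, wt%
        for p in range(3000, 0, -250):           # decompression from 3000 bar toward the surface
            dissolved = min(total_h2o, dissolved_h2o_wt_percent(p))
            exsolved = total_h2o - dissolved     # excess water reports to the gas phase
            print(f"{p:5d} bar: {dissolved:4.2f} wt% in melt, {exsolved:4.2f} wt% in gas")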

  9. Standard requirements for GCP-compliant data management in multinational clinical trials.

    PubMed

    Ohmann, Christian; Kuchinke, Wolfgang; Canham, Steve; Lauritsen, Jens; Salas, Nader; Schade-Brittinger, Carmen; Wittenberg, Michael; McPherson, Gladys; McCourt, John; Gueyffier, Francois; Lorimer, Andrea; Torres, Ferràn

    2011-03-22

    A recent survey has shown that data management in clinical trials performed by academic trial units still faces many difficulties (e.g. heterogeneity of software products, deficits in quality management, limited human and financial resources and the complexity of running a local computer centre). Unfortunately, no specific, practical and open standard for both GCP-compliant data management and the underlying IT infrastructure is available to improve the situation. For that reason the "Working Group on Data Centres" of the European Clinical Research Infrastructures Network (ECRIN) has developed a standard specifying the requirements for high-quality GCP-compliant data management in multinational clinical trials. International, European and national regulations and guidelines relevant to GCP, data security and IT infrastructures, as well as ECRIN documents produced previously, were evaluated to provide a starting point for the development of standard requirements. The requirements were produced by expert consensus of the ECRIN Working Group on Data Centres, using a structured and standardised process. The requirements were divided into two main parts: an IT part covering standards for the underlying IT infrastructure and computer systems in general, and a Data Management (DM) part covering requirements for data management applications in clinical trials. The standard developed includes 115 IT requirements, split into 15 separate sections, 107 DM requirements (in 12 sections) and 13 other requirements (2 sections). Sections IT01 to IT05 deal with the basic IT infrastructure, while IT06 and IT07 cover validation and local software development. IT08 to IT15 concern the aspects of IT systems that directly support clinical trial management. Sections DM01 to DM03 cover the implementation of a specific clinical data management application, i.e. for a specific trial, whilst DM04 to DM12 address the data management of trials across the unit. Section IN01 is dedicated to international aspects and ST01 to the competence of a trials unit's staff. The standard is intended to provide an open and widely used set of requirements for GCP-compliant data management, particularly in academic trial units. It is the intention that ECRIN will use these requirements as the basis for the certification of ECRIN data centres.

  10. The Use of UML for Software Requirements Expression and Management

    NASA Technical Reports Server (NTRS)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straight-forward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. We describe this approach in the context of a space-based software application currently under development at the Jet Propulsion Laboratory.

  11. Power API Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-12-04

    The software serves two purposes. The first purpose of the software is to prototype the Sandia High Performance Computing Power Application Programming Interface Specification effort. The specification can be found at http://powerapi.sandia.gov . Prototypes of the specification were developed in parallel with the development of the specification. Release of the prototype will be instructive to anyone who intends to implement the specification; more specifically, our vendor collaborators will benefit from the availability of the prototype. The second purpose is direct support of the PowerInsight power measurement device, which was co-developed with Penguin Computing. The software provides a cluster-wide measurement capability enabled by the PowerInsight device. The software can be used by anyone who purchases a PowerInsight device. It allows the user to easily collect power and energy information for a node that is instrumented with PowerInsight. The software can also be used as an example prototype implementation of the High Performance Computing Power Application Programming Interface Specification.
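
    Purely as an illustration of the kind of node-level measurement loop such a device enables (the class, file, and method names below are invented and are not part of the Power API specification or the PowerInsight software):

        # Hypothetical per-node power sampling loop (all names invented for illustration).
        import time

        class FakePowerMeter:
            """Stand-in for a PowerInsight-like sensor; returns instantaneous watts."""
            def read_watts(self) -> float:
                return 215.0                     # fixed value, for the sketch only

        def measure_energy(meter, duration_s: float = 10.0, interval_s: float = 0.5) -> float:
            """Integrate sampled power into joules over a measurement window."""
            joules = 0.0
            t_end = time.time() + duration_s
            while time.time() < t_end:
                joules += meter.read_watts() * interval_s
                time.sleep(interval_s)
            return joules

        print(f"node energy over 10 s: {measure_energy(FakePowerMeter()):.0f} J")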

  12. Software Engineering Improvement Plan

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  13. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  14. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. (1) Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative...

  15. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  16. 48 CFR 208.7402 - General.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208.7402 General. Departments and agencies shall fulfill requirements for commercial software and related services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI...

  17. Traceability of Software Safety Requirements in Legacy Safety Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    How can traceability of software safety requirements be created for legacy safety-critical systems? Requirements in safety standards are most often imposed during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety-critical systems, some of which may be considered for reuse in new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses be performed to support implementation of the technical safety requirements. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?

  18. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    NASA Technical Reports Server (NTRS)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  19. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  20. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  1. Sign: large-scale gene network estimation environment for high performance computing.

    PubMed

    Tamada, Yoshinori; Shimamura, Teppei; Yamaguchi, Rui; Imoto, Seiya; Nagasaki, Masao; Miyano, Satoru

    2011-01-01

    Our research group is currently developing software for estimating large-scale gene networks from gene expression data. The software, called SiGN, is specifically designed for the Japanese flagship supercomputer "K computer" which is planned to achieve 10 petaflops in 2012, and other high performance computing environments including Human Genome Center (HGC) supercomputer system. SiGN is a collection of gene network estimation software with three different sub-programs: SiGN-BN, SiGN-SSM and SiGN-L1. In these three programs, five different models are available: static and dynamic nonparametric Bayesian networks, state space models, graphical Gaussian models, and vector autoregressive models. All these models require a huge amount of computational resources for estimating large-scale gene networks and therefore are designed to be able to exploit the speed of 10 petaflops. The software will be available freely for "K computer" and HGC supercomputer system users. The estimated networks can be viewed and analyzed by Cell Illustrator Online and SBiP (Systems Biology integrative Pipeline). The software project web site is available at http://sign.hgc.jp/ .

  2. General Aviation Data Framework

    NASA Technical Reports Server (NTRS)

    Blount, Elaine M.; Chung, Victoria I.

    2006-01-01

    The Flight Research Services Directorate at the NASA Langley Research Center (LaRC) provides development and operations services associated with three general aviation (GA) aircraft used for research experiments. The GA aircraft include a Cessna 206X Stationair, a Lancair Colombia 300X, and a Cirrus SR22X. Since 2004, the GA Data Framework software has been designed and implemented to gather data from a varying set of hardware and software sources and to enable transfer of the data to other computers or devices. The key requirements for the GA Data Framework software include platform independence, the ability to reuse the framework for different projects without changing the framework code, graphics display capabilities, and the ability to vary the interfaces and their performance. Data received from the various devices is stored in shared memory. This paper concentrates on the object-oriented software design patterns within the General Aviation Data Framework and how they enable the construction of project-specific software without changing the base classes. The issues of platform independence and the multi-threading that enables interfaces to run at different frame rates are also discussed.
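
    To make the multi-rate interface pattern concrete, the sketch below runs two hypothetical device readers on their own threads at different frame rates, each writing into a lock-protected shared store; it illustrates the pattern only and is not the GA Data Framework code:

        # Illustrative multi-rate data gathering into a shared store (hypothetical devices).
        import threading
        import time

        shared_data = {}
        lock = threading.Lock()

        def poll(name, read_fn, hz, stop):
            """Read a device at its own frame rate and publish into the shared store."""
            period = 1.0 / hz
            while not stop.is_set():
                value = read_fn()
                with lock:
                    shared_data[name] = value
                time.sleep(period)

        stop = threading.Event()
        threading.Thread(target=poll, args=("gps", lambda: (37.1, -76.3), 1.0, stop), daemon=True).start()
        threading.Thread(target=poll, args=("ahrs", lambda: {"roll": 0.2}, 20.0, stop), daemon=True).start()
        time.sleep(2.0)                          # let both interfaces run briefly
        stop.set()
        with lock:
            print(shared_data)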

  3. Distributed intelligent control and management (DICAM) applications and support for semi-automated development

    NASA Technical Reports Server (NTRS)

    Hayes-Roth, Frederick; Erman, Lee D.; Terry, Allan; Hayes-Roth, Barbara

    1992-01-01

    We have recently begun a 4-year effort to develop a new technology foundation and associated methodology for the rapid development of high-performance intelligent controllers. Our objective in this work is to enable system developers to create effective real-time systems for control of multiple, coordinated entities in much less time than is currently required. Our technical strategy for achieving this objective is like that in other domain-specific software efforts: analyze the domain and task underlying effective performance, construct parametric or model-based generic components and overall solutions to the task, and provide excellent means for specifying, selecting, tailoring or automatically generating the solution elements particularly appropriate for the problem at hand. In this paper, we first present our specific domain focus, briefly describe the methodology and environment we are developing to provide a more regular approach to software development, and then later describe the issues this raises for the research community and this specific workshop.

  4. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  5. Using RDF and Git to Realize a Collaborative Metadata Repository.

    PubMed

    Stöhr, Mark R; Majeed, Raphael W; Günther, Andreas

    2018-01-01

    The German Center for Lung Research (DZL) is a research network with the aim of researching respiratory diseases. The participating study sites' register data differs in terms of software and coding system as well as data field coverage. To perform meaningful consortium-wide queries through one single interface, a uniform conceptual structure is required covering the DZL common data elements. No single existing terminology includes all our concepts. Potential candidates such as LOINC and SNOMED only cover specific subject areas or are not granular enough for our needs. To achieve a broadly accepted and complete ontology, we developed a platform for collaborative metadata management. The DZL data management group formulated detailed requirements regarding the metadata repository and the user interfaces for metadata editing. Our solution builds upon existing standard technologies allowing us to meet those requirements. Its key parts are RDF and the distributed version control system Git. We developed a software system to publish updated metadata automatically and immediately after performing validation tests for completeness and consistency.
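
    A minimal sketch of the publish-after-validation step, assuming the metadata lives in a Turtle file inside a Git working copy and that the completeness rule simply requires every element to carry a label (rdflib and the git command line stand in for the DZL tooling, which is not shown here):

        # Illustrative validate-then-publish step for an RDF metadata repository.
        import subprocess
        from rdflib import Graph, RDFS

        g = Graph()
        g.parse("metadata.ttl", format="turtle")     # hypothetical metadata file in the working copy

        # Example completeness rule: every described element must carry an rdfs:label.
        unlabeled = [s for s in set(g.subjects()) if (s, RDFS.label, None) not in g]
        if unlabeled:
            raise SystemExit(f"validation failed: {len(unlabeled)} elements lack labels")

        subprocess.run(["git", "add", "metadata.ttl"], check=True)
        subprocess.run(["git", "commit", "-m", "Publish validated metadata update"], check=True)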

  6. Effectiveness of an automatic tracking software in underwater motion analysis.

    PubMed

    Magalhaes, Fabrício A; Sawacha, Zimi; Di Michele, Rocco; Cortesi, Matteo; Gatta, Giorgio; Fantozzi, Silvia

    2013-01-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software package developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers' positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis. Key Points: (1) the availability of effective software for automatic tracking would represent a significant advance for the practical use of kinematic analysis in swimming and other aquatic sports; (2) an important feature of automatic tracking software is that it requires limited human intervention and supervision, thus allowing short processing times; (3) when tracking underwater movements, the degree of automation of the tracking procedure is influenced by the capability of the algorithm to overcome difficulties linked to the small target size, the low image quality and the presence of background clutter; (4) the newly developed feature-tracking algorithm showed good automatic tracking effectiveness in underwater motion analysis, with a significantly smaller percentage of required manual interventions compared with commercial software.
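
    The core of a KLT-style marker tracker with a correction threshold can be sketched with OpenCV's pyramidal Lucas-Kanade tracker; the 4-pixel rule echoes the criterion above, but the file name and correction stub are hypothetical and this is not the DVP software:

        # Minimal KLT-style marker tracking loop (illustrative; not the DVP software).
        import cv2
        import numpy as np

        THRESHOLD_PX = 4.0     # the study corrected the cursor when >4 px from the reference;
                               # here a large frame-to-frame jump stands in for that check

        def manual_correction(point):
            """Stand-in for the operator repositioning the cursor on the marker."""
            return point       # in a real tool this would come from a mouse click

        cap = cv2.VideoCapture("underwater.avi")                 # hypothetical input video
        ok, prev = cap.read()
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        pts = np.array([[[320.0, 240.0]]], dtype=np.float32)     # marker seeded by the operator

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            if status[0][0] == 0 or np.linalg.norm(new_pts - pts) > THRESHOLD_PX:
                new_pts = manual_correction(new_pts)              # operator intervention
            prev_gray, pts = gray, new_pts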

  7. Expert Recommender: Designing for a Network Organization

    NASA Astrophysics Data System (ADS)

    Reichling, Tim; Veith, Michael; Wulf, Volker

    Recent knowledge management initiatives focus on expertise sharing within formal organizational units and informal communities of practice. Expert recommender systems seem to be a promising tool in support of these initiatives. This paper presents experiences in designing an expert recommender system for a knowledge- intensive organization, namely the National Industry Association (NIA). Field study results provide a set of specific design requirements. Based on these requirements, we have designed an expert recommender system which is integrated into the specific software infrastructure of the organizational setting. The organizational setting is, as we will show, specific for historical, political, and economic reasons. These particularities influence the employees’ organizational and (inter-)personal needs within this setting. The paper connects empirical findings of a long-term case study with design experiences of an expertise recommender system.

  8. Advanced information processing system: Input/output system services

    NASA Technical Reports Server (NTRS)

    Masotto, Tom; Alger, Linda

    1989-01-01

    The functional requirements and detailed specifications for the Input/Output (I/O) Systems Services of the Advanced Information Processing System (AIPS) are discussed. The introductory section is provided to outline the overall architecture and functional requirements of the AIPS system. Section 1.1 gives a brief overview of the AIPS architecture as well as a detailed description of the AIPS fault tolerant network architecture, while section 1.2 provides an introduction to the AIPS systems software. Sections 2 and 3 describe the functional requirements and design and detailed specifications of the I/O User Interface and Communications Management modules of the I/O System Services, respectively. Section 4 illustrates the use of the I/O System Services, while Section 5 concludes with a summary of results and suggestions for future work in this area.

  9. Automated personnel data base system specifications, Task V. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartley, H.J.; Bocast, A.K.; Deppner, F.O.

    1978-09-01

    This document is the General Research Corporation report on Task V of a study for the Office of Inspection and Enforcement of the Nuclear Regulatory Commission (NRC/IE). The full title of this study is ''Development of Qualification Requirements, Training Programs, Career Plans, and Methodologies for Effective Management and Training of Inspection and Enforcement Personnel.'' Task V required the development of an automated personnel data base system for NRC/IE. This system is identified as the NRC/IE Personnel, Assignment, Qualifications, and Training System (PAQTS). This Task V report provides the documentation for PAQTS, including the Functional Requirements Document (FRD), the Data Requirements Document (DRD), the Hardware and Software Capabilities Assessment, and the Detailed Implementation Schedule. Specific recommendations to facilitate implementation of PAQTS are also included.

  10. HAMS (Hypoxia, Monitoring, and Mitigation System) II Quarterly Progress Report (Technical and Financial)

    DTIC Science & Technology

    2015-04-10

    [Scanned-report fragments: an abbreviation list (...peripheral interface; SpO2, arterial oxygen saturation measured via pulse oximeter; SRS, software requirements specification; SV, stroke volume; SVR, systemic...), a figure-list entry ("Figure 3: Pulse OX custom module"), and abstract text noting that sensors which detect SpO2, pulse/pulse rate, ECG, and skin temperature will be researched and evaluated.]

  11. Software Independent Verification and Validation (SIV&V) Simplified

    DTIC Science & Technology

    2006-12-01

    [Scanned-report fragments: an acronym list (configuration item; I/O, input/output; I2V2, independent integrated verification and validation; IBM, International Business Machines; ICD, interface...; IPT, integrated product team; IRS, interface requirements specification; ISD, integrated system diagram; ITD, integrated test description; ITP, ...) and a passage referring to programming languages such as COBOL (Common Business Oriented Language) and FORTRAN (FORmula TRANslator).]

  12. Communications Processors: Categories, Applications, and Trends

    DTIC Science & Technology

    1976-03-01

    [Scanned-report fragments: notes that some devices allow switching from BSC to SDLC, that standard protocols would ease the requirement that communications processor software convert from one..., and that guidelines in selecting a device for a specific application are included, with manufacturer models presented as illustrations.]

  13. Cargo Movement Operations System (CMOS). Software Requirements Specification

    DTIC Science & Technology

    1990-03-12

  14. Microcomputer software for calculating the western Oregon elk habitat effectiveness index.

    Treesearch

    Alan Ager; Mark Hitchcock

    1992-01-01

    This paper describes the operation of the microcomputer program HEIWEST, which was developed to automate calculation of the western Oregon elk habitat effectiveness index (HEI). HEIWEST requires little or no training to operate and vastly simplifies the task of measuring HEI for either site-specific project analysis or long-term monitoring of elk habitat. It is...

  15. Software Prototyping: A Case Report of Refining User Requirements for a Health Information Exchange Dashboard.

    PubMed

    Nelson, Scott D; Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R

    2016-01-01

    Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. The objective was to describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper, with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users, who were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included (1) having subject-matter experts heavily involved with the design; (2) flexibility to make rapid changes; (3) the ability to minimize software development efforts early in the design stage; (4) rapid finalization of requirements; (5) early visualization of designs; and (6) a powerful vehicle for communication of the design to the programmers. Challenges included (1) the time and effort needed to develop the prototypes and case scenarios; (2) no simulation of system performance; (3) not having all proposed functionality available in the final product; and (4) missing needed data elements in the PCC information system.
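
    For reference, the SUS score mentioned above is computed from ten 1-5 Likert items using the standard scoring rule (independent of this study's data):

        # Standard SUS scoring: odd items contribute (response - 1), even items (5 - response);
        # the summed contributions are multiplied by 2.5 to give a 0-100 score.
        def sus_score(responses):
            assert len(responses) == 10
            total = sum((r - 1) if i % 2 == 1 else (5 - r)
                        for i, r in enumerate(responses, start=1))
            return total * 2.5

        print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))          # hypothetical answers -> 85.0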

  16. Using Software Technology to Specify Abstract Interfaces in VLSI Design.

    DTIC Science & Technology

    1985-01-01

    [Scanned-report fragments: the specification should provide the information required "and nothing more"; a chief source of specification failure was ... this complexity control objective by mixing two potentially separable kinds of information, i.e. functional/electrical and scheduling semantics. The remaining scanned text (a pin listing) is not recoverable.]

  17. Representation and presentation of requirements knowledge

    NASA Technical Reports Server (NTRS)

    Johnson, W. L.; Feather, Martin S.; Harris, David R.

    1992-01-01

    An approach to the representation and presentation of knowledge used in ARIES, an experimental requirements/specification environment, is described. The approach applies the notion of a representation architecture to the domain of software engineering and incorporates a strong coupling to a transformation system. It is characterized by a single, highly expressive underlying representation, interfaced simultaneously to multiple presentations, each with notations of differing degrees of expressivity. This enables analysts to use multiple languages for describing systems and have these descriptions yield a single consistent model of the system.

  18. IPG Job Manager v2.0 Design Documentation

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    This viewgraph presentation provides a high-level design of the IPG Job Manager, and satisfies its Master Requirement Specification v2.0 Revision 1.0, 01/29/2003. The presentation includes a Software Architecture/Functional Overview with the following: Job Model; Job Manager Client/Server Architecture; Job Manager Client (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Job Manager Server (Job Manager Client Class Diagram and Job Manager Client Activity Diagram); Development Environment; Project Plan; Requirement Traceability.

  19. Machine translation project alternatives analysis

    NASA Technical Reports Server (NTRS)

    Bajis, Catherine J.; Bedford, Denise A. D.

    1993-01-01

    The Machine Translation Project consists of several components, two of which, the Project Plan and the Requirements Analysis, have already been delivered. The Project Plan details the overall rationale, objectives and time-table for the project as a whole. The Requirements Analysis compares a number of available machine translation systems, their capabilities, possible configurations, and costs. The Alternatives Analysis has resulted in a number of conclusions and recommendations to the NASA STI program concerning the acquisition of specific MT systems and related hardware and software.

  20. State analysis requirements database for engineering complex embedded systems

    NASA Technical Reports Server (NTRS)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
