Sample records for automated verification process

  1. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  2. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowell, Michael W

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
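
    The verification approach described amounts to re-running reference problems on the local installation and comparing selected results against accepted values within a tolerance. A minimal Python sketch of that comparison step follows; the quantities, reference values, and tolerance are invented placeholders, and the paper itself drives COMSOL through LiveLink for MATLAB and the COMSOL Java API rather than Python.

      # Sketch of an installation-verification comparison step. The reference
      # values, tolerance, and run_reference_case() stand-in are hypothetical;
      # they are not ORNL's actual acceptance criteria.
      REFERENCE_RESULTS = {
          "peak_temperature_K": 345.2,   # placeholder accepted value
          "max_velocity_m_s": 0.0127,    # placeholder accepted value
      }
      RELATIVE_TOLERANCE = 1e-3          # placeholder acceptance criterion

      def run_reference_case():
          # Stand-in for launching the installed solver and extracting scalar results.
          return {"peak_temperature_K": 345.18, "max_velocity_m_s": 0.01271}

      def verify_installation():
          computed = run_reference_case()
          report = {}
          for name, expected in REFERENCE_RESULTS.items():
              rel_err = abs(computed[name] - expected) / abs(expected)
              report[name] = (rel_err, rel_err <= RELATIVE_TOLERANCE)
          return all(ok for _, ok in report.values()), report

      print(verify_installation())   # expected: (True, {...}) for these sample numbers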

  3. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  4. Formal verification of automated teller machine systems using SPIN

    NASA Astrophysics Data System (ADS)

    Iqbal, Ikhwan Mohammad; Adzkiya, Dieky; Mukhlash, Imam

    2017-08-01

    Formal verification is a technique for ensuring the correctness of systems. This work focuses on verifying a model of the Automated Teller Machine (ATM) system against some specifications. We construct the model as a state transition diagram that is suitable for verification. The specifications are expressed as Linear Temporal Logic (LTL) formulas. We use Simple Promela Interpreter (SPIN) model checker to check whether the model satisfies the formula. This model checker accepts models written in Process Meta Language (PROMELA), and its specifications are specified in LTL formulas.
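
    The abstract outlines model checking with SPIN over a PROMELA model and LTL formulas. As a rough illustration of the underlying idea, the Python sketch below performs an explicit-state reachability check of a toy ATM transition system against one safety invariant; the states, transitions, and property are invented for illustration and are not taken from the paper, and a real LTL check in SPIN is considerably more general.

      from collections import deque

      # Toy ATM model: a state is (mode, pin_verified). Illustrative only.
      INITIAL = ("idle", False)

      def successors(state):
          mode, pin_ok = state
          if mode == "idle":
              yield ("card_inserted", False)
          elif mode == "card_inserted":
              yield ("pin_entry", False)
          elif mode == "pin_entry":
              yield ("menu", True)        # correct PIN entered
              yield ("idle", False)       # wrong PIN / abort
          elif mode == "menu":
              yield ("dispense", pin_ok)
              yield ("idle", False)
          elif mode == "dispense":
              yield ("idle", False)

      def invariant(state):
          # Safety property: cash is never dispensed without a verified PIN.
          mode, pin_ok = state
          return not (mode == "dispense" and not pin_ok)

      def check():
          # Breadth-first exploration of the reachable state space.
          seen, frontier = {INITIAL}, deque([INITIAL])
          while frontier:
              state = frontier.popleft()
              if not invariant(state):
                  return False, state
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return True, None

      print(check())   # expected: (True, None) for this toy model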

  5. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  6. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control, and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  7. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.

  8. 30 CFR 227.601 - What are a State's responsibilities if it performs automated verification?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... performs automated verification? 227.601 Section 227.601 Mineral Resources MINERALS MANAGEMENT SERVICE... Perform Delegated Functions § 227.601 What are a State's responsibilities if it performs automated verification? To perform automated verification of production reports or royalty reports, you must: (a) Verify...

  9. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms. Verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple-direction illumination. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with a false-alarm (error) rate of only 0.1 percent.

  10. Large-scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive assembly, the fixture, and the wheels. The control system consists of hardware and software: the hardware is based on a single-chip microcontroller, and the software manages the photoelectric autocollimator and the automatic data acquisition process. The device acquires the perpendicularity measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle verification procedure.

  11. Ionospheric Modeling: Development, Verification and Validation

    DTIC Science & Technology

    2005-09-01

    facilitate the automated processing of a large network of GPS receiver data. 4. CALIBRATION AND VALIDATION OF IONOSPHERIC SENSORS: We have been...NOFS Workshop, Estes Park, CO, January 2005. W. Rideout, A. Coster, P. Doherty, MIT Haystack, Automated Processing of GPS Data to Produce Worldwide TEC

  12. Automated System Calibration and Verification of the Position Measurements for the Los Alamos Isotope Production Facility and the Switchyard Kicker Facilities

    NASA Astrophysics Data System (ADS)

    Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.

    2004-11-01

    The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determines the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, verifications are periodically implemented by a PC running a National Instruments LabVIEW virtual instrument (VI) to verify continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system will be the topic of this paper.
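
    The log-ratio technique mentioned in this record estimates beam position from the ratio of the signals on two opposing pickup electrodes. A minimal sketch of the textbook form of that relation is below; the scale and offset are calibration constants whose LANSCE values are not given in the record, so the defaults are placeholders.

      import math

      def log_ratio_position(v_a, v_b, scale_mm=1.0, offset_mm=0.0):
          # Textbook log-ratio estimate: x ~ S * log10(V_A / V_B) + offset.
          # scale_mm and offset_mm are placeholder calibration constants.
          return scale_mm * math.log10(v_a / v_b) + offset_mm

      print(log_ratio_position(1.05, 0.95))   # small positive displacement toward electrode A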

  13. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, including precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
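
    As one concrete example of the verification items listed, carryover is commonly estimated by running a high-concentration sample three times followed by a low-concentration sample three times and applying the standard carryover formula. A minimal sketch of that calculation follows; the numbers and the acceptance limit are made-up placeholders, not values from the paper.

      def carryover_percent(high_runs, low_runs):
          # Common carryover formula: 100 * (L1 - L3) / (H3 - L3), where H3 is the
          # third high result and L1, L3 are the first and third low results.
          h3 = high_runs[2]
          l1, l3 = low_runs[0], low_runs[2]
          return 100.0 * (l1 - l3) / (h3 - l3)

      # Example with illustrative WBC counts (x10^9/L); the 1% limit is a placeholder.
      print(carryover_percent([85.1, 84.7, 85.0], [2.6, 2.5, 2.5]) < 1.0)   # True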

  14. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  15. vvtools v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, Richard R.

    Vvtools is a suite of testing tools, with a focus on reproducible verification and validation. They are written in pure Python, and contain a test harness and an automated process management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.
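
    The record does not describe the vvtools interface, so the following Python sketch is only a generic illustration of the kind of harness described (run each test, honor a one-level parent/child dependency, collect pass/fail); the test names and commands are hypothetical, and this is not the vvtools API.

      import subprocess

      # Hypothetical test list: (name, command, parent-or-None).
      TESTS = [
          ("setup_mesh", ["python", "setup_mesh.py"], None),
          ("solve", ["python", "solve.py"], "setup_mesh"),
          ("check_answer", ["python", "check_answer.py"], "solve"),
      ]

      def run_suite(tests):
          # Run tests in order, skipping children whose parent did not pass.
          results = {}
          for name, cmd, parent in tests:
              if parent is not None and results.get(parent) != "pass":
                  results[name] = "skipped"
                  continue
              proc = subprocess.run(cmd)
              results[name] = "pass" if proc.returncode == 0 else "fail"
          return results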

  16. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  17. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    Hagerty, J. J.

    1981-01-01

    The cell preparation station was installed in its new enclosure. Operation verification tests were performed. The detailed layout drawings of the automated lamination station were produced and construction began. All major and most minor components were delivered by vendors. The station framework was built and assembly of components begun.

  18. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    NASA Astrophysics Data System (ADS)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

    This article considers the process of verification (calibration) of the secondary equipment of oil metering units. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the controller under verification and the reference channels of the calibrator according to the specified algorithm. The software controls the channel switching, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used to check secondary-equipment controllers of oil metering units in automatic verification mode (when an open communication protocol is available) or in semi-automatic mode (when it is not). A distinctive feature of the approach is a universal, software-controlled signal switch that can be configured for various verification (calibration) methods, allowing the system to cover the entire range of metering-unit secondary-equipment controllers. Automated verification with this hardware-software system shortens verification time by a factor of 5-10 and increases measurement reliability by excluding the influence of the human factor.
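
    A minimal sketch of the error-calculation step such a system automates: apply reference values from the calibrator, read back the controller, and compute the relative error at each test point. The set_reference and read_controller callables and the acceptance limit are hypothetical placeholders; the record does not describe the actual interfaces or tolerances.

      def verify_channel(set_reference, read_controller, test_points, max_error_pct=0.1):
          # Compare controller readings against calibrator reference values and
          # record (reference, measured, error %, pass/fail) for a protocol.
          protocol = []
          for reference in test_points:
              set_reference(reference)
              measured = read_controller()
              error_pct = 100.0 * abs(measured - reference) / reference
              protocol.append((reference, measured, error_pct, error_pct <= max_error_pct))
          return protocol

      # Usage with stand-in callables simulating an ideal channel.
      last = {"value": 0.0}
      points = [4.0, 8.0, 12.0, 16.0, 20.0]
      print(verify_channel(lambda v: last.update(value=v), lambda: last["value"], points))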

  19. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  20. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  1. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means of formal property definition to serve as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  2. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    DOE PAGES

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  3. Phase 2 of the Array Automated Assembly Task for the Low Cost Silicon Solar Array Project

    NASA Technical Reports Server (NTRS)

    Wihl, M.; Torro, J.; Scheinine, A.; Anderson, J.

    1978-01-01

    An automated process sequence to manufacture photovoltaic modules at a capacity of approximately 500 MW per year and a cost of approximately $0.50 per peak watt is described. Verification tests were performed and are reported along with cost predictions.

  4. Computer-Based Automation of Discrete Product Manufacture: A preliminary Discussion of Feasibility and Impact

    DTIC Science & Technology

    1974-07-01

    automated manufacturing processes and a rough technoeconomic evaluation of those concepts. Our evaluation is largely based on estimates; therefore, the...must be subjected to thorough analysis and experimental verification before they can be considered definitive. They are being published at this time...hardware and sensor technology, manufacturing engineering, automation, and economic analysis . Members of this team inspected over thirty manufacturing

  5. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  6. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  7. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  8. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  9. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  10. Independent Validation and Verification of automated information systems in the Department of Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunteman, W.J.; Caldwell, R.

    1994-07-01

    The Department of Energy (DOE) has established an Independent Validation and Verification (IV&V) program for all classified automated information systems (AIS) operating in compartmented or multi-level modes. The IV&V program was established in DOE Order 5639.6A and described in the manual associated with the Order. This paper describes the DOE IV&V program, the IV&V process and activities, the expected benefits from an IV&V, and the criteria and methodologies used during an IV&V. The first IV&V under this program was conducted on the Integrated Computing Network (ICN) at Los Alamos National Laboratory and several lessons learned are presented. The DOE IV&V program is based on the following definitions. An IV&V is defined as the use of expertise from outside an AIS organization to conduct validation and verification studies on a classified AIS. Validation is defined as the process of applying the specialized security test and evaluation procedures, tools, and equipment needed to establish acceptance for joint usage of an AIS by one or more departments or agencies and their contractors. Verification is the process of comparing two levels of an AIS specification for proper correspondence (e.g., security policy model with top-level specifications, top-level specifications with source code, or source code with object code).

  11. 40 CFR 1066.245 - Response time verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... dynamometer's automated process to verify response time. Perform this test at two different inertia settings... observed settling response time must be less than 100 milliseconds for each inertia setting. ER15SE11.085 ...

  12. 40 CFR 1066.245 - Response time verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... dynamometer's automated process to verify response time. Perform this test at two different inertia settings... observed settling response time must be less than 100 milliseconds for each inertia setting. ER15SE11.085 ...

  13. ICSH guidelines for the verification and performance of automated cell counters for body fluids.

    PubMed

    Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B

    2014-12-01

    One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns of practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were performed. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus. © 2014 John Wiley & Sons Ltd.

  14. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  15. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  16. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
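
    For the MMS-based code verification mentioned here, a central quantity is the observed order of accuracy computed from discretization errors on successively refined grids. A small sketch of that standard calculation follows; the error values and refinement ratio are illustrative numbers only, not results from LAVA.

      import math

      def observed_order(e_coarse, e_fine, refinement_ratio):
          # Observed order of accuracy p from errors on two grids:
          # p = ln(e_coarse / e_fine) / ln(r).
          return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

      # Illustrative numbers: a nominally 2nd-order scheme, refinement ratio 2.
      print(observed_order(e_coarse=4.1e-4, e_fine=1.0e-4, refinement_ratio=2.0))  # ~2.04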

  17. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.

  18. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, as well as the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that the benchmark descriptions in the literature contain mistakes. The library and the test suite are available for download and can be used freely.
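
    The library described is written in C++; as a rough Python analogue of deriving both the value and the gradient of a benchmark function from a single expression description, one can use SymPy. This is only an illustration of the idea, not the library's interface, and the Rosenbrock function below is a common benchmark chosen for the example rather than one taken from the paper.

      import sympy as sp

      x, y = sp.symbols("x y")
      expr = (1 - x) ** 2 + 100 * (y - x ** 2) ** 2   # Rosenbrock, a common benchmark

      # Value and gradient generated from the single symbolic description.
      f = sp.lambdify((x, y), expr)
      grad = [sp.lambdify((x, y), sp.diff(expr, v)) for v in (x, y)]

      print(f(1.0, 1.0))                    # 0.0 at the known minimum
      print([g(1.0, 1.0) for g in grad])    # [0.0, 0.0]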

  19. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  20. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  1. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can insure accurate, reliable and cost effective software. Therefore, government and industry

  2. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
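
    Of the solution-verification measures listed, the Grid Convergence Index has a standard closed form (Roache's formulation with a safety factor). A brief sketch with illustrative inputs follows; the numbers are not values from the VAVUQ examples.

      def grid_convergence_index(f_fine, f_coarse, refinement_ratio, order, safety_factor=1.25):
          # GCI (%) on the fine grid: Fs * |(f_coarse - f_fine) / f_fine| / (r**p - 1) * 100.
          relative_error = abs((f_coarse - f_fine) / f_fine)
          return 100.0 * safety_factor * relative_error / (refinement_ratio ** order - 1.0)

      # Illustrative values only.
      print(grid_convergence_index(f_fine=0.971, f_coarse=0.962, refinement_ratio=2.0, order=2.0))  # ~0.39%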

  3. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft Orion is in the development process of taking humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center along with the University of Texas at Tyler employed a software verification tool, EggPlant Functional by TestPlant. It is an image based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be used repeatedly in verification of different displays.

  4. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  5. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution’s sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  6. 40 CFR 1066.215 - Summary of verification procedures for chassis dynamometers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.215 Summary... judgment. (c) Automated dynamometer verifications and calibrations. In some cases, dynamometers are... specified in this subpart. You may use these automated functions instead of following the procedures we...

  7. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient controlled analgesia pump in a two phased process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation; and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930

  8. Automated Array Assembly, Phase 2

    NASA Technical Reports Server (NTRS)

    Carbajal, B. G.

    1979-01-01

    The Automated Array Assembly Task, Phase 2 of the Low Cost Silicon Solar Array Project is a process development task. The contract provides for the fabrication of modules from large area tandem junction cells (TJC). During this quarter, effort was focused on the design of a large area, approximately 36 sq cm, TJC and process verification runs. The large area TJC design was optimized for minimum I squared R power losses. In the TJM activity, the cell-module interfaces were defined, module substrates were formed and heat treated and clad metal interconnect strips were fabricated.

  9. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  10. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significant higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking become more popular. The latter examine the whole state space and, consequently, result in a full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely yet used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance to the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  11. 40 CFR 1066.245 - Response time verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... automated process to verify response time. You may perform this test either at two different inertia... base inertia and two acceleration rates that cover the range of acceleration rates experienced during... less than 100 milliseconds for each inertia setting. ER28AP14.066 ...

  12. Experimental verification of layout physical verification of silicon photonics

    NASA Astrophysics Data System (ADS)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive-index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, PV is challenging for PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bend based on SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the huge time of 3D EM simulations and can be easily included in any electronic design automation (EDA) flow, as the equation parameters can be easily extracted from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurements of the fabricated devices were compared to the derived models and show very good agreement; the matching can reach 100% by calibrating certain parameters in the model.
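
    For context on what such closed-form coupler models compute, coupled-mode theory gives the cross-port power fraction of a lossless directional coupler as sin^2(kappa*L + kappa0), where L is the coupling length. The paper's empirical fits for the geometry dependence of these parameters are not reproduced here, so the sketch below is generic and any numbers passed in are illustrative only.

      import math

      def cross_coupling_ratio(kappa_per_um, length_um, kappa0=0.0):
          # Coupled-mode-theory cross-port power fraction: sin^2(kappa * L + kappa0).
          # kappa and kappa0 depend on the SOI geometry; placeholder values only.
          return math.sin(kappa_per_um * length_um + kappa0) ** 2

      print(cross_coupling_ratio(kappa_per_um=0.05, length_um=15.0))   # illustrative ~0.46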

  13. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS), part 3. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The performance, design and verification requirements for the space Construction Automated Fabrication Experiment (SCAFE) are defined. The SCAFE program defines, develops, and demonstrates the techniques, processes, and equipment required for the automatic fabrication of structural elements in space and for the assembly of such elements into a large, lightweight structure. The program defines a large structural platform to be constructed in orbit using the space shuttle as a launch vehicle and construction base.

  14. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.

  15. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers

    NASA Technical Reports Server (NTRS)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.

    2008-01-01

    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  16. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  17. Array automated assembly task, phase 2. Low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Rhee, S. S.; Jones, G. T.; Allison, K. T.

    1978-01-01

    Several modifications instituted in the wafer surface preparation process served to significantly reduce the process cost to 1.55 cents per peak watt in 1975 cents. Performance verification tests of a laser scanning system showed a limited capability to detect hidden cracks or defects, but with potential equipment modifications this cost-effective system could be rendered suitable for applications. Installation of an electroless nickel plating system was completed, along with optimization of the wafer plating process. The solder coating and flux removal process verification test was completed. An optimum temperature range of 500-550 C was found to produce uniform solder coating, with the restriction that a modified dipping procedure is utilized. Finally, the construction of the spray-on dopant equipment was completed.

  18. Automation bias and verification complexity: a systematic review.

    PubMed

    Lyell, David; Coiera, Enrico

    2017-03-01

    While potentially reducing decision errors, decision support systems can introduce new types of errors. Automation bias (AB) happens when users become overreliant on decision support, which reduces vigilance in information seeking and processing. Most research originates from the human factors literature, where the prevailing view is that AB occurs only in multitasking environments. This review seeks to compare the human factors and health care literature, focusing on the apparent association of AB with multitasking and task complexity. EMBASE, Medline, Compendex, Inspec, IEEE Xplore, Scopus, Web of Science, PsycINFO, and Business Source Premiere from 1983 to 2015. Evaluation studies where task execution was assisted by automation and resulted in errors were included. Participants needed to be able to verify automation correctness and perform the task manually. Tasks were identified and grouped. Task and automation type and presence of multitasking were noted. Each task was rated for its verification complexity. Of 890 papers identified, 40 met the inclusion criteria; 6 were in health care. Contrary to the prevailing human factors view, AB was found in single tasks, typically involving diagnosis rather than monitoring, and with high verification complexity. The literature is fragmented, with large discrepancies in how AB is reported. Few studies reported the statistical significance of AB compared to a control condition. AB appears to be associated with the degree of cognitive load experienced in decision tasks, and appears to not be uniquely associated with multitasking. Strategies to minimize AB might focus on cognitive load reduction. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  19. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  20. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  1. An Empirical Evaluation of Automated Theorem Provers in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). We discuss the unique requirements this application places on the ATPs, focusing on automation, proof checking, and usability. For full automation, however, the obligations must be aggressively preprocessed and simplified, and we demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATPs to solve the proof tasks. Our results are based on 13 certification experiments that led to more than 25,000 proof tasks, each of which has been attempted by Vampire, Spass, e-setheo, and Otter. The proofs found by Otter have been proof-checked by IVY.

  2. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.

  3. Reconfigurable Very Long Instruction Word (VLIW) Processor

    NASA Technical Reports Server (NTRS)

    Velev, Miroslav N.

    2015-01-01

    Future NASA missions will depend on radiation-hardened, power-efficient processing systems-on-a-chip (SOCs) that consist of a range of processor cores custom tailored for space applications. Aries Design Automation, LLC, has developed a processing SOC that is optimized for software-defined radio (SDR) uses. The innovation implements the Institute of Electrical and Electronics Engineers (IEEE) RazorII voltage management technique, a microarchitectural mechanism that allows processor cores to self-monitor, self-analyze, and self-heal after timing errors, regardless of their cause (e.g., radiation; chip aging; variations in the voltage, frequency, temperature, or manufacturing process). This highly automated SOC can also execute the legacy PowerPC 750 binary code instruction set architecture (ISA), which is used in the flight-control computers of many previous NASA space missions. In developing this innovation, Aries Design Automation has made significant contributions to the fields of formal verification of complex pipelined microprocessors and Boolean satisfiability (SAT) and has developed highly efficient electronic design automation tools that hold promise for future developments.

  4. The study of thermal processes in control systems of heat consumption of buildings

    NASA Astrophysics Data System (ADS)

    Tsynaeva, E.; A, Tsynaeva

    2017-11-01

    The article discusses the main thermal processes in automated control systems for heat consumption (ACSHC) of buildings, schematic diagrams of these systems, and the mathematical models used to describe thermal processes in ACSHC. Verification of the presented mathematical models was conducted. It was found that the efficiency of ACSHC operation depends on both external and internal factors. A numerical study of the dynamic operating modes of ACSHC was carried out.

  5. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  6. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    PubMed

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.

  7. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    PubMed Central

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality. PMID:26258164
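
    As a point of reference for readers unfamiliar with Petri nets, the sketch below shows an ordinary place/transition net with the standard token-firing rule; it is only an illustration and does not implement the SNT Petri process/agent nets or the PPPA operators described above.

    ```python
    # Minimal place/transition Petri net sketch (illustrative only).
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)          # place -> token count
            self.transitions = {}                 # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise ValueError(f"transition {name} is not enabled")
            inputs, outputs = self.transitions[name]
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Tiny example: a recording session moves from "idle" to "recording"
    net = PetriNet({"idle": 1, "recording": 0})
    net.add_transition("start", {"idle": 1}, {"recording": 1})
    net.fire("start")
    print(net.marking)   # {'idle': 0, 'recording': 1}
    ```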

  8. Automatic high throughput empty ISO container verification

    NASA Astrophysics Data System (ADS)

    Chalmers, Alex

    2007-04-01

    Encouraging results are presented for the automatic analysis of radiographic images of a continuous stream of ISO containers to confirm they are truly empty. A series of image processing algorithms is described that processes real-time data acquired during the actual inspection of each container and assigns each to one of the classes "empty", "not empty" or "suspect threat". This research is one step towards achieving fully automated analysis of cargo containers.
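
    A rule-based decision of this kind can be sketched as follows; the transmission convention, thresholds, and three-way classification rule are assumptions made for illustration and are not the algorithms used in the cited system.

    ```python
    import numpy as np

    def classify_container(image, cargo_threshold=0.2,
                           empty_fraction=0.01, suspect_fraction=0.05):
        """Classify a transmission image (1.0 = fully transparent, 0.0 = opaque)."""
        img = np.asarray(image, dtype=np.float64)
        attenuation = 1.0 - img                   # fraction of beam absorbed per pixel
        cargo_fraction = float(np.mean(attenuation > cargo_threshold))
        if cargo_fraction < empty_fraction:
            return "empty"
        if cargo_fraction < suspect_fraction:
            return "suspect threat"
        return "not empty"

    empty_scan = np.full((100, 200), 0.95)        # nearly full transmission everywhere
    loaded_scan = empty_scan.copy()
    loaded_scan[20:80, 40:160] = 0.3              # a dense block of cargo
    print(classify_container(empty_scan))         # -> empty
    print(classify_container(loaded_scan))        # -> not empty
    ```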

  9. SU-E-T-398: Feasibility of Automated Tools for Robustness Evaluation of Advanced Photon and Proton Techniques in Oropharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, H; Liang, X; Kalbasi, A

    2014-06-01

    Purpose: Advanced radiotherapy (RT) techniques such as proton pencil beam scanning (PBS) and photon-based volumetric modulated arc therapy (VMAT) have dosimetric advantages in the treatment of head and neck malignancies. However, anatomic or alignment changes during treatment may limit robustness of PBS and VMAT plans. We assess the feasibility of automated deformable registration tools for robustness evaluation in adaptive PBS and VMAT RT of oropharyngeal cancer (OPC). Methods: We treated 10 patients with bilateral OPC with advanced RT techniques and obtained verification CT scans with physician-reviewed target and OAR contours. We generated 3 advanced RT plans for each patient: a proton PBS plan using 2 posterior oblique fields (2F), a proton PBS plan using an additional third low-anterior field (3F), and a photon VMAT plan using 2 arcs (Arc). For each of the planning techniques, we forward calculated initial (Ini) plans on the verification scans to create verification (V) plans. We extracted DVH indicators based on physician-generated contours for 2 target and 14 OAR structures to investigate the feasibility of two automated tools (contour propagation (CP) and dose deformation (DD)) as surrogates for routine clinical plan robustness evaluation. For each verification scan, we compared DVH indicators of V, CP and DD plans in a head-to-head fashion using Student's t-test. Results: We performed 39 verification scans; each patient underwent 3 to 6 verification scans. We found no differences in doses to target or OAR structures between V and CP, V and DD, and CP and DD plans across all patients (p > 0.05). Conclusions: Automated robustness evaluation tools, CP and DD, accurately predicted dose distributions of verification (V) plans using physician-generated contours. These tools may be further developed as a potential robustness screening tool in the workflow for adaptive treatment of OPC using advanced RT techniques, reducing the need for physician-generated contours.

  10. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named ‘Industrial Methodology for Process Verification in Research’ or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by ‘crowd-sourcing’ to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  11. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre- Treatment Dose Verification by Automated Treatment Planning Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, X; Yin, Y; Lin, X

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system for cervical cancer IMRT pre-treatment dose verification. Methods: The study randomly selected clinical IMRT treatment planning data for twenty patients with cervical cancer; all IMRT plans were divided into 7 fields to meet the dosimetric goals using a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server; percentage differences in the volume of each region of interest (ROI) and in the calculated dose to the target region and organs at risk were evaluated in order to validate the accuracy of the automated treatment planning verification system. Results: The difference in volume for Pinnacle versus M3D was less than that for Eclipse versus M3D in the ROIs; the largest differences were 0.22±0.69% and 3.5±1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in dose to the target and organs at risk compared with the TPS. After recalculating plans with M3D, the dose difference for Pinnacle was on average less than that for Eclipse; results were within 3%. Conclusion: Utilizing the automated treatment planning system to validate the accuracy of plans is convenient, but the acceptable range of differences still needs more clinical patient cases to determine. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  12. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  13. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests to ensure that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
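
    The comparison step of such a sequential check can be sketched as below, assuming results from two consecutive code versions are available as named arrays; the field names, tolerance, and data handling are illustrative and are not the RELAP5-3D implementation.

    ```python
    import numpy as np

    def sequential_check(previous, current, rel_tol=1e-12):
        """Return the fields whose values changed beyond rel_tol between versions."""
        regressions = []
        for name, old_values in previous.items():
            old_values = np.asarray(old_values)
            new_values = np.asarray(current[name])
            if not np.allclose(old_values, new_values, rtol=rel_tol, atol=0.0):
                regressions.append(name)
        return regressions

    prev_run = {"pressure": [1.0e5, 1.2e5], "void_fraction": [0.0, 0.05]}
    new_run  = {"pressure": [1.0e5, 1.2e5], "void_fraction": [0.0, 0.05]}
    print(sequential_check(prev_run, new_run))   # [] -> no unintended changes
    ```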

  14. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    EPA Science Inventory

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel.

    To facilitate this verification, JCH repre...

  16. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed along with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  17. Proof Rules for Automated Compositional Verification through Learning

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.

    2003-01-01

    Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.

  18. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring)

    NASA Astrophysics Data System (ADS)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin

    2016-08-01

    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal qualification testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and it automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy with the AQF results, and especially with the approach and process it enforces.
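
    One common way to build such a framework is to annotate each test with the requirement it verifies and harvest those annotations into the qualification report; the decorator, registry, requirement text, and report format below are assumptions for illustration and are not the MeerKAT AQF code.

    ```python
    # Requirement-annotated tests harvested into a simple qualification report.
    REGISTRY = []

    def verifies(requirement):
        def wrapper(test_func):
            REGISTRY.append((requirement, test_func))
            return test_func
        return wrapper

    @verifies("CAM-REQ-042 (hypothetical): subsystem reports sensor failure within 10 s")
    def test_sensor_failure_alarm():
        latency_s = 4.2          # stand-in for a measured value
        assert latency_s <= 10.0

    def run_and_report():
        lines = ["Qualification Test Report", "-" * 25]
        for requirement, test in REGISTRY:
            try:
                test()
                lines.append(f"PASS  {requirement}  ({test.__name__})")
            except AssertionError:
                lines.append(f"FAIL  {requirement}  ({test.__name__})")
        return "\n".join(lines)

    print(run_and_report())
    ```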

  19. Assessment of Automated Measurement and Verification (M&V) Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.

  20. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    This user guide for the verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included to augment the information in NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  1. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The considered problem concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller, where the way the actuators operate is verified against the logic relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of operation of the electro-pneumatic control system was implemented in the Automation Studio software from B&R, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data exchange tool in the form of an OPC server.

  2. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
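
    The dependency-graph idea can be illustrated with a small sketch in which each variable declares its dependencies and an evaluator, and requesting a variable recursively updates everything it needs; the names and the water-content example are hypothetical and do not reflect the Arcos interfaces.

    ```python
    # Minimal dependency-graph evaluator sketch (illustrative assumptions only).
    class DependencyGraph:
        def __init__(self):
            self.evaluators = {}   # variable -> (dependencies, function)
            self.values = {}

        def register(self, name, dependencies, func):
            self.evaluators[name] = (dependencies, func)

        def set_value(self, name, value):
            self.values[name] = value

        def get(self, name):
            if name in self.values:
                return self.values[name]
            deps, func = self.evaluators[name]
            args = [self.get(d) for d in deps]   # recursively satisfy dependencies
            self.values[name] = func(*args)
            return self.values[name]

    g = DependencyGraph()
    g.set_value("porosity", 0.4)
    g.set_value("saturation", 0.8)
    g.register("water_content", ["porosity", "saturation"], lambda phi, s: phi * s)
    print(g.get("water_content"))   # 0.32
    ```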

  3. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS). Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The performance, design, and verification requirements for the space construction automated fabrication experiment (SCAFE) are defined and the source of each imposed or derived requirement is identified.

  4. Verification and Validation of Neural Networks for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Mackall, Dale; Nelson, Stacy; Schumann, Johann; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: 1) Overview of Adaptive Systems; and 2) V&V Processes/Methods.

  5. Verification and Validation of Neural Networks for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Mackall, Dale; Nelson, Stacy; Schumann, Johann

    2002-01-01

    The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: Overview of Adaptive Systems and V&V Processes/Methods.

  6. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distribution is done offline to raise alerts automatically and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used for triggering a halt of a linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow, where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distribution. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large-scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.
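
    The online trigger described above amounts to comparing the cumulative planned and reconstructed dose after each portal image; a minimal sketch of that comparison is shown below, with the RMS threshold and array shapes chosen purely for illustration rather than taken from the clinical implementation.

    ```python
    import numpy as np

    def rms_difference(planned, reconstructed):
        planned = np.asarray(planned, dtype=np.float64)
        reconstructed = np.asarray(reconstructed, dtype=np.float64)
        return float(np.sqrt(np.mean((planned - reconstructed) ** 2)))

    def should_halt(planned, reconstructed, threshold_gy=0.2):
        """Trigger a halt when the RMS dose deviation exceeds a set threshold."""
        return rms_difference(planned, reconstructed) > threshold_gy

    planned_dose = np.full((50, 50, 50), 2.0)          # cumulative planned dose, Gy
    reconstructed_dose = planned_dose + np.random.normal(0.0, 0.01, planned_dose.shape)
    print(should_halt(planned_dose, reconstructed_dose))   # False for small deviations
    ```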

  7. Automated Generation of Finite-Element Meshes for Aircraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Li, Wu; Robinson, Jay

    2016-01-01

    This paper presents a novel approach for automated generation of fully connected finite-element meshes for all internal structural components and skins of a given wing-body geometry model, controlled by a few conceptual-level structural layout parameters. Internal structural components include spars, ribs, frames, and bulkheads. Structural layout parameters include spar/rib locations in wing chordwise/spanwise direction and frame/bulkhead locations in longitudinal direction. A simple shell thickness optimization problem with two load conditions is used to verify versatility and robustness of the automated meshing process. The automation process is implemented in ModelCenter starting from an OpenVSP geometry and ending with a NASTRAN 200 solution. One subsonic configuration and one supersonic configuration are used for numerical verification. Two different structural layouts are constructed for each configuration and five finite-element meshes of different sizes are generated for each layout. The paper includes various comparisons of solutions of 20 thickness optimization problems, as well as discussions on how the optimal solutions are affected by the stress constraint bound and the initial guess of design variables.

  8. Position verification systems for an automated highway system.

    DOT National Transportation Integrated Search

    2015-03-01

    Automated vehicles promote road safety, fuel efficiency, and reduced travel time by decreasing traffic : congestion and driver workload. In a vehicle platoon (grouping vehicles to increase road capacity by : managing distance between vehicles using e...

  9. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  10. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  11. Method for modeling post-mortem biometric 3D fingerprints

    NASA Astrophysics Data System (ADS)

    Rajeev, Srijith; Shreyas, Kamath K. M.; Agaian, Sos S.

    2016-05-01

    Despite the advancements of fingerprint recognition in the 2-D and 3-D domains, authenticating deformed/post-mortem fingerprints continues to be an important challenge. Prior cleansing and reconditioning of the deceased finger is required before acquisition of the fingerprint. The victim's finger needs to be precisely and carefully operated by a medium to record the fingerprint impression. This process may damage the structure of the finger, which subsequently leads to higher false rejection rates. This paper proposes a non-invasive method to perform 3-D deformed/post-mortem finger modeling, which produces a 2-D rolled equivalent fingerprint for automated verification. The presented novel modeling method involves masking, filtering, and unrolling. Computer simulations were conducted on finger models with different depth variations obtained from Flashscan3D LLC. Results illustrate that the modeling scheme provides a viable 2-D fingerprint of deformed models for automated verification. The quality and adaptability of the obtained unrolled 2-D fingerprints were analyzed using NIST fingerprint software. Eventually, the presented method could be extended to other biometric traits such as palm, foot, tongue etc. for security and administrative applications.

  12. Imperceptible watermarking for security of fundus images in tele-ophthalmology applications and computer-aided diagnosis of retina diseases.

    PubMed

    Singh, Anushikha; Dutta, Malay Kishore

    2017-12-01

    The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent error in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security issue of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain with an adaptive quantization parameter to maintain perceptual transparency for a variety of fundus images, such as healthy or disease-affected images. In the proposed method, insertion of the watermark into the fundus image does not affect the automatic image processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis associated with the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification of the fundus image at the diagnosis centre. The proposed watermarking system was tested on a comprehensive database of fundus images and the results are convincing. Results indicate that the proposed watermarking method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed watermarking system applicable for authentication of fundus images for computer-aided diagnosis and tele-ophthalmology applications.
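
    Embedding identity bits in the singular-value domain can be illustrated with a simple quantization-index-modulation sketch on one image block; the fixed quantization step below is a simplification of the adaptive quantization parameter described above, and the code is not the cited system.

    ```python
    import numpy as np

    def embed_bit(block, bit, step=12.0):
        """Embed one bit by quantizing the largest singular value of a block.

        The step should be small relative to the gap between the first two
        singular values so the embedded value remains the largest one.
        """
        u, s, vt = np.linalg.svd(np.asarray(block, dtype=np.float64))
        q = np.floor(s[0] / step)
        if int(q) % 2 != bit:          # parity of the quantization index encodes the bit
            q += 1
        s[0] = q * step + step / 2.0
        return u @ np.diag(s) @ vt

    def extract_bit(block, step=12.0):
        s = np.linalg.svd(np.asarray(block, dtype=np.float64), compute_uv=False)
        return int(np.floor(s[0] / step)) % 2

    block = np.random.rand(8, 8) * 255.0
    marked = embed_bit(block, 1)
    print(extract_bit(marked))   # 1
    ```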

  13. SAMI Automated Plug Plate Configuration

    NASA Astrophysics Data System (ADS)

    Lorente, N. P. F.; Farrell, T.; Goodwin, M.

    2013-10-01

    The Sydney-AAO Multi-object Integral field spectrograph (SAMI) is a prototype wide-field system at the Anglo-Australian Telescope (AAT) which uses a plug-plate to mount its 13×61-core imaging fibre bundles (hexabundles) in the optical path at the telescope's prime focus. In this paper we describe the process of determining the positions of the plug-plate holes, where plates contain three or more stacked observation configurations. The process, which up until now has involved several separate processes and has required significant manual configuration and checking, is now being automated to increase efficiency and reduce error. This is carried out by means of a thin Java controller layer which drives the configuration cycle. This layer controls the user interface and the C++ algorithm layer where the plate configuration and optimisation is carried out. Additionally, through the Aladin display package, it provides visualisation and facilitates user verification of the resulting plates.

  14. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  15. 21 CFR 1311.205 - Pharmacy application requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... that indicates the prescription was signed; or (ii) Display the field for the pharmacist's verification... must display the information for the pharmacist's verification. (10) The pharmacy application must... § 1311.215 in a format that is readable by the pharmacist. Such an internal audit may be automated and...

  16. 21 CFR 1311.205 - Pharmacy application requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... that indicates the prescription was signed; or (ii) Display the field for the pharmacist's verification... must display the information for the pharmacist's verification. (10) The pharmacy application must... § 1311.215 in a format that is readable by the pharmacist. Such an internal audit may be automated and...

  17. 21 CFR 1311.205 - Pharmacy application requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... that indicates the prescription was signed; or (ii) Display the field for the pharmacist's verification... must display the information for the pharmacist's verification. (10) The pharmacy application must... § 1311.215 in a format that is readable by the pharmacist. Such an internal audit may be automated and...

  18. 21 CFR 1311.205 - Pharmacy application requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... that indicates the prescription was signed; or (ii) Display the field for the pharmacist's verification... must display the information for the pharmacist's verification. (10) The pharmacy application must... § 1311.215 in a format that is readable by the pharmacist. Such an internal audit may be automated and...

  19. The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franconi, Ellen; Gee, Matt; Goldberg, Miriam

    Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines key technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. Using M&V methods that capture load shapes together with automated processing makes it possible to determine savings in near-real time and provide stakeholders with more timely and detailed information. This information can be used to inform ongoing building operations, provide early input on energy efficiency program design, or assess the impact of efficiency by location and time of day. Stakeholders who can make use of such information include regulators, energy efficiency program administrators, program evaluators, contractors and aggregators, building owners, the investment community, and grid planners. Although each stakeholder has its own priorities and challenges related to savings measurement and verification, the potential exists for all to draw from a single set of efficiency valuation data. Such an integrated approach could provide a base consistency across stakeholder uses.
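
    A toy version of such an automated baseline is sketched below: a simple temperature-dependent regression is fitted to pre-retrofit interval data and projected over the post-retrofit period to estimate avoided energy use. The linear model form and the numbers are illustrative assumptions only; production M&V 2.0 tools use far richer baseline models.

    ```python
    import numpy as np

    def fit_baseline(temps, loads):
        """Fit load = intercept + slope * outdoor_temperature by least squares."""
        X = np.column_stack([np.ones_like(temps), temps])
        coeffs, *_ = np.linalg.lstsq(X, loads, rcond=None)
        return coeffs                                 # [intercept, slope]

    def estimated_savings(coeffs, post_temps, post_loads):
        """Avoided energy = baseline-projected use minus measured post-retrofit use."""
        predicted = coeffs[0] + coeffs[1] * np.asarray(post_temps)
        return float(np.sum(predicted - np.asarray(post_loads)))

    pre_temps = np.array([10.0, 15.0, 20.0, 25.0, 30.0])   # deg C
    pre_loads = np.array([50.0, 60.0, 70.0, 80.0, 90.0])   # kWh per interval
    coeffs = fit_baseline(pre_temps, pre_loads)
    post_temps = np.array([12.0, 22.0, 28.0])
    post_loads = np.array([48.0, 66.0, 78.0])              # measured after retrofit
    print(round(estimated_savings(coeffs, post_temps, post_loads), 1))  # avoided kWh
    ```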

  20. Development of automated optical verification technologies for control systems

    NASA Astrophysics Data System (ADS)

    Volegov, Peter L.; Podgornov, Vladimir A.

    1999-08-01

    The report considers optical techniques for automated verification of an object's identity, designed for control systems at nuclear facilities. It presents results of experimental research and of the development of pattern recognition techniques carried out under ISTC project number 772, with the purpose of identifying unique features of the surface structure of a controlled object and the effects of its random treatment. Possibilities for industrial introduction of the developed technologies within the framework of lab-to-lab cooperation between USA and Russia laboratories, including development of up-to-date systems for nuclear material control and accounting, are examined.

  1. Caveat emptor: limitations of the automated reconstruction of metabolic pathways in Plasmodium.

    PubMed

    Ginsburg, Hagai

    2009-01-01

    The functional reconstruction of metabolic pathways from an annotated genome is a tedious and demanding enterprise. Automation of this endeavor using bioinformatics algorithms could cope with the ever-increasing number of sequenced genomes and accelerate the process. Here, the manual reconstruction of metabolic pathways in the functional genomic database of Plasmodium falciparum--Malaria Parasite Metabolic Pathways--is described and compared with pathways generated automatically as they appear in PlasmoCyc, metaSHARK and the Kyoto Encyclopedia of Genes and Genomes. A critical evaluation of this comparison discloses that the automatic reconstruction of pathways generates manifold paths that need an expert manual verification to accept some and reject most others based on manually curated gene annotation.

  2. Automation is an Effective Way to Improve Quality of Verification (Calibration) of Measuring Instruments

    NASA Astrophysics Data System (ADS)

    Golobokov, M.; Danilevich, S.

    2018-04-01

    In order to assess calibration reliability and to automate such assessment, procedures for data collection and for a simulation study of a thermal imager calibration procedure have been elaborated. The existing calibration techniques do not always provide high reliability. A new method for analyzing the existing calibration techniques and developing new, efficient ones has been suggested and tested. A type of software has been studied that allows instrument calibration reports to be generated automatically, their proper configuration to be monitored, measurement results to be processed, and instrument validity to be assessed. The use of such software allows the man-hours spent on finalization of calibration data to be reduced by a factor of 2 to 5 and eliminates a whole set of typical operator errors.

  3. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  4. Energy Information Systems

    Science.gov Websites

    Listed resources include the Assessment of Automated M&V Methods (Granderson, J., et al., Lawrence Berkeley National Laboratory) and Performance Metrics and Objective Testing Methods for Energy Baseline Modeling Software.

  5. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on validation of the RRC robot arm and CASE backhoe and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
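
    A cross-check of this kind can be sketched as a comparison of each program's trajectory against stored validation data; the program names are taken from the abstract, while the tolerance and the sample trajectories are invented for illustration.

    ```python
    import numpy as np

    def cross_check(validation, results, tol=1e-3):
        """Compare each program's trajectory against reference validation data."""
        report = {}
        for program, trajectory in results.items():
            err = float(np.max(np.abs(np.asarray(trajectory) - np.asarray(validation))))
            report[program] = (err <= tol, err)
        return report

    validation_data = [0.0, 0.098, 0.392, 0.882]        # reference joint-angle history
    results = {
        "DADS":    [0.0, 0.098, 0.392, 0.882],
        "DISCOS":  [0.0, 0.0981, 0.3921, 0.8822],
        "CONTOPS": [0.0, 0.100, 0.400, 0.900],
    }
    for program, (passed, err) in cross_check(validation_data, results).items():
        print(f"{program}: {'PASS' if passed else 'FAIL'} (max error {err:.4f})")
    ```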

  6. Advanced Spectroscopic and Thermal Imaging Instrumentation for Shock Tube and Ballistic Range Facilities

    DTIC Science & Technology

    2010-04-01

    the development process, increase its quality and reduce development time through automation of synthesis, analysis or verification. For this purpose...made of time-non-deterministic systems, improving efficiency and reducing complexity of formal analysis . We also show how our theory relates to, and...of the most recent investigations for Earth and Mars atmospheres will be discussed in the following sections. 2.4.1 Earth: lunar return NASA’s

  7. Maximizing coupling-efficiency of high-power diode lasers utilizing hybrid assembly technology

    NASA Astrophysics Data System (ADS)

    Zontar, D.; Dogan, M.; Fulghum, S.; Müller, T.; Haag, S.; Brecher, C.

    2015-03-01

    In this paper, we present hybrid assembly technology to maximize coupling efficiency for spatially combined laser systems. High quality components, such as center-turned focusing units, as well as suitable assembly strategies are necessary to obtain the highest possible output ratios. Alignment strategies are challenging tasks due to their complexity and sensitivity. Especially in low-volume production, fully automated systems are economically at a disadvantage, as operator experience is often expensive. However, reproducibility and quality of automatically assembled systems can be superior. Therefore automated and manual assembly techniques are combined to obtain high coupling efficiency while preserving maximum flexibility. The paper describes the necessary equipment and software to enable hybrid assembly processes. Micromanipulator technology with high step-resolution and six degrees of freedom provides a large number of possible evaluation points. Automated algorithms are necessary to speed up data gathering and alignment in order to efficiently utilize the available granularity for manual assembly processes. Furthermore, an engineering environment is presented to enable rapid prototyping of automation tasks with simultaneous data evaluation. Integration with simulation environments, e.g. Zemax, allows the verification of assembly strategies in advance. Data-driven decision making ensures constant high quality, documents the assembly process and is a basis for further improvement. The hybrid assembly technology has been applied in several applications with efficiencies above 80% and will be discussed in this paper. High-level coupling efficiency has been achieved with minimized assembly as a result of semi-automated alignment. This paper will focus on hybrid automation for optimizing and attaching turning mirrors and collimation lenses.

  8. Automated biowaste sampling system urine subsystem operating model, part 1

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detail design, fabrication, and verification testing of an operating model of the subsystem.

  9. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  10. Capturing Safety Requirements to Enable Effective Task Allocation Between Humans and Automation in Increasingly Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Neogi, Natasha A.

    2016-01-01

    There is a current drive towards enabling the deployment of increasingly autonomous systems in the National Airspace System (NAS). However, shifting the traditional roles and responsibilities between humans and automation for safety critical tasks must be managed carefully, otherwise the current emergent safety properties of the NAS may be disrupted. In this paper, a verification activity is conducted to assess the emergent safety properties of a clearly defined, safety critical, operational scenario whose tasks can be fluidly allocated between human and automated agents. Task allocation role sets were proposed for a human-automation team performing a contingency maneuver in a reduced crew context. A safety critical contingency procedure (engine out on takeoff) was modeled in the Soar cognitive architecture, then translated into the Hybrid Input Output formalism. Verification activities were then performed to determine whether or not the safety properties held over the increasingly autonomous system. The verification activities led to several key insights regarding the implicit assumptions on agent capability, illustrated the usefulness of task annotations associated with specialized requirements (e.g., communication, timing, etc.), and demonstrated the feasibility of this approach.

  11. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  12. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  13. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  14. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  15. 30 CFR 227.600 - What automated verification functions may a State perform?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... involves systematic monitoring of production and royalty reports to identify and resolve reporting or... reported by royalty reporters to sales and transfer volumes reported by production reporters. If you request delegation of automated comparison of sales and production volumes, you must perform at least the...

  16. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  17. Fault management for the Space Station Freedom control center

    NASA Technical Reports Server (NTRS)

    Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet

    1992-01-01

    This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
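
    A toy illustration of the digraph idea described above: failure propagation is modeled as a directed graph, and a candidate failure source is any node whose reachable set covers all observed fault indications. This is a minimal sketch, not the NASA digraph evaluation tool; the graph and node names are invented.

```python
"""Illustrative digraph-based fault isolation: an edge a -> b means a failure
of a can cause a failure indication at b."""

from collections import deque

DIGRAPH = {
    "pump_A":       ["pressure_low", "flow_low"],
    "valve_3":      ["flow_low"],
    "sensor_9":     ["pressure_low"],
    "pressure_low": [],
    "flow_low":     [],
}

def reachable(graph, start):
    """Breadth-first search: all nodes reachable from `start`, including itself."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def candidate_sources(graph, indications):
    """Nodes that can explain *all* observed fault indications."""
    indications = set(indications)
    return [n for n in graph if indications <= reachable(graph, n)]

if __name__ == "__main__":
    print(candidate_sources(DIGRAPH, {"pressure_low", "flow_low"}))  # ['pump_A']
```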

  18. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  19. Automated feature detection and identification in digital point-ordered signals

    DOEpatents

    Oppenlander, Jane E.; Loomis, Kent C.; Brudnoy, David M.; Levy, Arthur J.

    1998-01-01

    A computer-based automated method to detect and identify features in digital point-ordered signals. The method is used for processing of non-destructive test signals, such as eddy current signals obtained from calibration standards. The signals are first automatically processed to remove noise and to determine a baseline. Next, features are detected in the signals using mathematical morphology filters. Finally, verification of the features is made using an expert system of pattern recognition methods and geometric criteria. The method has the advantage that standard features can be located without prior knowledge of the number or sequence of the features. Further advantages are that standard features can be differentiated from irrelevant signal features such as noise, and detected features are automatically verified by parameters extracted from the signals. The method proceeds fully automatically without initial operator set-up and without subjective operator feature judgement.
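
    The sketch below illustrates the general flow the abstract describes (noise removal, baseline determination, morphology-based detection) using standard scipy morphology filters; it is not the patented method, and the window sizes and threshold are illustrative assumptions.

```python
"""Sketch: median filter for noise, grey-scale opening for the baseline, and
contiguous runs of the residual above a robust threshold as candidate features."""

import numpy as np
from scipy.ndimage import grey_opening, median_filter

def contiguous_runs(mask):
    """Group the True indices of a boolean array into (start, stop) runs."""
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return []
    breaks = np.flatnonzero(np.diff(idx) > 1)
    starts = np.r_[idx[0], idx[breaks + 1]]
    stops = np.r_[idx[breaks], idx[-1]]
    return list(zip(starts, stops))

def detect_features(signal, noise_window=5, baseline_window=151, k_sigma=5.0):
    smooth = median_filter(signal, size=noise_window)       # noise removal
    baseline = grey_opening(smooth, size=baseline_window)   # baseline estimate
    residual = smooth - baseline
    sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    return contiguous_runs(residual > k_sigma * sigma)

if __name__ == "__main__":
    x = np.linspace(0, 10, 2000)
    drift = 0.5 * x                                          # slow baseline drift
    bumps = np.exp(-((x - 3) ** 2) / 0.01) + np.exp(-((x - 7) ** 2) / 0.01)
    rng = np.random.default_rng(0)
    signal = drift + bumps + rng.normal(0, 0.02, x.size)
    print(detect_features(signal))                           # runs near x = 3 and x = 7
```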

  20. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  1. Automated Array Assembly, Phase 2

    NASA Technical Reports Server (NTRS)

    Daiello, R. V.

    1978-01-01

    The purpose of the overall program is to establish technological readiness and provide verification for the elements of a manufacturing sequence which would ultimately be suitable for the large-scale production of silicon solar-array modules at a selling price of less than $500/kW. A program and process plan for accomplishing this objective was developed and put into operation. Three junction-formation processes are shown; since cost analysis shows that they do not differ greatly in cost, each should be considered for technical merits and possible future cost reduction. The progress made in the various process steps of the plan is described, and conclusions are presented.

  2. AIRSAR Web-Based Data Processing

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne

    2007-01-01

    The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing, in which the software can automatically generate a quick-look image overnight from an entire 90-GB, 32-MB/s SAR raw-data tape without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a queue management system. Users obtain automatically generated, co-registered multi-frequency images as the software performs polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for one of the 12 radar modes.

  3. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template that is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
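
    As a rough illustration of the spreadsheet-to-register-description step, the sketch below turns designer-filled rows into a simplified, IP-XACT-flavoured XML fragment. The column names and XML layout are assumptions made for the example; a production flow would target the full IP-XACT schema and a commercial register-model generator.

```python
"""Sketch: translate spreadsheet rows (name, offset, width, access, reset)
into a simplified register-description XML fragment."""

import csv
import io
import xml.etree.ElementTree as ET

SPREADSHEET = """name,offset,width,access,reset
CTRL,0x00,32,read-write,0x00000000
STATUS,0x04,32,read-only,0x00000001
IRQ_MASK,0x08,32,read-write,0xFFFFFFFF
"""

def rows_to_xml(csv_text, block_name="regblock"):
    block = ET.Element("addressBlock", name=block_name)
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = ET.SubElement(block, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["width"]
        ET.SubElement(reg, "access").text = row["access"]
        ET.SubElement(reg, "reset").text = row["reset"]
    return ET.tostring(block, encoding="unicode")

if __name__ == "__main__":
    print(rows_to_xml(SPREADSHEET))
```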

  4. Prototype automated post-MECO ascent I-load Verification Data Table

    NASA Technical Reports Server (NTRS)

    Lardas, George D.

    1990-01-01

    A prototype automated processor for quality assurance of Space Shuttle post-Main Engine Cut Off (MECO) ascent initialization parameters (I-loads) is described. The processor incorporates CLIPS rules adapted from the quality assurance criteria for the post-MECO ascent I-loads. Specifically, the criteria are implemented for nominal and abort targets, as given in the 'I-load Verification Data Table, Part 3, Post-MECO Ascent, Version 2.1, December 1989.' This processor, ivdt, compares a given I-load set with the stated mission design and quality assurance criteria. It determines which I-loads violate the stated criteria, and presents a summary of I-loads that pass or fail the tests.
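
    A toy illustration of the screening idea, not the ivdt processor itself: each I-load is checked against stated bounds and a pass/fail summary is produced. The parameter names and limits below are invented for the example.

```python
"""Check a candidate I-load set against hypothetical design bounds."""

CRITERIA = {  # hypothetical parameter: (lower bound, upper bound)
    "TGT_APOGEE_NM":  (100.0, 160.0),
    "TGT_PERIGEE_NM": (50.0, 105.0),
    "MECO_VEL_FPS":   (25000.0, 26500.0),
}

def verify_iloads(iloads):
    """Return a list of criteria violations; an empty list means PASS."""
    failures = []
    for name, (lo, hi) in CRITERIA.items():
        value = iloads.get(name)
        if value is None:
            failures.append(f"{name}: missing from I-load set")
        elif not (lo <= value <= hi):
            failures.append(f"{name}: {value} outside [{lo}, {hi}]")
    return failures

if __name__ == "__main__":
    candidate = {"TGT_APOGEE_NM": 150.0, "TGT_PERIGEE_NM": 40.0, "MECO_VEL_FPS": 25800.0}
    problems = verify_iloads(candidate)
    print("PASS" if not problems else "\n".join(problems))
```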

  5. 46 CFR 62.25-25 - Programmable systems and devices.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Programmable systems and devices. 62.25-25 Section 62.25... AUTOMATION General Requirements for All Automated Vital Systems § 62.25-25 Programmable systems and devices. (a) Programmable control or alarm system logic must not be altered after Design Verification testing...

  6. Automated analysis in generic groups

    NASA Astrophysics Data System (ADS)

    Fagerholm, Edvard

    This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings, namely symmetric or asymmetric (leveled) k-linear groups, and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies the optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of the minimal number of pairing computations needed to verify structure-preserving signatures. We build an automated tool to search for structure-preserving signatures matching a template. Through exhaustive search we conjecture lower bounds for the number of pairings required in the Type II setting and prove our conjecture to be true. Finally, our tool exhibits examples of structure-preserving signatures matching the lower bounds, which proves the tightness of our bounds and improves on previously known structure-preserving signature schemes.

  7. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output, but basically run independently from each other. These processes are textured urban terrain reconstruction and road verification. The first process contains a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect), and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input for the road verification process, together with initial road database entries. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of geometrically and topologically inconsistent objects. Depending on timing constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are: pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
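
    The mapping to the three states correct/incorrect/unknown can be illustrated with a small Dempster-Shafer sketch: the road-state distribution is discounted by the model's applicability, and masses from two methods are fused with Dempster's rule. The numbers and the exact mapping are assumptions, not the authors' formulation.

```python
"""Illustrative Dempster-Shafer fusion over the frame {correct, incorrect};
'unknown' denotes mass on the whole frame (ignorance)."""

def to_mass(p_correct, p_applicable):
    """Discount the road-state distribution by the model's applicability:
    whatever the model cannot vouch for goes to the ignorance mass."""
    return {
        "correct":   p_correct * p_applicable,
        "incorrect": (1.0 - p_correct) * p_applicable,
        "unknown":   1.0 - p_applicable,
    }

def combine(m1, m2):
    """Dempster's rule: 'unknown' is compatible with everything; the only
    conflict is correct vs incorrect."""
    out = {"correct": 0.0, "incorrect": 0.0, "unknown": 0.0}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            if a == "unknown":
                focal = b
            elif b == "unknown":
                focal = a
            elif a == b:
                focal = a
            else:  # conflicting singleton hypotheses
                conflict += wa * wb
                continue
            out[focal] += wa * wb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in out.items()}

if __name__ == "__main__":
    m_a = to_mass(p_correct=0.9, p_applicable=0.8)  # method A
    m_b = to_mass(p_correct=0.6, p_applicable=0.3)  # method B
    print(combine(m_a, m_b))
```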

  8. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  9. Advanced Distributed Measurements and Data Processing at the Vibro-Acoustic Test Facility, GRC Space Power Facility, Sandusky, Ohio - an Architecture and an Example

    NASA Technical Reports Server (NTRS)

    Hill, Gerald M.; Evans, Richard K.

    2009-01-01

    A large-scale, distributed, high-speed data acquisition system (HSDAS) is currently being installed at the Space Power Facility (SPF) at NASA Glenn Research Center's Plum Brook Station in Sandusky, OH. This installation is being done as part of a facility construction project to add Vibro-acoustic Test Capabilities (VTC) to the current thermal-vacuum testing capability of SPF in support of the Orion Project's requirement for Space Environments Testing (SET). The HSDAS architecture is a modular design which utilizes fully remotely managed components, enabling the system to support multiple test locations with a wide range of measurement types and a very large system channel count. The architecture of the system is presented along with details on system scalability and measurement verification. In addition, the ability of the system to automate many of its processes, such as measurement verification and measurement system analysis, is also discussed.

  10. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
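
    A minimal sketch of the bit-for-bit regression check described above, assuming in-memory numpy arrays rather than LIVVkit's actual data handling or API:

```python
"""Compare a test run's variables against a reference run: report bit-for-bit
matches, tolerance results, or the largest discrepancy per variable."""

import numpy as np

def compare_variables(test, reference, rel_tol=0.0):
    """`test` and `reference` map variable names to numpy arrays."""
    report = {}
    for name, ref in reference.items():
        out = test.get(name)
        if out is None:
            report[name] = "missing in test run"
        elif out.shape != ref.shape:
            report[name] = f"shape mismatch {out.shape} vs {ref.shape}"
        elif np.array_equal(out, ref):
            report[name] = "bit-for-bit"
        else:
            max_abs = float(np.max(np.abs(out - ref)))
            status = "within tolerance" if np.allclose(out, ref, rtol=rel_tol) else "DIFFERS"
            report[name] = f"{status}, max |diff| = {max_abs:.3e}"
    return report

if __name__ == "__main__":
    ref = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4))}
    new = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4)) + 1e-9}
    for var, status in compare_variables(new, ref, rel_tol=1e-6).items():
        print(f"{var:10s}: {status}")
```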

  11. Pharmacy Automation in Navy Medicine: A Study of Naval Medical Center San Diego

    DTIC Science & Technology

    2015-09-01

    Excerpts and figure captions from the report describe the automated dispensing workflow at Naval Medical Center San Diego: a robotic delivery system fills the vial with medication, caps the vial, and affixes the label; the completed prescription is then placed on a conveyor belt for routing to pharmacist verification, with the automated system performing all steps, including transportation, up to pharmacist verification. Manual fills are located along the conveyor system.

  12. Application of automated measurement and verification to utility energy efficiency program data

    DOE PAGES

    Granderson, Jessica; Touzani, Samir; Fernandes, Samuel; ...

    2017-02-17

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, and indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
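
    To make the whole-building baseline idea concrete, the sketch below fits a deliberately simplified temperature-only baseline to a synthetic pre-retrofit period, projects it into the post-retrofit period, and reports avoided energy use together with the baseline CV(RMSE), the goodness-of-fit statistic that Guideline-14-style criteria are applied to. It is an illustrative assumption-laden example, not the automated M&V 2.0 tool used in the study.

```python
"""Simplified whole-building M&V: polynomial baseline in outdoor temperature."""

import numpy as np

def fit_baseline(temp, energy, degree=2):
    """Ordinary least-squares polynomial in outdoor temperature, a stand-in
    for richer time-of-week-and-temperature baseline models."""
    coeffs = np.polyfit(temp, energy, degree)
    predict = lambda t: np.polyval(coeffs, t)
    resid = energy - predict(temp)
    cv_rmse = np.sqrt(np.mean(resid ** 2)) / np.mean(energy)
    return predict, cv_rmse

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic pre-period: cooling-dominated daily load plus noise.
    t_pre = rng.uniform(5, 35, 365)
    e_pre = 200 + 8 * np.maximum(t_pre - 18, 0) + rng.normal(0, 10, 365)
    # Synthetic post-period: same weather response minus a 15% retrofit saving.
    t_post = rng.uniform(5, 35, 180)
    e_post = 0.85 * (200 + 8 * np.maximum(t_post - 18, 0)) + rng.normal(0, 10, 180)

    baseline, cv_rmse = fit_baseline(t_pre, e_pre)
    avoided = np.sum(baseline(t_post) - e_post)
    print(f"baseline CV(RMSE): {cv_rmse:.3f}")
    print(f"avoided energy use over post period: {avoided:.0f} kWh")
```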

  13. Application of automated measurement and verification to utility energy efficiency program data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Fernandes, Samuel

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, and indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  15. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarroll, R; UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX; Beadle, B

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse’s Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, the other used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2cm. Using a logit model the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5cm (brain), 0.75cm (mandible, cord), 1cm (brainstem, cochlea), or 1.25cm (parotid), with sensitivity and specificity greater than 0.95. If sensitivity and specificity constraints are reduced to 0.9, detectable shifts of mandible and brainstem were reduced by 0.25cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to flag auto-contours for special review or used with safety margins in a fully automatic treatment planning system.
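
    The verification step can be illustrated with a small logistic-regression sketch over the two covariates named in the abstract (mean distance to agreement and true positive rate). The data here are synthetic and the fitted model is purely illustrative of the approach, not the study's per-structure models.

```python
"""Toy logit model flagging auto-contours that appear shifted."""

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
# Synthetic "correct" contours: small mean distance to agreement (MDA), high TPR.
mda_ok, tpr_ok = rng.normal(1.0, 0.3, n), rng.normal(0.95, 0.02, n)
# Synthetic "shifted" contours: larger MDA, lower TPR.
mda_bad, tpr_bad = rng.normal(6.0, 1.5, n), rng.normal(0.60, 0.10, n)

X = np.column_stack([np.r_[mda_ok, mda_bad], np.r_[tpr_ok, tpr_bad]])
y = np.r_[np.zeros(n), np.ones(n)]            # 1 = flagged as shifted

model = LogisticRegression().fit(X, y)
candidate = np.array([[3.5, 0.80]])           # metrics for a new auto-contour
print("P(shifted) =", float(model.predict_proba(candidate)[0, 1]))
```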

  16. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
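
    In the spirit of the event-stream monitoring described above (though not JPAX itself), the toy monitor below checks a simple safety property, "every acquired lock is eventually released", over an emitted execution trace; the event names and trace are invented for the example.

```python
"""Toy runtime monitor over an event trace of (event, lock) pairs."""

def monitor(trace):
    """Return the set of locks still held when the trace ends (violations);
    an empty set means the trace satisfies the property."""
    held = set()
    for step, (event, lock) in enumerate(trace):
        if event == "acquire":
            if lock in held:
                print(f"step {step}: re-acquire of {lock} while held")
            held.add(lock)
        elif event == "release":
            if lock not in held:
                print(f"step {step}: release of {lock} that was never acquired")
            held.discard(lock)
    return held

if __name__ == "__main__":
    trace = [("acquire", "L1"), ("acquire", "L2"), ("release", "L2")]  # L1 never released
    print("violations at end of trace:", monitor(trace))
```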

  17. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  18. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  19. Spaceport Command and Control System Automated Verification Software Development

    NASA Technical Reports Server (NTRS)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system and out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  20. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.

  1. Development of an automated on-line pepsin digestion-liquid chromatography-tandem mass spectrometry configuration for the rapid analysis of protein adducts of chemical warfare agents.

    PubMed

    Carol-Visser, Jeroen; van der Schans, Marcel; Fidder, Alex; Hulst, Albert G; van Baar, Ben L M; Irth, Hubertus; Noort, Daan

    2008-07-01

    Rapid monitoring and retrospective verification are key issues in protection against and non-proliferation of chemical warfare agents (CWA). Such monitoring and verification are adequately accomplished by the analysis of persistent protein adducts of these agents. Liquid chromatography-mass spectrometry (LC-MS) is the tool of choice in the analysis of such protein adducts, but the overall experimental procedure is quite elaborate. Therefore, an automated on-line pepsin digestion-LC-MS configuration has been developed for the rapid determination of CWA protein adducts. The utility of this configuration is demonstrated by the analysis of specific adducts of sarin and sulfur mustard to human butyryl cholinesterase and human serum albumin, respectively.

  2. Comparison on Human Resource Requirement between Manual and Automated Dispensing Systems.

    PubMed

    Noparatayaporn, Prapaporn; Sakulbumrungsil, Rungpetch; Thaweethamcharoen, Tanita; Sangseenil, Wunwisa

    2017-05-01

    This study was conducted to compare human resource requirement among manual, automated, and modified automated dispensing systems. Data were collected from the pharmacy department at the 2100-bed university hospital (Siriraj Hospital, Bangkok, Thailand). Data regarding the duration of the medication distribution process were collected by using self-reported forms for 1 month. The data on the automated dispensing machine (ADM) system were obtained from 1 piloted inpatient ward, whereas those on the manual system were the average of other wards. Data on dispensing, returned unused medication, and stock management processes under the traditional manual system and the ADM system were from actual activities, whereas the modified ADM system was modeled. The full-time equivalent (FTE) of each model was estimated for comparison. The result showed that the manual system required 46.84 FTEs of pharmacists and 132.66 FTEs of pharmacy technicians. By adding pharmacist roles on screening and verification under the ADM system, the ADM system required 117.61 FTEs of pharmacists. Replacing counting and filling medication functions by ADM has decreased the number of pharmacy technicians to 55.38 FTEs. After the modified ADM system canceled the return unused medication process, FTEs requirement for pharmacists and pharmacy technicians decreased to 69.78 and 51.90 FTEs, respectively. The ADM system decreased the workload of pharmacy technicians, whereas it required more time from pharmacists. However, the increased workload of pharmacists was associated with more comprehensive patient care functions, which resulted from the redesigned work process. Copyright © 2017. Published by Elsevier Inc.

  3. Poster — Thur Eve — 55: An automated XML technique for isocentre verification on the Varian TrueBeam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asiev, Krum; Mullins, Joel; DeBlois, François

    2014-08-15

    Isocentre verification tests, such as the Winston-Lutz (WL) test, have gained popularity in recent years as techniques such as stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments are more commonly performed on radiotherapy linacs. These highly conformal treatments require frequent monitoring of the geometric accuracy of the isocentre to ensure proper radiation delivery. At our clinic, the WL test is performed by acquiring with the EPID a collection of 8 images of a WL phantom fixed on the couch for various couch/gantry angles. This set of images is later analyzed to determine the isocentre size. The current work addresses the acquisition process. A manual WL test acquisition performed by an experienced physicist takes on average 25 minutes and is prone to user manipulation errors. We have automated this acquisition on a Varian TrueBeam STx linac (Varian, Palo Alto, USA). The Varian developer mode allows the execution of custom-made XML script files to control all aspects of the linac operation. We have created an XML-WL script that cycles through each couch/gantry combination, taking an EPID image at each position. This automated acquisition is done in less than 4 minutes. The reproducibility of the method was verified by repeating the execution of the XML file 5 times. The analysis of the images showed variation of the isocentre size of less than 0.1 mm along the X, Y and Z axes, and compares favorably to a manual acquisition, for which we typically observe variations of up to 0.5 mm.
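
    The structure of the automated acquisition can be sketched as follows. The real implementation is a Varian Developer Mode XML script, so the machine-control calls and the exact couch/gantry set below are hypothetical stand-ins used only to show the shape of the loop (8 positions, one EPID image each).

```python
"""Sketch of an automated Winston-Lutz acquisition sequence with stubbed
machine-control functions."""

# (couch, gantry) angles in degrees: an assumed, typical Winston-Lutz set.
POSITIONS = [(0, 0), (0, 90), (0, 180), (0, 270),
             (45, 0), (90, 0), (270, 0), (315, 0)]

def set_couch(angle_deg):
    print(f"  couch  -> {angle_deg:5.1f} deg")

def set_gantry(angle_deg):
    print(f"  gantry -> {angle_deg:5.1f} deg")

def acquire_epid_image(tag):
    print(f"  EPID image acquired: {tag}")

def run_winston_lutz():
    for i, (couch, gantry) in enumerate(POSITIONS, start=1):
        print(f"position {i}/{len(POSITIONS)}")
        set_couch(couch)
        set_gantry(gantry)
        acquire_epid_image(f"WL_c{couch:03d}_g{gantry:03d}")

if __name__ == "__main__":
    run_winston_lutz()
```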

  4. Automatic OPC repair flow: optimized implementation of the repair recipe

    NASA Astrophysics Data System (ADS)

    Bahnas, Mohamed; Al-Imam, Mohamed; Word, James

    2007-10-01

    Virtual manufacturing, enabled by rapid, accurate, full-chip simulation, is a main pillar in achieving successful mask tape-out in cutting-edge low-k1 lithography. It facilitates detecting printing failures before a costly and time-consuming mask tape-out and wafer print occur. The role of the OPC verification step is critical in the early production phases of a new process development, since various layout patterns will be suspected of failing or of causing performance degradation, and in turn need to be accurately flagged and fed back to the OPC engineer for further learning and enhancement of the OPC recipe. At the advanced phases of process development there is much less probability of detecting failures, but the OPC verification step still acts as the last line of defense for the entire implemented RET work. In a recent publication the optimal approach to responding to these detected failures was addressed, and a solution was proposed to repair these defects in an automated methodology, fully integrated with and compatible with the main RET/OPC flow. In this paper the authors will present further work and optimizations of this repair flow. An automated analysis methodology for the root causes of the defects, and a classification covering all possible causes, will be discussed. This automated analysis approach will include all the learning from previously highlighted causes and incorporate any new discoveries. Next, according to the automated pre-classification of the defects, the appropriate OPC repair approach (i.e. OPC knob) can easily be selected for each classified defect location, instead of applying all approaches at all locations. This will help cut down the runtime of the OPC repair processing and reduce the number of iterations needed to reach the status of zero defects. An output report on the existing causes of defects and how the tool handled them will be generated. The report will help further learning and facilitate the enhancement of the main OPC recipe. Accordingly, the main OPC recipe can be made more robust, converging faster and probably in fewer iterations. This knowledge feedback loop is one of the fruitful benefits of the Automatic OPC Repair flow.

  5. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

    Routine whole genome analysis for infectious diseases can inform various public health scenarios, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and studying how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Service, Public Health England, in the UK. The performance characteristics and quality control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described, and include information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.

  6. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  7. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  8. Experiment Automation with a Robot Arm using the Liquids Reflectometer Instrument at the Spallation Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zolnierczuk, Piotr A; Vacaliuc, Bogdan; Sundaram, Madhan

    The Liquids Reflectometer instrument installed at the Spallation Neutron Source (SNS) enables observations of chemical kinetics, solid-state reactions and phase-transitions of thin film materials at both solid and liquid surfaces. Effective measurement of these behaviors requires each sample to be calibrated dynamically using the neutron beam and the data acquisition system in a feedback loop. Since the SNS is an intense neutron source, the time needed to perform the measurement can be the same as the alignment process, leading to a labor-intensive operation that is exhausting to users. An update to the instrument control system, completed in March 2013, implemented the key features of automated sample alignment and robot-driven sample management, allowing for unattended operation over extended periods, lasting as long as 20 hours. We present a case study of the effort, detailing the mechanical, electrical and software modifications that were made as well as the lessons learned during the integration, verification and testing process.

  9. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  10. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
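
    LIVVkit's numerical verification centers on this kind of regression comparison. The sketch below shows a bit-for-bit check between a test run and a reference run; the NetCDF layout and variable names are assumptions for illustration, not LIVVkit's actual API.

      import numpy as np
      from netCDF4 import Dataset

      def bit_for_bit(test_file, ref_file, variables):
          """Return, per variable, whether test and reference agree exactly,
          and the maximum absolute difference if they do not."""
          results = {}
          with Dataset(test_file) as test, Dataset(ref_file) as ref:
              for name in variables:
                  t = np.asarray(test.variables[name][:])
                  r = np.asarray(ref.variables[name][:])
                  if t.shape != r.shape:
                      results[name] = {"bit_for_bit": False, "max_abs_diff": float("nan")}
                      continue
                  identical = np.array_equal(t, r)
                  max_diff = 0.0 if identical else float(np.max(np.abs(t - r)))
                  results[name] = {"bit_for_bit": identical, "max_abs_diff": max_diff}
          return results

      # Example (hypothetical file names and variables):
      # report = bit_for_bit("test/ice_out.nc", "ref/ice_out.nc", ["thk", "velnorm"])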

  11. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  12. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing, and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit, the 'BAK-76', contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  13. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time each TEAMS modeler must spend manually preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report pertaining to all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  14. SU-E-T-406: Use of TrueBeam Developer Mode and API to Increase the Efficiency and Accuracy of Commissioning Measurements for the Varian EDGE Stereotactic Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, S; Gulam, M; Song, K

    2014-06-01

    Purpose: The Varian EDGE machine is a new stereotactic platform, combining Calypso and VisionRT localization systems with a stereotactic linac. The system includes TrueBeam DeveloperMode, making possible the use of XML-scripting for automation of linac-related tasks. This study details the use of DeveloperMode to automate commissioning tasks for Varian EDGE, thereby improving efficiency and measurement consistency. Methods: XML-scripting was used for various commissioning tasks, including couch model verification, beam scanning, and isocenter verification. For couch measurements, point measurements were acquired for several field sizes (2×2, 4×4, 10×10 cm²) at 42 gantry angles for two couch models. Measurements were acquired with variations in couch position (rails in/out, couch shifted in each of the motion axes) and compared to treatment planning system (TPS)-calculated values, which were logged automatically through advanced planning interface (API) scripting functionality. For beam scanning, XML-scripts were used to create custom MLC apertures. For isocenter verification, XML-scripts were used to automate various Winston-Lutz-type tests. Results: For couch measurements, the time required for each set of angles was approximately 9 minutes. Without scripting, each set required approximately 12 minutes. Automated measurements required only one physicist, while manual measurements required at least two physicists to handle linac positions/beams and data recording. MLC apertures were generated outside of the TPS, and with the .xml file format, double-checking without use of the TPS/operator console was possible. Similar time efficiency gains were found for isocenter verification measurements. Conclusion: The use of XML scripting in TrueBeam DeveloperMode allows for efficient and accurate data acquisition during commissioning. The efficiency improvement is most pronounced for iterative measurements, exemplified by the time savings for couch modeling measurements (approximately 10 hours). The scripting also allowed for creation of the files in advance without requiring access to the TPS. The API scripting functionality enabled efficient creation/mining of TPS data. Finally, automation reduces the potential for human error in entering linac values at the machine console, and the script provides a log of measurements acquired for each session. This research was supported in part by a grant from Varian Medical Systems, Palo Alto, CA.
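
    The gain from scripting comes largely from generating these repetitive control points programmatically. The sketch below illustrates the idea; the XML element names are placeholders chosen for readability, not Varian's actual TrueBeam DeveloperMode beam schema.

      import xml.etree.ElementTree as ET

      def build_couch_script(gantry_angles, field_size_cm=10.0, mu_per_point=50):
          """Build one measurement point per gantry angle; element names are placeholders."""
          root = ET.Element("MeasurementSequence")          # placeholder element
          for angle in gantry_angles:
              pt = ET.SubElement(root, "ControlPoint")      # placeholder element
              ET.SubElement(pt, "GantryRtn").text = f"{angle:.1f}"
              ET.SubElement(pt, "FieldX").text = f"{field_size_cm:.1f}"
              ET.SubElement(pt, "FieldY").text = f"{field_size_cm:.1f}"
              ET.SubElement(pt, "MU").text = str(mu_per_point)
          return ET.ElementTree(root)

      # 42 evenly spaced gantry angles, as in the couch-model measurements above.
      angles = [i * 360.0 / 42 for i in range(42)]
      build_couch_script(angles).write("couch_10x10.xml", xml_declaration=True)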

  15. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification were described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  16. Verifying the Modal Logic Cube Is an Easy Task (For Higher-Order Automated Reasoners)

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Prominent logics, including quantified multimodal logics, can be elegantly embedded in simple type theory (classical higher-order logic). Furthermore, off-the-shelf reasoning systems for simple type theory exist that can be uniformly employed for reasoning within and about embedded logics. In this paper we focus on reasoning about modal logics and exploit our framework for the automated verification of inclusion and equivalence relations between them. Related work has applied first-order automated theorem provers for the task. Our solution achieves significant improvements, most notably, with respect to elegance and simplicity of the problem encodings as well as with respect to automation performance.
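
    The core of the approach is the Kripke-style encoding of modal formulas as predicates on worlds in simple type theory. The LaTeX sketch below states the usual form of this embedding; the notation follows common presentations rather than the paper's exact definitions.

      % Worlds have type \iota; the accessibility relation R has type \iota \to \iota \to o.
      % Modal formulas are encoded as predicates on worlds:
      \[
      \begin{aligned}
        \neg^{*}\varphi          &:= \lambda w.\ \neg\,(\varphi\, w) \\
        \varphi \vee^{*} \psi    &:= \lambda w.\ \varphi\, w \vee \psi\, w \\
        \Box\varphi              &:= \lambda w.\ \forall v.\ R\, w\, v \rightarrow \varphi\, v \\
        \Diamond\varphi          &:= \lambda w.\ \exists v.\ R\, w\, v \wedge \varphi\, v \\
        \mathrm{valid}\,\varphi  &:= \forall w.\ \varphi\, w
      \end{aligned}
      \]
      % Inclusion questions in the modal logic cube then become higher-order statements
      % about R, e.g. that reflexivity and transitivity of R imply validity of the
      % T and 4 axiom schemata.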

  17. A Practical Approach to Implementing Real-Time Semantics

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance

    1999-01-01

    This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus-protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking for verifying several mandatory properties of the bus protocol.

  18. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  19. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  20. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  1. Knowledge-based verification of clinical guidelines by detection of anomalies.

    PubMed

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy to examine a system for potential defects consists in its verification. The focus of this work is to present an approach that helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general in order to allow its application to several other guideline representation formats.

  2. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  3. Computer imaging and workflow systems in the business office.

    PubMed

    Adams, W T; Veale, F H; Helmick, P M

    1999-05-01

    Computer imaging and workflow technology automates many business processes that currently are performed using paper processes. Documents are scanned into the imaging system and placed in electronic patient account folders. Authorized users throughout the organization, including preadmission, verification, admission, billing, cash posting, customer service, and financial counseling staff, have online access to the information they need when they need it. Such streamlining of business functions can increase collections and customer satisfaction while reducing labor, supply, and storage costs. Because the costs of a comprehensive computer imaging and workflow system can be considerable, healthcare organizations should consider implementing parts of such systems that can be cost-justified or include implementation as part of a larger strategic technology initiative.

  4. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1996-01-01

    The research is directed towards the design and implementation of a comprehensive deductive environment for the development of high-assurance systems, especially reactive (concurrent, real-time, and hybrid) systems. Reactive systems maintain an ongoing interaction with their environment, and are among the most difficult to design and verify. The project aims to provide engineers with a wide variety of tools within a single, general, formal framework in which the tools will be most effective. The entire development process is considered, including the construction, transformation, validation, verification, debugging, and maintenance of computer systems. The goal is to automate the process as much as possible and reduce the errors that pervade hardware and software development.

  5. Automated finite element meshing of the lumbar spine: Verification and validation with 18 specimen-specific models.

    PubMed

    Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J

    2016-09-06

    The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8Nm with follower loads up to 1000N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque rotation response, disc pressures, and facet forces. Copyright © 2016 Elsevier Ltd. All rights reserved.
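
    The convergence criterion reported above reduces to a relative-difference test between successive mesh densities. A minimal sketch, with illustrative variable names rather than the authors' code:

      def mesh_converged(coarse, fine, tol=0.05):
          """coarse/fine map output names (e.g. 'torque_rotation', 'disc_pressure',
          'facet_force') to scalar values from the original and density-doubled meshes.
          Convergence is declared when every relative change stays under tol (5%)."""
          for name, coarse_val in coarse.items():
              denom = max(abs(coarse_val), 1e-12)   # guard against zero-valued outputs
              rel_change = abs(fine[name] - coarse_val) / denom
              if rel_change > tol:
                  return False
          return True

      # Example with hypothetical values:
      # mesh_converged({"disc_pressure": 1.02}, {"disc_pressure": 1.05})  -> True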

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    DiCostanzo, D; Ayan, A; Woollard, J

    Purpose: To automate the daily verification of each patient’s treatment by utilizing the trajectory log files (TLs) written by the Varian TrueBeam linear accelerator, while reducing the number of false positives, including jaw and gantry positioning errors, that are displayed in the Treatment History tab of Varian’s Chart QA module. Methods: Small deviations in treatment parameters are difficult to detect in weekly chart checks, but may be significant in reducing delivery errors, and would be critical if detected daily. Software was developed in house to read TLs. Multiple functions were implemented within the software that allow it to operate via a GUI to analyze TLs, or as a script to run on a regular basis. In order to determine tolerance levels for the scripted analysis, 15,241 TLs from seven TrueBeams were analyzed. The maximum error of each axis for each TL was written to a CSV file and statistically analyzed to determine the tolerance for each axis accessible in the TLs to flag for manual review. The software/scripts developed were tested by varying the tolerance values to ensure veracity. After tolerances were determined, multiple weeks of manual chart checks were performed simultaneously with the automated analysis to ensure validity. Results: The tolerance values for the major axes were determined to be 0.025 degrees for the collimator, 1.0 degree for the gantry, 0.002 cm for the y-jaws, 0.01 cm for the x-jaws, and 0.5 MU for the MU. The automated verification of treatment parameters has been in clinical use for 4 months. During that time, no errors in machine delivery of the patient treatments were found. Conclusion: The process detailed here is a viable and effective alternative to manually checking treatment parameters during weekly chart checks.
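
    The tolerance-setting step lends itself to a small script. The sketch below reads per-axis maximum errors harvested from many trajectory logs and derives a flagging threshold per axis; the column names and the mean-plus-three-sigma rule are illustrative assumptions, not the authors' exact statistical procedure.

      import csv
      import statistics

      def derive_tolerances(csv_path, axes):
          """csv_path: one row per trajectory log, one column per axis holding
          that log's maximum error. Returns a per-axis flagging threshold."""
          errors = {axis: [] for axis in axes}
          with open(csv_path, newline="") as f:
              for row in csv.DictReader(f):
                  for axis in axes:
                      errors[axis].append(abs(float(row[axis])))
          # Flag anything beyond mean + 3 standard deviations of the historical maxima.
          return {axis: statistics.mean(v) + 3 * statistics.pstdev(v)
                  for axis, v in errors.items()}

      def flag_log(max_errors, tolerances):
          """Return the axes of a single trajectory log that exceed tolerance."""
          return [a for a, err in max_errors.items() if abs(err) > tolerances[a]]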

  7. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. Over one million lines of code, mostly in Ada, were also developed there for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed with the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.

  8. How automated access verification can help organizations demonstrate HIPAA compliance: A case study.

    PubMed

    Hill, Linda

    2006-01-01

    This case study of Sharp HealthCare takes an in-depth look at how the organization has embedded security policies into its business processes and automated workflow to ensure users are granted only the IT access that is necessary for them to perform their jobs and to ensure patient privacy. Some of the most pressing audit and compliance concerns in healthcare organizations today revolve around the need to constantly review and give an account for users' IT access. The need for this at Sharp is exacerbated because of the high rate of change within the organization and the large percentage of non-employee staff, such as traveling nurses, moving throughout hospital departments on their rotations. By implementing a software solution to verify the accuracy of user access rights or automatically initiate appropriate corrective actions, Sharp is now able to extend the responsibility and accountability for compliance to the most appropriate resources.

  9. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. Checking command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using a rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed, and the potential impact of this system on future satellite programs is examined.
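
    The rule-based constraints CCC encodes are of the kind sketched below; the command names and constraint forms are hypothetical, intended only to illustrate how timing and sequencing rules can be checked mechanically against a command file.

      def check_sequence(commands, min_separation_s, required_before):
          """commands: list of (time_s, name) tuples sorted by time.
          min_separation_s: {(cmd_a, cmd_b): seconds} minimum spacing between cmd_a and cmd_b.
          required_before: {cmd: prerequisite_cmd} ordering rules."""
          violations = []
          seen = set()
          last_time = {}
          for t, name in commands:
              prereq = required_before.get(name)
              if prereq and prereq not in seen:
                  violations.append(f"{name} at t={t}: missing prerequisite {prereq}")
              for (a, b), gap in min_separation_s.items():
                  if name == b and a in last_time and t - last_time[a] < gap:
                      violations.append(
                          f"{name} at t={t}: only {t - last_time[a]}s after {a}, needs {gap}s")
              seen.add(name)
              last_time[name] = t
          return violations

      # Hypothetical example: heater must precede camera power-on by at least 30 s.
      # check_sequence([(0, "HEATER_ON"), (10, "CAMERA_ON")],
      #                {("HEATER_ON", "CAMERA_ON"): 30}, {"CAMERA_ON": "HEATER_ON"})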

  10. Real-time automated failure identification in the Control Center Complex (CCC)

    NASA Technical Reports Server (NTRS)

    Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James

    1993-01-01

    A system that will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, verification, and validation. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology is then overviewed, including aspects of that choice and the process for migrating it into the control center.

  11. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management systems computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. To what extent these effects generalize to performance situations is not yet empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and committed significantly fewer automation-related errors than those who did not report this perception.

  12. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.

  13. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as the ease in data input and modifications.

  14. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the person is conducted by the responsible entity through the INS automated system (INS Systematic... designated INS office for review. (Form G-845S is available from the local INS Office.) (3) Failure of...

  15. EU-FP7-iMARS: Analysis of Mars Multi-Resolution Images Using Auto-Coregistration Data Mining and Crowd Source Techniques: Processed Results - a First Look

    NASA Astrophysics Data System (ADS)

    Muller, Jan-Peter; Tao, Yu; Sidiropoulos, Panagiotis; Gwinner, Klaus; Willner, Konrad; Fanara, Lida; Waehlisch, Marita; van Gasselt, Stephan; Walter, Sebastian; Steikert, Ralf; Schreiner, Bjoern; Ivanov, Anton; Cantini, Federico; Wardlaw, Jessica; Morley, Jeremy; Sprinks, James; Giordano, Michele; Marsh, Stuart; Kim, Jungrack; Houghton, Robert; Bamford, Steven

    2016-06-01

    Understanding planetary atmosphere-surface exchange and extra-terrestrial-surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to overlay image data and derived information from different epochs, back in time to the mid 1970s, to examine changes through time, such as the recent discovery of mass movement, tracking inter-year seasonal changes and looking for occurrences of fresh craters. Within the EU FP-7 iMars project, we have developed a fully automated multi-resolution DTM processing chain, called the Coregistration ASP-Gotcha Optimised (CASP-GO), based on the open source NASA Ames Stereo Pipeline (ASP) [Tao et al., this conference], which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX & HiRISE DTMs & ORIs, DLR [Gwinner et al., 2015] have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed by [Sidiropoulos & Muller, this conference]. Using the HRSC map products (both mosaics and orbital strips) as a map-base, it is being applied to many of the 400,000 level-1 EDR images taken by the four NASA orbital cameras, in particular the NASA Viking Orbiter camera (VO), Mars Orbiter Camera (MOC), Context Camera (CTX), and the High Resolution Imaging Science Experiment (HiRISE), going back to 1976. A webGIS has been developed [van Gasselt et al., this conference] for displaying this time sequence of imagery and will be demonstrated showing an example from one of the HRSC quadrangle map-sheets. Automated quality control [Sidiropoulos & Muller, 2015] techniques are applied to screen for suitable images and these are extended to detect temporal changes in features on the surface such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. For result verification these data mining techniques are then being employed within a citizen science project within the Zooniverse family. Examples of data mining and its verification will be presented.

  16. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  17. Third Annual Workshop on Space Operations Automation and Robotics (SOAR 1989)

    NASA Technical Reports Server (NTRS)

    Griffin, Sandy (Editor)

    1990-01-01

    Papers presented at the Third Annual Workshop on Space Operations Automation and Robotics (SOAR '89), hosted by the NASA Lyndon B. Johnson Space Center at Houston, Texas, on July 25 to 27, 1989, are given. Approximately 100 technical papers were presented by experts from NASA, the USAF, universities, and technical companies. Also held were panel discussions on Air Force/NASA Artificial Intelligence Overview and Expert System Verification and Validation.

  18. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.

  19. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
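
    The iterative combination of lexical and structural similarity can be sketched compactly. The measures, weights, and convergence test below are illustrative placeholders, not ASMOV's published formulas, and the final verification step is only hinted at in a comment.

      from difflib import SequenceMatcher

      def lexical_sim(a, b):
          return SequenceMatcher(None, a.lower(), b.lower()).ratio()

      def iterate_similarity(concepts1, concepts2, neighbours1, neighbours2,
                             w_lex=0.5, w_struct=0.5, rounds=10, eps=1e-3):
          """neighbours1/2 map each concept to related concepts drawn from the same
          concept list (e.g. parents and children in the ontology hierarchy)."""
          sim = {(c1, c2): lexical_sim(c1, c2) for c1 in concepts1 for c2 in concepts2}
          for _ in range(rounds):
              new_sim = {}
              for c1 in concepts1:
                  for c2 in concepts2:
                      n1 = neighbours1.get(c1, [])
                      n2 = neighbours2.get(c2, [])
                      struct = (sum(max(sim[(a, b)] for b in n2) for a in n1) / len(n1)
                                if n1 and n2 else 0.0)
                      new_sim[(c1, c2)] = w_lex * lexical_sim(c1, c2) + w_struct * struct
              converged = all(abs(new_sim[k] - sim[k]) < eps for k in sim)
              sim = new_sim
              if converged:
                  break
          # A real alignment would now select the best pairs and verify that they
          # introduce no semantic inconsistencies before accepting them.
          return sim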

  20. MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)

    DOT National Transportation Integrated Search

    1997-11-01

    The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...

  1. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1992-01-01

    ALEPS, which is being developed to provide the SSF program with a computer system to automate logistics resupply/return cargo load planning and verification, is presented. ALEPS will make it possible to simultaneously optimize both the resupply flight load plan and the return flight reload plan for any of the logistics carriers. In the verification mode ALEPS will support the carrier's flight readiness reviews and control proper execution of the approved plans. It will also support the SSF inventory management system by providing electronic block updates to the inventory database on the cargo arriving at or departing the station aboard a logistics carrier. A prototype drawer packing algorithm is described which is capable of generating solutions for 3D packing of cargo items into a logistics carrier storage accommodation. It is concluded that ALEPS will provide the capability to generate and modify optimized loading plans for the logistics elements fleet.

  2. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  3. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  4. Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems

    NASA Technical Reports Server (NTRS)

    Vakil, Sanjay S.; Hansman, R. John

    2000-01-01

    Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. Three solution approaches to concerns regarding autoflight systems, and mode transitions in particular, are presented in this thesis. The first is to use training to modify pilot behaviours, or procedures to work around known problems. The second approach is to mitigate problems by enhancing feedback. The third approach is to modify the process by which automation is designed. The Operator Directed Process forces the consideration and creation of an automation model early in the design process for use as the basis of the software specification and training.
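
    Cyclomatic complexity of a mode-transition graph is computed as M = E - N + 2P, where E is the number of transitions (edges), N the number of modes (nodes), and P the number of connected components. A small sketch with a hypothetical altitude-capture fragment, not an actual autoflight system:

      import networkx as nx

      def mode_transition_complexity(modes, transitions):
          """Cyclomatic complexity M = E - N + 2P of a mode-transition graph."""
          g = nx.DiGraph()
          g.add_nodes_from(modes)
          g.add_edges_from(transitions)
          p = nx.number_weakly_connected_components(g)
          return g.number_of_edges() - g.number_of_nodes() + 2 * p

      # Hypothetical altitude-capture fragment:
      modes = ["VS", "ALT_CAP", "ALT_HOLD", "FLCH"]
      transitions = [("VS", "ALT_CAP"), ("FLCH", "ALT_CAP"),
                     ("ALT_CAP", "ALT_HOLD"), ("ALT_HOLD", "VS"), ("ALT_HOLD", "FLCH")]
      print(mode_transition_complexity(modes, transitions))  # -> 3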

  5. Opportunities for Launch Site Integrated System Health Engineering and Management

    NASA Technical Reports Server (NTRS)

    Waterman, Robert D.; Langwost, Patricia E.; Waterman, Susan J.

    2005-01-01

    The launch site processing flow involves operations such as functional verification, preflight servicing and launch. These operations often include hazards that must be controlled to protect human life and critical space hardware assets. Existing command and control capabilities are limited to simple limit checking during automated monitoring. Contingency actions are highly dependent on human recognition, decision making, and execution. Many opportunities for Integrated System Health Engineering and Management (ISHEM) exist throughout the processing flow. This paper will present the current human-centered approach to health management as performed today for the shuttle and space station programs. In addition, it will address some of the more critical ISHEM needs, and provide recommendations for future implementation of ISHEM at the launch site.

  6. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.

  7. Dispensing error rate after implementation of an automated pharmacy carousel system.

    PubMed

    Oswald, Scott; Caldwell, Richard

    2007-07-01

    A study was conducted to determine filling and dispensing error rates before and after the implementation of an automated pharmacy carousel system (APCS). The study was conducted in a 613-bed acute and tertiary care university hospital. Before the implementation of the APCS, filling and dispensing rates were recorded during October through November 2004 and January 2005. Postimplementation data were collected during May through June 2006. Errors were recorded in three areas of pharmacy operations: first-dose or missing medication fill, automated dispensing cabinet fill, and interdepartmental request fill. A filling error was defined as an error caught by a pharmacist during the verification step. A dispensing error was defined as an error caught by a pharmacist observer after verification by the pharmacist. Before implementation of the APCS, 422 first-dose or missing medication orders were observed between October 2004 and January 2005. Independent data collected in December 2005, approximately six weeks after the introduction of the APCS, found that filling and error rates had increased. The filling rate for automated dispensing cabinets was associated with the largest decrease in errors. Filling and dispensing error rates had decreased by December 2005. In terms of interdepartmental request fill, no dispensing errors were noted in 123 clinic orders dispensed before the implementation of the APCS. One dispensing error out of 85 clinic orders was identified after implementation of the APCS. The implementation of an APCS at a university hospital decreased medication filling errors related to automated cabinets only and did not affect other filling and dispensing errors.

  8. Tackling the x-ray cargo inspection challenge using machine learning

    NASA Astrophysics Data System (ADS)

    Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.

    2016-05-01

    The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.

  9. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.

  10. Automated spot defect characterization in a field portable night vision goggle test set

    NASA Astrophysics Data System (ADS)

    Scopatz, Stephen; Ozten, Metehan; Aubry, Gilles; Arquetoux, Guillaume

    2018-05-01

    This paper discusses a new capability developed for, and results from, a field portable test set for Gen 2 and Gen 3 Image Intensifier (I2) tube-based Night Vision Goggles (NVG). A previous paper described the test set and the automated and semi-automated tests supported for NVGs, including a Knife Edge MTF test to replace the operator's interpretation of the USAF 1951 resolution chart. The major improvement and innovation detailed in this paper is the use of image analysis algorithms to automate the characterization of spot defects of I² tubes with the same test set hardware previously presented. The original and still common Spot Defect Test requires the operator to look through the NVGs at a target of concentric rings, compare the size of the defects to a chart, and manually enter the results into a table based on the size and location of each defect; this is tedious and subjective. The prior semi-automated improvement captures and displays an image of the defects and the rings, allowing the operator to determine the defects with less eyestrain while electronically storing the image and the resulting table. The advanced Automated Spot Defect Test utilizes machine vision algorithms to determine the size and location of the defects, generates the result table automatically, and then records the image and the results in a computer-generated report easily usable for verification. This is inherently a more repeatable process that ensures consistent spot detection independent of the operator. Results from across several NVGs are presented.
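
    To illustrate the kind of machine-vision step described above, the sketch below detects bright spot defects in a grayscale image by thresholding and connected-component labeling, then tabulates the size and location of each defect. It is a simplified, hypothetical example using generic image-processing routines, not the test set's actual algorithm.

        # Sketch: automated spot-defect tabulation via thresholding + connected components.
        # Hypothetical parameters; not the algorithm used in the NVG test set.
        import numpy as np
        from scipy import ndimage

        def find_spot_defects(image, threshold=0.6, min_area=4):
            """Return a list of (row, col, area_pixels) for blobs brighter than threshold."""
            mask = image > threshold                      # binarize the defect image
            labels, n = ndimage.label(mask)               # connected-component labeling
            defects = []
            for idx in range(1, n + 1):
                area = int((labels == idx).sum())
                if area >= min_area:                      # ignore single-pixel noise
                    r, c = ndimage.center_of_mass(labels == idx)
                    defects.append((round(r, 1), round(c, 1), area))
            return defects

        # Synthetic example image with one injected "defect"
        img = np.zeros((200, 200))
        img[50:56, 120:126] = 1.0
        for row, col, area in find_spot_defects(img):
            print(f"defect at ({row}, {col}), area = {area} px")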

  11. An Automated Solar Synoptic Analysis Software System

    NASA Astrophysics Data System (ADS)

    Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.

    2012-12-01

    We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, which are three major solar sources of space weather. Space weather forecasters at the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high-speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. In an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygram and magnetogram images as inputs and provides the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying appropriate image-processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it also extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images from the global H-alpha network and SDO AIA 193 are used for morphological identification, and SDO HMI magnetograms are used for quantitative verification. The output results of ASSA are routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available output results. ASSA will be deployed at the Korean Space Weather Center and will serve its customers in an operational status by the end of 2012.
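
    The identification step outlined above relies on standard segmentation operations. The following is a minimal, hypothetical sketch of seeded region growing on a 2D intensity array; the seed point, intensity tolerance, and 4-connectivity are assumptions chosen for illustration, not ASSA's actual parameters.

        # Sketch: seeded region growing on a 2D array, as one building block of
        # active-region segmentation. Hypothetical thresholds; not ASSA's implementation.
        import numpy as np
        from collections import deque

        def region_grow(image, seed, tol=0.2):
            """Grow a 4-connected region from `seed` while pixel intensities stay
            within `tol` of the seed intensity. Returns a boolean mask."""
            h, w = image.shape
            mask = np.zeros((h, w), dtype=bool)
            seed_val = image[seed]
            queue = deque([seed])
            while queue:
                r, c = queue.popleft()
                if mask[r, c] or abs(image[r, c] - seed_val) > tol:
                    continue
                mask[r, c] = True
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc]:
                        queue.append((rr, cc))
            return mask

        # Toy "magnetogram" with a bright patch; grow a region from a seed inside it.
        img = np.zeros((64, 64))
        img[20:30, 20:30] = 1.0
        region = region_grow(img, seed=(25, 25), tol=0.1)
        print("region size:", int(region.sum()), "pixels")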

  12. 46 CFR 61.40-1 - General.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-1 General. (a) All automatically or... tests and inspections to evaluate the operation and reliability of controls, alarms, safety features...

  13. Overview of open resources to support automated structure verification and elucidation

    EPA Science Inventory

    Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...

  14. Application of semi-active RFID power meter in automatic verification pipeline and intelligent storage system

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; Chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-03-01

    In this paper, the semi-active RFID watt-hour meter is applied to automatic verification pipelines and intelligent warehouse management. Through the transmission, test, auxiliary, and monitoring systems, the approach realizes scheduling, binding, control, and data exchange for watt-hour meters, providing more accurate positioning, more efficient management, rapid data updates, and at-a-glance access to all information. It effectively improves the quality, efficiency, and automation of verification and enables more efficient data and warehouse management.

  15. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

    The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale for a simple form of the KBS lifecycle is presented, including accommodations for certain critical ways in which KBS development differs from conventional software development.

  16. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 6, November/December 2013

    DTIC Science & Technology

    2013-12-01

    requirements during sprint planning. Automated scanning, which includes automated code-review tools, allows the expert to monitor the system... sprint. This enables the validator to leverage the test results for formal validation and verification, and perform a shortened "hybrid" style of IV&V...

  17. MRO Sequence Checking Tool

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of the ODY (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on the sequence. The automation varies from summarizing data about the sequence needed for visual verification of the sequence, to performing automated checks on the sequence and providing a report for each step. To allow for the addition of new checks as needed, this tool is built in a modular fashion.
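
    As a rough illustration of the modular check structure described above, the sketch below runs a list of independent check functions over a command sequence and reports a pass/fail result for each. The check names and sequence format are hypothetical, not those used by mro_check.

        # Sketch: a modular sequence checker that runs independent checks and reports
        # per-check results. Hypothetical checks and sequence format; not mro_check itself.
        from typing import Callable, Dict, List

        Sequence = List[Dict]               # each command: {"time": float, "name": str}
        Check = Callable[[Sequence], str]   # returns "" on pass, else an error message

        def check_times_monotonic(seq: Sequence) -> str:
            times = [cmd["time"] for cmd in seq]
            return "" if times == sorted(times) else "command times are not monotonically increasing"

        def check_no_duplicate_commands(seq: Sequence) -> str:
            seen = set()
            for cmd in seq:
                key = (cmd["time"], cmd["name"])
                if key in seen:
                    return f"duplicate command {cmd['name']} at t={cmd['time']}"
                seen.add(key)
            return ""

        def run_checks(seq: Sequence, checks: List[Check]) -> None:
            for check in checks:
                msg = check(seq)
                print(f"{check.__name__:32s} {'PASS' if not msg else 'FAIL (' + msg + ')'}")

        seq = [{"time": 0.0, "name": "PWR_ON"}, {"time": 5.0, "name": "IMG_START"}]
        run_checks(seq, [check_times_monotonic, check_no_duplicate_commands])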

  18. Automated X-ray image analysis for cargo security: Critical review and future promise.

    PubMed

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  19. SEM AutoAnalysis: enhancing photomask and NIL defect disposition and review

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Ehrlich, Christian; Garetto, Anthony

    2017-06-01

    For defect disposition and repair verification regarding printability, AIMS™ is the state-of-the-art measurement tool in industry. With its unique capability of capturing aerial images of photomasks, it is the one method that comes closest to emulating the printing behaviour of a scanner. However, for nanoimprint lithography (NIL) templates, aerial images cannot be applied to evaluate the success of a repair process. Hence, for NIL defect dispositioning, scanning electron microscopy (SEM) imaging is the method of choice. In addition, it has been a standard imaging method for further root cause analysis of defects and defect review on optical photomasks, which enables 2D or even 3D mask profiling at high resolutions. In recent years a trend observed in mask shops has been the automation of processes that traditionally were driven by operators. This of course has brought many advantages, one of which is freeing cost-intensive labour from conducting repetitive and tedious work. Furthermore, it reduces variability in processes due to different operator skill and experience levels, which in the end contributes to eliminating the human factor. Taking these factors into consideration, one of the software-based solutions available under the FAVOR® brand to support customer needs is the aerial image evaluation software, AIMS™ AutoAnalysis (AAA). It provides fully automated analysis of AIMS™ images and runs in parallel to measurements. This is enabled by its direct connection and communication with the AIMS™ tools. As one of many positive outcomes, generating automated result reports is facilitated, standardizing the mask manufacturing workflow. Today, AAA has been successfully introduced into production at multiple customers and is supporting the workflow as described above. These trends indeed have triggered the demand for similar automation with respect to SEM measurements, leading to the development of SEM AutoAnalysis (SAA). It aims towards a fully automated SEM image evaluation process utilizing a completely different algorithm due to the different nature of SEM images and aerial images. Both AAA and SAA are building blocks towards an image evaluation suite in the mask shop industry.

  20. Lightning Jump Algorithm Development for the GOES-R Geostationary Lightning Mapper

    NASA Technical Reports Server (NTRS)

    Schultz, E.; Schultz, C.; Chronis, T.; Stough, S.; Carey, L.; Calhoun, K.; Ortega, K.; Stano, G.; Cecil, D.; Bateman, M.

    2014-01-01

    Current work on the lightning jump algorithm to be used in GOES-R Geostationary Lightning Mapper (GLM)'s data stream is multifaceted due to the intricate interplay between the storm tracking, GLM proxy data, and the performance of the lightning jump itself. This work outlines the progress of the last year, where analysis and performance of the lightning jump algorithm with automated storm tracking and GLM proxy data were assessed using over 700 storms from North Alabama. The cases analyzed coincide with previous semi-objective work performed using total lightning mapping array (LMA) measurements in Schultz et al. (2011). Analysis shows that key components of the algorithm (flash rate and sigma thresholds) have the greatest influence on the performance of the algorithm when validating using severe storm reports. Automated objective analysis using the GLM proxy data has shown probability of detection (POD) values around 60% with false alarm rates (FAR) around 73% using similar methodology to Schultz et al. (2011). However, when applying verification methods similar to those employed by the National Weather Service, POD values increase slightly (69%) and FAR values decrease (63%). The relationship between storm tracking and lightning jump has also been tested in a real-time framework at NSSL. This system includes fully automated tracking by radar alone, real-time LMA and radar observations and the lightning jump. Results indicate that the POD is strong at 65%. However, the FAR is significantly higher than in Schultz et al. (2011) (50-80% depending on various tracking/lightning jump parameters) when using storm reports for verification. Given known issues with Storm Data, the performance of the real-time jump algorithm is also being tested with high density radar and surface observations from the NSSL Severe Hazards Analysis & Verification Experiment (SHAVE).
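
    The POD and FAR figures quoted above come from standard contingency-table verification against severe storm reports. The sketch below computes both scores from hit, miss, and false-alarm counts; the example numbers are placeholders, not values from the study.

        # Sketch: probability of detection (POD) and false alarm ratio (FAR) from a
        # 2x2 contingency table, as used in forecast verification. Counts are placeholders.
        def pod(hits: int, misses: int) -> float:
            """POD = hits / (hits + misses)."""
            return hits / (hits + misses)

        def far(hits: int, false_alarms: int) -> float:
            """FAR = false alarms / (hits + false alarms)."""
            return false_alarms / (hits + false_alarms)

        hits, misses, false_alarms = 65, 35, 110   # placeholder counts
        print(f"POD = {pod(hits, misses):.2f}")
        print(f"FAR = {far(hits, false_alarms):.2f}")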

  1. 78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...

  2. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
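
    As a toy illustration of the runtime-monitoring idea, the sketch below hand-codes a monitor for the property G(request -> F acknowledge) as a small state machine that processes an event trace and reports a verdict at the end of the trace. It is an example of the concept only, not a monitor produced by the generator described in the paper.

        # Sketch: a hand-coded runtime monitor for G(req -> F ack) over a finite trace.
        # Illustrative only; not output of the monitor generator described in the paper.
        class ReqAckMonitor:
            def __init__(self):
                self.pending = 0       # requests not yet acknowledged

            def step(self, event: str) -> None:
                if event == "req":
                    self.pending += 1
                elif event == "ack" and self.pending > 0:
                    self.pending -= 1

            def verdict_at_end(self) -> str:
                # On a finite trace, an outstanding request means the eventuality failed.
                return "PASS" if self.pending == 0 else "FAIL: unacknowledged request"

        monitor = ReqAckMonitor()
        for ev in ["req", "idle", "ack", "req", "ack"]:
            monitor.step(ev)
        print(monitor.verdict_at_end())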

  3. Software design as a problem in learning theory (a research overview)

    NASA Technical Reports Server (NTRS)

    Fass, Leona F.

    1992-01-01

    Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge, and constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, as they are related to inference, and due to their considerable importance in correct software design. Motivating papers by other researchers led us to examine these processes in some depth. Here we present our approach to those software design issues raised by other researchers, within our own theoretical context. We describe our results, relative to those of the other researchers, and conclude that they do not compare unfavorably.

  4. 75 FR 8294 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... of the Information: The USDA eAuthentication Service provides public and government businesses single sign-on capability for USDA applications, management of user credentials, and verification of identity... collection of information on those who are to respond, including through the use of appropriate automated...

  5. Cross-checking of Large Evaluated and Experimental Nuclear Reaction Databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeydina, O.; Koning, A.J.; Soppera, N.

    2014-06-15

    Automated methods are presented for the verification of large experimental and evaluated nuclear reaction databases (e.g. EXFOR, JEFF, TENDL). These methods allow an assessment of the overall consistency of the data and detect aberrant values in both evaluated and experimental databases.

  6. Sci—Thur PM: Planning and Delivery — 03: Automated delivery and quality assurance of a modulated electron radiation therapy plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, T; Papaconstadopoulos, P; Alexander, A

    2014-08-15

    Modulated electron radiation therapy (MERT) offers the potential to improve healthy tissue sparing through increased dose conformity. Challenges remain, however, in accurate beamlet dose calculation, plan optimization, collimation method and delivery accuracy. In this work, we investigate the accuracy and efficiency of an end-to-end MERT plan and automated-delivery workflow for the electron boost portion of a previously treated whole breast irradiation case. Dose calculations were performed using Monte Carlo methods and beam weights were determined using a research-based treatment planning system capable of inverse optimization. The plan was delivered to radiochromic film placed in a water equivalent phantom for verification, using an automated motorized tertiary collimator. The automated delivery, which covered 4 electron energies, 196 subfields and 6183 total MU, was completed in 25.8 minutes, including 6.2 minutes of beam-on time, with the remainder of the delivery time spent on collimator leaf motion and the automated interfacing with the accelerator in service mode. The delivery time could be reduced by 5.3 minutes with minor electron collimator modifications, and the beam-on time could be reduced by an estimated factor of 2–3 through redesign of the scattering foils. Comparison of the planned and delivered film dose gave 3%/3 mm gamma pass rates of 62.1, 99.8, 97.8, 98.3, and 98.7 percent for the 9, 12, 16, 20 MeV, and combined energy deliveries, respectively. Good results were also seen in the delivery verification performed with a MapCHECK 2 device. The results showed that accurate and efficient MERT delivery is possible with current technologies.

  7. Towards automatic Markov reliability modeling of computer architectures

    NASA Technical Reports Server (NTRS)

    Liceaga, C. A.; Siewiorek, D. P.

    1986-01-01

    The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
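
    To make the modeling target concrete, here is a minimal, hypothetical example of the kind of Markov reliability model such a tool would formulate: a two-component standby-redundant system with repair, whose state probabilities are propagated through the transition-rate matrix. The failure and repair rates and the three-state structure are invented for illustration, not taken from ARM.

        # Sketch: a tiny continuous-time Markov reliability model (active unit plus
        # standby spare, with repair), solved by matrix exponentiation. Rates are invented.
        import numpy as np
        from scipy.linalg import expm

        lam, mu = 1e-3, 1e-2     # failure rate, repair rate (per hour), hypothetical
        # States: 0 = both good, 1 = one failed (spare active), 2 = system failed.
        Q = np.array([
            [-lam,         lam,  0.0],
            [  mu, -(mu + lam),  lam],
            [ 0.0,         0.0,  0.0],   # state 2 absorbing (no repair from system failure)
        ])

        p0 = np.array([1.0, 0.0, 0.0])   # start with both components good
        t = 1000.0                        # mission time in hours
        p_t = p0 @ expm(Q * t)            # state probabilities at time t
        print(f"P(system operational at t={t:.0f} h) = {1.0 - p_t[2]:.4f}")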

  8. Using AUTORAD for Cassini File Uplinks: Incorporating Automated Commanding into Mission Operations

    NASA Technical Reports Server (NTRS)

    Goo, Sherwin

    2014-01-01

    As the Cassini spacecraft embarked on the Solstice Mission in October 2010, the flight operations team faced a significant challenge in planning and executing the continuing tour of the Saturnian system. Faced with budget cuts that reduced the science and engineering staff by over a third in size, new and streamlined processes had to be developed to allow the Cassini mission to maintain a high level of science data return with a lower amount of available resources while still minimizing the risk. Automation was deemed an important key to enabling mission operations with a reduced workforce, and the Cassini flight team has made this goal a priority for the Solstice Mission. The operations team learned about a utility called AUTORAD which would give the flight operations team the ability to program selected command files for radiation up to seven days in advance and help minimize the need for off-shift support that could deplete available staffing during the prime shift hours. This paper will describe how AUTORAD is being utilized by the Cassini flight operations team and the processes that were developed or modified to ensure that proper oversight and verification is maintained in the generation and execution of radiated command files.

  9. Increasing the automation of a 2D-3D registration system.

    PubMed

    Varnavas, Andreas; Carrell, Tom; Penney, Graeme

    2013-02-01

    Routine clinical use of 2D-3D registration algorithms for Image Guided Surgery remains limited. A key aspect for routine clinical use of this technology is its degree of automation, i.e., the amount of necessary knowledgeable interaction between the clinicians and the registration system. Current image-based registration approaches usually require knowledgeable manual interaction during two stages: for initial pose estimation and for verification of produced results. We propose four novel techniques, particularly suited to vertebra-based registration systems, which can significantly automate both of the above stages. Two of these techniques are based upon the intraoperative "insertion" of a virtual fiducial marker into the preoperative data. The remaining two techniques use the final registration similarity value between multiple CT vertebrae and a single fluoroscopy vertebra. The proposed methods were evaluated with data from 31 operations (31 CT scans, 419 fluoroscopy images). Results show these methods can remove the need for manual vertebra identification during initial pose estimation, and were also very effective for result verification, producing a combined true positive rate of 100% and false positive rate equal to zero. This large decrease in required knowledgeable interaction is an important contribution aiming to enable more widespread use of 2D-3D registration technology.

  10. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.
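
    A rough sketch of the biphasic idea, stated abstractly: one classifier gates early versus late fibrosis, and a linear model then scores severity within the late group. The features, labels, and model settings below are synthetic placeholders, not the models or data derived from the SHG/TPEF cohorts.

        # Sketch: a two-stage ("biphasic") scoring scheme -- an SVM gate for early vs.
        # late fibrosis, then a linear scoring model within the late group.
        # Synthetic data and settings; not the published model.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 11))                 # 11 morphological features per biopsy
        stage = rng.integers(0, 7, size=200)           # ISHAK-like stage 0..6 (synthetic)
        is_late = (stage >= 3).astype(int)

        gate = SVC(kernel="rbf").fit(X, is_late)       # phase 1: early vs. late
        late_idx = stage >= 3
        scorer = LinearRegression().fit(X[late_idx], stage[late_idx])  # phase 2: late-stage score

        def biphasic_score(x):
            x = x.reshape(1, -1)
            if gate.predict(x)[0] == 0:
                return "early-stage pathway"
            return f"late-stage, predicted stage ~ {scorer.predict(x)[0]:.1f}"

        print(biphasic_score(X[0]))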

  11. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy.

    PubMed

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-08-11

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method.

  12. To repair or not to repair: with FAVOR there is no question

    NASA Astrophysics Data System (ADS)

    Garetto, Anthony; Schulz, Kristian; Tabbone, Gilles; Himmelhaus, Michael; Scheruebl, Thomas

    2016-10-01

    In the mask shop the challenges associated with today's advanced technology nodes, both technical and economic, are becoming increasingly difficult. The constant drive to continue shrinking features means more masks per device, smaller manufacturing tolerances and more complexity along the manufacturing line with respect to the number of manufacturing steps required. Furthermore, the extremely competitive nature of the industry makes it critical for mask shops to optimize asset utilization and processes in order to maximize their competitive advantage and, in the end, profitability. Full maximization of profitability in such a complex and technologically sophisticated environment simply cannot be achieved without the use of smart automation. Smart automation allows productivity to be maximized through better asset utilization and process optimization. Reliability is improved through the minimization of manual interactions, leading to fewer human error contributions and a more efficient manufacturing line. In addition to these improvements in productivity and reliability, extra value can be added through the collection and cross-verification of data from multiple sources, which provides more information about our products and processes. When it comes to handling mask defects, for instance, the process consists largely of time-consuming manual interactions that are error-prone and often require quick decisions from operators and engineers who are under pressure. The handling of defects itself is a multiple-step process consisting of several iterations of inspection, disposition, repair, review and cleaning steps. Smaller manufacturing tolerances and features with higher complexity contribute to a higher number of defects which must be handled, as well as a higher level of complexity. In this paper the recent efforts undertaken by ZEISS to provide solutions which address these challenges, particularly those associated with defectivity, will be presented. From automation of aerial image analysis to the use of data-driven decision making to predict and propose the optimized back end of line process flow, productivity and reliability improvements are targeted by smart automation. Additionally, the generation of the ideal aerial image from the design and several repair enhancement features offer additional capabilities to improve the efficiency and yield associated with defect handling.

  13. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  14. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating statechart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.

  15. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    .... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...

  16. Automated solar panel assembly line

    NASA Technical Reports Server (NTRS)

    Somberg, H.

    1981-01-01

    The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.

  17. Systems integration for the Kennedy Space Center (KSC) Robotics Applications Development Laboratory (RADL)

    NASA Technical Reports Server (NTRS)

    Davis, V. Leon; Nordeen, Ross

    1988-01-01

    A laboratory for developing robotics technology for hazardous and repetitive Shuttle and payload processing activities is discussed. An overview of the computer hardware and software responsible for integrating the laboratory systems is given. The center's anthropomorphic robot is placed on a track allowing it to be moved to different stations. Various aspects of the laboratory equipment are described, including industrial robot arm control, smart systems integration, the supervisory computer, programmable process controller, real-time tracking controller, image processing hardware, and control display graphics. Topics of research include: automated loading and unloading of hypergolics for space vehicles and payloads; the use of mobile robotics for security, fire fighting, and hazardous spill operations; nondestructive testing for SRB joint and seal verification; Shuttle Orbiter radiator damage inspection; and Orbiter contour measurements. The possibility of expanding the laboratory in the future is examined.

  18. A deep convolutional neural network approach to single-particle recognition in cryo-electron microscopy.

    PubMed

    Zhu, Yanan; Ouyang, Qi; Mao, Youdong

    2017-07-21

    Single-particle cryo-electron microscopy (cryo-EM) has become a mainstream tool for the structural determination of biological macromolecular complexes. However, high-resolution cryo-EM reconstruction often requires hundreds of thousands of single-particle images. Particle extraction from experimental micrographs thus can be laborious and presents a major practical bottleneck in cryo-EM structural determination. Existing computational methods for particle picking often use low-resolution templates for particle matching, making them susceptible to reference-dependent bias. It is critical to develop a highly efficient template-free method for the automatic recognition of particle images from cryo-EM micrographs. We developed a deep learning-based algorithmic framework, DeepEM, for single-particle recognition from noisy cryo-EM micrographs, enabling automated particle picking, selection and verification in an integrated fashion. The kernel of DeepEM is built upon a convolutional neural network (CNN) composed of eight layers, which can be recursively trained to be highly "knowledgeable". Our approach exhibits an improved performance and accuracy when tested on the standard KLH dataset. Application of DeepEM to several challenging experimental cryo-EM datasets demonstrated its ability to avoid the selection of unwanted particles and non-particles even when true particles contain fewer features. The DeepEM methodology, derived from a deep CNN, allows automated particle extraction from raw cryo-EM micrographs in the absence of a template. It demonstrates an improved performance, objectivity and accuracy. Application of this novel method is expected to free the labor involved in single-particle verification, significantly improving the efficiency of cryo-EM data processing.
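
    As a schematic of template-free picking, the sketch below slides a window over a micrograph, scores each window with a pluggable classifier (any trained model could be substituted), and keeps positions above a score threshold. The window size, stride, threshold, and stand-in scoring function are illustrative assumptions, not DeepEM's settings.

        # Sketch: sliding-window particle picking driven by a pluggable classifier.
        # Window size, stride, and threshold are illustrative; not DeepEM's settings.
        import numpy as np

        def pick_particles(micrograph, score_fn, box=64, stride=32, threshold=0.9):
            """Scan the micrograph and return (row, col, score) for windows whose
            classifier score exceeds the threshold."""
            picks = []
            h, w = micrograph.shape
            for r in range(0, h - box + 1, stride):
                for c in range(0, w - box + 1, stride):
                    score = score_fn(micrograph[r:r + box, c:c + box])
                    if score >= threshold:
                        picks.append((r, c, score))
            return picks

        # Stand-in "classifier": mean window intensity as a fake particle score.
        fake_score = lambda win: float(win.mean())
        mg = np.random.rand(512, 512)
        mg[100:164, 200:264] += 2.0      # synthetic bright "particle"
        for r, c, s in pick_particles(mg, fake_score, threshold=1.5):
            print(f"particle candidate at ({r}, {c}), score={s:.2f}")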

  19. Visualization design and verification of Ada tasking using timing diagrams

    NASA Technical Reports Server (NTRS)

    Vidale, R. F.; Szulewski, P. A.; Weiss, J. B.

    1986-01-01

    The use of timing diagrams is recommended in the design and testing of multi-task Ada programs. By displaying the task states vs. time, timing diagrams can portray the simultaneous threads of data flow and control which characterize tasking programs. This description of the system's dynamic behavior from conception to testing is a necessary adjunct to other graphical techniques, such as structure charts, which essentially give a static view of the system. A series of steps is recommended which incorporates timing diagrams into the design process. Finally, a description is provided of a prototype Ada Execution Analyzer (AEA) which automates the production of timing diagrams from VAX/Ada debugger output.

  20. NASA/ASEE Summer Faculty Fellowship Program. 1991 Research Reports

    NASA Technical Reports Server (NTRS)

    Hosler, E. Ramon (Editor); Beymer, Mark A. (Editor); Armstrong, Dennis W. (Editor)

    1991-01-01

    Reports from the NASA/ASEE Summer Faculty Fellowship Program are presented. The editors are responsible for selecting appropriately qualified faculty to address some of the many problems of current interest to NASA Kennedy. Some representative titles are as follows: Development of an Accelerated Test Method for the Determination of Susceptibility to Atmospheric Corrosion; Hazardous Gas Leak Analysis in the Space Shuttle; Modeling and Control of the Automated Radiator Inspection Device; Study of the Finite Element Software Packages at KSC; Multispectral Image Processing for Plants; Algorithms for Contours Depicting Static Electric Fields during Adverse Weather Conditions; Transient Study of a Cryogenic Hydrogen Filling System; and Precision Cleaning Verification of Nonvolatile Residues by using Water, Ultrasonics, and Turbidity Analyses.

  1. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  2. 46 CFR 61.40-6 - Periodic safety tests.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Periodic safety tests. 61.40-6 Section 61.40-6 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-6 Periodic safety...

  3. 46 CFR 61.40-6 - Periodic safety tests.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Periodic safety tests. 61.40-6 Section 61.40-6 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-6 Periodic safety...

  4. 46 CFR 61.40-6 - Periodic safety tests.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Periodic safety tests. 61.40-6 Section 61.40-6 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-6 Periodic safety...

  5. 46 CFR 61.40-6 - Periodic safety tests.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Periodic safety tests. 61.40-6 Section 61.40-6 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-6 Periodic safety...

  6. 46 CFR 61.40-6 - Periodic safety tests.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Periodic safety tests. 61.40-6 Section 61.40-6 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-6 Periodic safety...

  7. 46 CFR 61.40-10 - Test procedure details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Test procedure details. 61.40-10 Section 61.40-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-10 Test procedure...

  8. 46 CFR 61.40-10 - Test procedure details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Test procedure details. 61.40-10 Section 61.40-10 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-10 Test procedure...

  9. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  10. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements through a jackknifing process that isolates the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
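
    A minimal sketch of the underlying idea: estimate how many principal components are needed to explain most of the variance of the multichannel time series, then jackknife channels one at a time and flag any channel whose removal changes that count. The variance threshold and flagging rule are assumptions for illustration, not the authors' settings.

        # Sketch: subspace-dimensionality QC for a multichannel array.
        # Variance threshold and flagging rule are illustrative assumptions.
        import numpy as np

        def effective_dim(data, var_frac=0.95):
            """Number of principal components needed to explain var_frac of the
            variance. `data` is (n_channels, n_samples)."""
            x = data - data.mean(axis=1, keepdims=True)
            sv = np.linalg.svd(x, compute_uv=False)
            var = sv**2 / np.sum(sv**2)
            return int(np.searchsorted(np.cumsum(var), var_frac) + 1)

        def flag_bad_channels(data, var_frac=0.95):
            """Jackknife each channel; flag it if removing it lowers the effective dimension."""
            base = effective_dim(data, var_frac)
            flagged = []
            for ch in range(data.shape[0]):
                if effective_dim(np.delete(data, ch, axis=0), var_frac) < base:
                    flagged.append(ch)
            return base, flagged

        rng = np.random.default_rng(1)
        common = rng.normal(size=1000)                    # coherent signal across the array
        data = np.tile(common, (8, 1)) + 0.05 * rng.normal(size=(8, 1000))
        data[3] = rng.normal(size=1000)                   # channel 3 is malfunctioning
        dim, bad = flag_bad_channels(data)
        print("effective dimension:", dim, "| flagged channels:", bad)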

  11. Otitis Media Diagnosis for Developing Countries Using Tympanic Membrane Image-Analysis.

    PubMed

    Myburgh, Hermanus C; van Zijl, Willemien H; Swanepoel, DeWet; Hellström, Sten; Laurent, Claude

    2016-03-01

    Otitis media is one of the most common childhood diseases worldwide, but because of a lack of doctors and health personnel in developing countries it is often misdiagnosed or not diagnosed at all. This may lead to serious and life-threatening complications. There is thus a need for an automated, computer-based image-analyzing system that could assist in making accurate otitis media diagnoses anywhere. A method for automated diagnosis of otitis media is proposed. The method uses image-processing techniques to classify otitis media. The system is trained using high-quality pre-assessed images of tympanic membranes, captured by digital video-otoscopes, and classifies undiagnosed images into five otitis media categories based on predefined signs. Several verification tests analyzed the classification capability of the method. An accuracy of 80.6% was achieved for images taken with commercial video-otoscopes, while an accuracy of 78.7% was achieved for images captured on-site with a low-cost custom-made video-otoscope. The high accuracy of the proposed otitis media classification system compares well with the classification accuracy of general practitioners and pediatricians (~64% to 80%) using traditional otoscopes, and therefore holds promise for the future in making automated diagnosis of otitis media in medically underserved populations.

  12. Otitis Media Diagnosis for Developing Countries Using Tympanic Membrane Image-Analysis

    PubMed Central

    Myburgh, Hermanus C.; van Zijl, Willemien H.; Swanepoel, DeWet; Hellström, Sten; Laurent, Claude

    2016-01-01

    Background Otitis media is one of the most common childhood diseases worldwide, but because of a lack of doctors and health personnel in developing countries it is often misdiagnosed or not diagnosed at all. This may lead to serious and life-threatening complications. There is thus a need for an automated, computer-based image-analyzing system that could assist in making accurate otitis media diagnoses anywhere. Methods A method for automated diagnosis of otitis media is proposed. The method uses image-processing techniques to classify otitis media. The system is trained using high-quality pre-assessed images of tympanic membranes, captured by digital video-otoscopes, and classifies undiagnosed images into five otitis media categories based on predefined signs. Several verification tests analyzed the classification capability of the method. Findings An accuracy of 80.6% was achieved for images taken with commercial video-otoscopes, while an accuracy of 78.7% was achieved for images captured on-site with a low-cost custom-made video-otoscope. Interpretation The high accuracy of the proposed otitis media classification system compares well with the classification accuracy of general practitioners and pediatricians (~64% to 80%) using traditional otoscopes, and therefore holds promise for the future in making automated diagnosis of otitis media in medically underserved populations. PMID:27077122

  13. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the existence of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in the so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  14. Wetland delineation with IKONOS high-resolution satellite imagery, Fort Custer Training Center, Battle Creek, Michigan, 2005

    USGS Publications Warehouse

    Fuller, L.M.; Morgan, T.R.; Aichele, Stephen S.

    2006-01-01

    The Michigan Army National Guard’s Fort Custer Training Center (FCTC) in Battle Creek, Mich., has the responsibility to protect wetland resources on the training grounds while providing training opportunities, and for future development planning at the facility. The National Wetlands Inventory (NWI) data have been the primary wetland-boundary resource, but a check on scale and accuracy of the wetland boundary information for the Fort Custer Training Center was needed. In cooperation with the FCTC, the U.S. Geological Survey (USGS) used an early spring IKONOS pan-sharpened satellite image to delineate the wetlands and create a more accurate wetland map for the FCTC. The USGS tested automated approaches (supervised and unsupervised classifications) to identify the wetland areas from the IKONOS satellite image, but the automated approaches alone did not yield accurate results. To ensure accurate wetland boundaries, the final wetland map was manually digitized on the basis of the automated supervised and unsupervised classifications, in combination with NWI data, field verifications, and visual interpretation of the IKONOS satellite image. The final wetland areas digitized from the IKONOS satellite imagery were similar to those in NWI; however, the wetland boundaries differed in some areas, a few wetlands mapped on the NWI were determined not to be wetlands from the IKONOS image and field verification, and additional previously unmapped wetlands not recognized by the NWI were identified from the IKONOS image.

  15. Automated biphasic morphological assessment of hepatitis B-related liver fibrosis using second harmonic generation microscopy

    PubMed Central

    Wang, Tong-Hong; Chen, Tse-Ching; Teng, Xiao; Liang, Kung-Hao; Yeh, Chau-Ting

    2015-01-01

    Liver fibrosis assessment by biopsy and conventional staining scores is based on histopathological criteria. Variations in sample preparation and the use of semi-quantitative histopathological methods commonly result in discrepancies between medical centers. Thus, minor changes in liver fibrosis might be overlooked in multi-center clinical trials, leading to statistically non-significant data. Here, we developed a computer-assisted, fully automated, staining-free method for hepatitis B-related liver fibrosis assessment. In total, 175 liver biopsies were divided into training (n = 105) and verification (n = 70) cohorts. Collagen was observed using second harmonic generation (SHG) microscopy without prior staining, and hepatocyte morphology was recorded using two-photon excitation fluorescence (TPEF) microscopy. The training cohort was utilized to establish a quantification algorithm. Eleven of 19 computer-recognizable SHG/TPEF microscopic morphological features were significantly correlated with the ISHAK fibrosis stages (P < 0.001). A biphasic scoring method was applied, combining support vector machine and multivariate generalized linear models to assess the early and late stages of fibrosis, respectively, based on these parameters. The verification cohort was used to verify the scoring method, and the area under the receiver operating characteristic curve was >0.82 for liver cirrhosis detection. Since no subjective gradings are needed, interobserver discrepancies could be avoided using this fully automated method. PMID:26260921

  16. Intelligent Tools for Planning Knowledge base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  17. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  18. Automated Network Mapping and Topology Verification

    DTIC Science & Technology

    2016-06-01

    collection of information includes amplifying data about the networked devices such as hardware details, logical addressing schemes, operating ... The current military reliance on computer networks for operational missions and administrative duties makes network

  19. Automated Driving System Architecture to Ensure Safe Delegation of Driving Authority

    NASA Astrophysics Data System (ADS)

    YUN, Sunkil; NISHIMURA, Hidekazu

    2016-09-01

    In this paper, the architecture of an automated driving system (ADS) is proposed to ensure safe delegation of driving authority between the ADS and a driver. Limitations of the ADS functions may trigger delegation of driving authority to a driver. However, this can lead to severe consequences in emergency situations where a driver may be drowsy or distracted. To address these issues, first, the concept model for the ADS in situations involving delegation of driving authority is described, taking the driver's behaviour and state into account. Second, the behaviour and state of the driver, the functional flow and state of the ADS, and the interactions between them are modelled to understand the context in which the ADS requests to delegate driving authority to a driver. Finally, the proposed architecture of the ADS is verified in simulations based on emergency braking scenarios. In the simulation-based verification, we derived the necessary condition for safe delegation of driving authority: the ADS should continue to assist the driver even after delegating driving authority to a driver who does not have sufficient capability to regain control of the driving task.

  20. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    NASA Astrophysics Data System (ADS)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDR's) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDR's) and generates Mission Unique Products (MUPS). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCT's) and Look-Up-Tables (LUT's) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGE's) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and science and sensor quality analysis tools. In this presentation we will describe the GRAVITE systems and subsystems, architecture, technical specifications, capabilities and resources, distributed data and products, and the latest advances to support the JPSS science algorithm implementation, validation and testing.

  1. Annual Forest Monitoring as part of Indonesia's National Carbon Accounting System

    NASA Astrophysics Data System (ADS)

    Kustiyo, K.; Roswintiarti, O.; Tjahjaningsih, A.; Dewanti, R.; Furby, S.; Wallace, J.

    2015-04-01

    Land use and forest change, in particular deforestation, have contributed the largest proportion of Indonesia's estimated greenhouse gas emissions. Indonesia's remaining forests store globally significant carbon stocks, as well as biodiversity values. In 2010, the Government of Indonesia entered into a REDD+ partnership. A spatially detailed, national monitoring and reporting system for forest change operating in Indonesia is required for participation in such programs, as well as for national policy reasons including Monitoring, Reporting, and Verification (MRV), carbon accounting, and land-use and policy information. Indonesia's National Carbon Accounting System (INCAS) has been designed to meet national and international policy requirements. The INCAS remote sensing program is producing spatially-detailed annual wall-to-wall monitoring of forest cover changes from time-series Landsat imagery for the whole of Indonesia from 2000 to the present day. Work on the program commenced in 2009, under the Indonesia-Australia Forest Carbon Partnership. A principal objective was to build an operational system in Indonesia through transfer of knowledge and experience from Australia's National Carbon Accounting System, and adaptation of this experience to Indonesia's requirements and conditions. A semi-automated system of image pre-processing (ortho-rectification, calibration, cloud masking and mosaicing) and forest extent and change mapping (supervised classification of a 'base' year, semi-automated single-year classifications and classification within a multi-temporal probabilistic framework) was developed for Landsat 5 TM and Landsat 7 ETM+. Particular attention is paid to the accuracy of each step in the processing. With the advent of Landsat 8 data and the parallel development of processing capability, capacity and international collaborations within the LAPAN Data Centre, this processing is being increasingly automated. Research is continuing into improved processing methodology and integration of information from other data sources. This paper presents technical elements of the INCAS remote sensing program and some results of the 2000 - 2012 mapping.

  2. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  3. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  4. Implementation of a robotic flexible assembly system

    NASA Technical Reports Server (NTRS)

    Benton, Ronald C.

    1987-01-01

    As part of the Intelligent Task Automation program, a team developed enabling technologies for programmable, sensory controlled manipulation in unstructured environments. These technologies include 2-D/3-D vision sensing and understanding, force sensing and high speed force control, 2.5-D vision alignment and control, and multiple processor architectures. The subsequent design of a flexible, programmable, sensor controlled robotic assembly system for small electromechanical devices is described using these technologies and ongoing implementation and integration efforts. Using vision, the system picks parts dumped randomly in a tray. Using vision and force control, it performs high speed part mating, in-process monitoring/verification of expected results and autonomous recovery from some errors. It is programmed off line with semiautomatic action planning.

  5. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy.

    PubMed

    Frey, K; Unholtz, D; Bauer, J; Debus, J; Min, C H; Bortfeld, T; Paganetti, H; Parodi, K

    2014-10-07

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β(+)-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.
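
    A minimal sketch, under simplifying assumptions, of the shift-minimization step described above: one activity depth profile is slid against the other and the shift that minimizes the mean absolute difference over a fixed distal window is reported. The published method additionally selects the distal region automatically from the planned dose and attaches an uncertainty weight to the shift; the profile shapes, bin spacing, and window indices here are synthetic.

      # Most-likely-shift sketch: slide the predicted profile against the measured one and
      # pick the shift that minimizes the mean absolute difference over a distal window.
      import numpy as np

      def most_likely_shift(measured, predicted, distal_slice, max_shift_bins=20):
          shifts = np.arange(-max_shift_bins, max_shift_bins + 1)
          costs = []
          for s in shifts:
              shifted = np.roll(predicted, s)
              costs.append(np.mean(np.abs(measured[distal_slice] - shifted[distal_slice])))
          best = int(shifts[np.argmin(costs)])
          return best, costs

      depth = np.linspace(0, 200, 401)                       # depth bins in mm (0.5 mm spacing)
      predicted = 1.0 / (1.0 + np.exp((depth - 120) / 3.0))  # synthetic distal fall-off at 120 mm
      measured = 1.0 / (1.0 + np.exp((depth - 123) / 3.0))   # same shape, 3 mm deeper

      shift_bins, _ = most_likely_shift(measured, predicted, distal_slice=slice(200, 320))
      print(f"range difference = {shift_bins * 0.5:.1f} mm")  # expect about +3 mm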

  6. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    NASA Astrophysics Data System (ADS)

    Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.

    2014-10-01

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles and under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of uncertainty associated to the identified shift, which is then used as weighting factor to ‘red flag’ problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in a 2D map in beam-eye-view. In comparison to previously proposed approaches, the new most-likely-shift method shows more robust results for assessing in-vivo the range from strongly varying PET distributions caused by differing patient geometry, ion beam species, beam delivery techniques, PET imaging concepts and counting statistics. The additional visualization of the uncertainties and the dedicated weighting strategy contribute to the understanding of the reliability of observed range differences and the complexity in the prediction of activity distributions. The proposed method promises to offer a feasible technique for clinical routine of PET-based range verification.

  7. A Tool for Requirements-Based Programming

    NASA Technical Reports Server (NTRS)

    Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis; Erickson, John

    2005-01-01

    Absent a general method for mathematically sound, automated transformation of customer requirements into a formal model of the desired system, developers must resort to either manual application of formal methods or to system testing (either manual or automated). While formal methods have afforded numerous successes, they present serious issues, e.g., costs to gear up to apply them (time, expensive staff), and scalability and reproducibility when standards in the field are not settled. The testing path cannot be walked to the ultimate goal, because exhaustive testing is infeasible for all but trivial systems. So system verification remains problematic. System or requirements validation is similarly problematic. The alternatives available today depend on either having a formal model or pursuing enough testing to enable the customer to be certain that system behavior meets requirements. The testing alternative for non-trivial systems always leaves some system behaviors unconfirmed and therefore is not the answer. To ensure that a formal model is equivalent to the customer's requirements necessitates that the customer somehow fully understands the formal model, which is not realistic. The predominant view that provably correct system development depends on having a formal model of the system leads to a desire for a mathematically sound method to automate the transformation of customer requirements into a formal model. Such a method, an augmentation of requirements-based programming, is briefly described in this paper, along with a prototype tool to support it. The method and tool enable both requirements validation and system verification for the class of systems whose behavior can be described as scenarios. An application of the tool to a prototype automated ground control system for a NASA mission is presented.

  8. Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm

    PubMed Central

    Hashimoto, Koichi

    2017-01-01

    Bin picking refers to picking the randomly-piled objects from a bin for industrial production purposes, and robotic bin picking is always used in automated assembly lines. In order to achieve a higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly-piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor Curve Set Feature (CSF) is proposed to describe a point by the surface fluctuation around this point and is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSF efficiently. The matching process combines the idea of the matching in 2D space of the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and proved to be more than 30-times faster than the kd-tree-based verification method. Our algorithm is evaluated against a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and more accurate and more than 10-times faster than PPF and Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
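
    A hedged sketch of a voxel-based pose verification step of the kind described in this record (not the authors' implementation): the scene cloud is voxelized once, and each candidate pose is scored by the fraction of transformed model points that fall into occupied voxels, which avoids a per-pose kd-tree query. Point counts, voxel size, and poses are assumed values.

      # Voxel-based pose verification sketch: hash the scene into occupied voxels once,
      # then score each candidate pose by how many transformed model points land in them.
      import numpy as np

      def occupied_voxels(scene_points, voxel_size):
          keys = np.floor(scene_points / voxel_size).astype(np.int64)
          return {tuple(k) for k in keys}

      def pose_score(model_points, rotation, translation, voxel_set, voxel_size):
          transformed = model_points @ rotation.T + translation
          keys = np.floor(transformed / voxel_size).astype(np.int64)
          hits = sum(tuple(k) in voxel_set for k in keys)
          return hits / len(model_points)

      rng = np.random.default_rng(1)
      model = rng.uniform(-0.05, 0.05, size=(500, 3))       # synthetic part, 10 cm cube
      true_t = np.array([0.3, 0.1, 0.0])
      scene = model + true_t                                 # scene contains the part at true_t

      voxels = occupied_voxels(scene, voxel_size=0.005)
      good = pose_score(model, np.eye(3), true_t, voxels, 0.005)
      bad = pose_score(model, np.eye(3), true_t + 0.02, voxels, 0.005)
      print(good, bad)  # the correct pose scores far higher than a 2 cm-off pose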

  9. Upgrades at the NASA Langley Research Center National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Paryz, Roman W.

    2012-01-01

    Several projects have been completed or are nearing completion at the NASA Langley Research Center (LaRC) National Transonic Facility (NTF). The addition of a Model Flow-Control/Propulsion Simulation test capability to the NTF provides a unique, transonic, high-Reynolds number test capability that is well suited for research in propulsion airframe integration studies, circulation control high-lift concepts, powered lift, and cruise separation flow control. A 1992 vintage Facility Automation System (FAS) that performs the control functions for tunnel pressure, temperature, Mach number, model position, safety interlock and supervisory controls was replaced using current, commercially available components. This FAS upgrade also involved a design study for the replacement of the facility Mach measurement system and the development of a software-based simulation model of NTF processes and control systems. The FAS upgrades were validated by a post upgrade verification wind tunnel test. The data acquisition system (DAS) upgrade project involves the design, purchase, build, integration, installation and verification of a new DAS by replacing several early 1990's vintage computer systems with state of the art hardware/software. This paper provides an update on the progress made in these efforts. See reference 1.

  10. Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.

    2015-01-01

    3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.
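
    For illustration, a simplified one-dimensional version of the gamma evaluation used to compare reconstructed and planned dose is sketched below; the clinical tool operates on full 3D distributions, and the 3%/3 mm criteria and dose profiles here are assumed purely for the example.

      # 1-D gamma index sketch: for each evaluated point, the minimum combined dose-difference
      # and distance-to-agreement metric against the reference profile.
      import numpy as np

      def gamma_1d(dose_eval, dose_ref, positions, dose_tol=0.03, dist_tol=3.0):
          """Return the gamma index of each evaluated point against the reference profile."""
          ref_max = dose_ref.max()
          gammas = np.empty_like(dose_eval)
          for i, (x, d) in enumerate(zip(positions, dose_eval)):
              dose_term = (d - dose_ref) / (dose_tol * ref_max)
              dist_term = (x - positions) / dist_tol
              gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
          return gammas

      x = np.linspace(-50, 50, 201)                        # positions in mm
      planned = np.exp(-(x / 20.0) ** 2)                   # synthetic planned profile
      measured = 1.02 * np.exp(-((x - 1.0) / 20.0) ** 2)   # 2% hotter, shifted 1 mm

      gamma = gamma_1d(measured, planned, x)
      print(f"gamma pass rate (gamma <= 1): {np.mean(gamma <= 1.0):.1%}")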

  11. Framework for Evaluating Loop Invariant Detection Games in Relation to Automated Dynamic Invariant Detectors

    DTIC Science & Technology

    2015-09-01

    ... checks each of these assertions for detectability by Daikon. The checker is an Excel Visual Basic for Applications (VBA) script that checks the ...

  12. Verification of the WFAS Lightning Efficiency Map

    Treesearch

    Paul Sopko; Don Latham; Isaac Grenfell

    2007-01-01

    A Lightning Ignition Efficiency map was added to the suite of daily maps offered by the Wildland Fire Assessment System (WFAS) in 1999. This map computes a lightning probability of ignition (POI) based on the estimated fuel type, fuel depth, and 100-hour fuel moisture interpolated from the Remote Automated Weather Station (RAWS) network. An attempt to verify the...

  13. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  14. Interface Generation and Compositional Verification in JavaPathfinder

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina

    2009-01-01

    We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.

  15. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on how to optimize chip metrics such as performance, power, and area. Design functional verification is not explicitly considered at an earlier stage at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for the complex vision chip design while reducing the simulation and debugging overheads.

  16. A Roadmap for the Implementation of Continued Process Verification.

    PubMed

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data monitoring responsibilities among sites; or to assist in establishing data monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and practice associated with it is known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
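
    As a purely illustrative aside (not taken from the BPOG protocol), the routine monitoring element of continued process verification is often realized as a control chart. The sketch below estimates limits from historical batches and flags new batches that fall outside them; the quality attribute, batch values, and the 3-sigma limits are hypothetical.

      # Illustrative control-chart monitoring for continued process verification:
      # estimate limits from historical batches, then flag out-of-control new batches.
      import numpy as np

      def control_limits(historical_values, k=3.0):
          mean = np.mean(historical_values)
          sd = np.std(historical_values, ddof=1)
          return mean - k * sd, mean + k * sd

      def flag_out_of_control(new_values, limits):
          low, high = limits
          return [(i, v) for i, v in enumerate(new_values) if not (low <= v <= high)]

      historical_titre = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.95, 5.05, 5.1, 4.85]  # g/L, hypothetical CQA
      limits = control_limits(historical_titre)
      new_batches = [5.0, 5.15, 4.4, 5.05]

      print("limits:", limits)
      print("flagged batches:", flag_out_of_control(new_batches, limits))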

  17. Development and verification testing of automation and robotics for assembly of space structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level and continued development at an enhanced level is warranted.

  18. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Überlingen Model

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on­-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Ueberlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and the instructions of TCAS, an automated Traffic Alert and Collision Avoidance System on-board warning system. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew). Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an anomalous condition, as occurred during the accident. Brahms-GUeM thus implicitly defines a class of scenarios, which include as an instance what occurred at Überlingen. Brahms-GUeM is a modeling framework enabling "what if" analysis of alternative work system configurations and thus facilitating design of alternative operations concepts. It enables subsequent adaption (reusing simulation components) for modeling and simulating NextGen scenarios. This project demonstrates that BRAHMS provides the capacity to model the complexity of air transportation systems, going beyond idealized and simple flights to include for example the interaction of pilots and ATCOs. The research shows clearly that verification and validation must include the entire work system, on the one hand to check that mechanisms exist to handle failures of communication and alerting subsystems and/or failures of people to notice, comprehend, or communicate problematic (unsafe) situations; but also to understand how people must use their own judgment in relating fallible systems like TCAS to other sources of information and thus to evaluate how the unreliability of automation affects system safety. The simulation shows in particular that distributed agents (people and automated systems) acting without knowledge of each others' actions can create a complex, dynamic system whose interactive behavior is unexpected and is changing too quickly to comprehend and control.

  19. Formal Safety Certification of Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2005-01-01

    In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory-access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively constrained to our code generator and could, in principle, also be integrated with other code generators such as RealTime Workshop or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the (generated) code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligation but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself.
Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
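
    To make the idea of discharging safety conditions concrete, the sketch below encodes a simple array-bounds obligation and asks an automated solver whether it can fail. Z3 stands in for the theorem prover, and the loop, annotations, and variable names are hypothetical rather than drawn from the certification system itself.

      # Memory-safety verification condition sketch: for `for i in range(n): a[i] = ...`
      # with `len(a) == n`, the safety policy requires 0 <= i < len(a); we search for a
      # counterexample and declare the condition proved if none exists.
      from z3 import Ints, Solver, And, Not, unsat

      i, n, length = Ints("i n length")
      precondition = And(length == n, n >= 0)        # facts from annotations/code
      in_loop = And(i >= 0, i < n)                   # loop guard
      obligation = And(i >= 0, i < length)           # safety policy: index within bounds

      s = Solver()
      s.add(precondition, in_loop, Not(obligation))  # search for a counterexample
      print("safety condition proved" if s.check() == unsat else "potentially unsafe")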

  20. 77 FR 60714 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ...-0008; OMB Number 1014-0009] Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION: 30-Day notice. SUMMARY... the Notice to Lessees (NTL) on the Legacy Data Verification Process (LDVP). This notice also provides...

  1. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  2. First CLIPS Conference Proceedings, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The proceedings of the first conference on CLIPS (C Language Integrated Production System), hosted by the NASA Lyndon B. Johnson Space Center in August 1990, are presented. Articles included engineering applications, intelligent tutors and training, intelligent software engineering, automated knowledge acquisition, network applications, verification and validation, enhancements to CLIPS, space shuttle quality control/diagnosis applications, space shuttle and real-time applications, and medical, biological, and agricultural applications.

  3. Engine condition monitoring: CF6 family 60's through the 80's

    NASA Technical Reports Server (NTRS)

    Kent, H. J.; Dienger, G.

    1981-01-01

    The on-condition program is described in terms of its effectiveness as a maintenance tool, both at the line station and at home base, through the early detection of engine faults and erroneous instrumentation signals and by verification of engine health. The system encompasses all known methods from manual procedures to the fully automated airborne integrated data system.

  4. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    NASA Astrophysics Data System (ADS)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
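
    A hypothetical sketch of the co-occurrence check described above: every detected pedestrian-crossing marking should have a matching crossing sign within some distance, and vice versa, with unmatched locations flagged for review. The coordinates and the 30 m threshold are illustrative, not those of the published system.

      # Co-occurrence verification sketch: flag markings with no nearby sign and signs
      # with no nearby marking, in local x/y metre coordinates.
      import math

      def unmatched(items_a, items_b, max_dist_m=30.0):
          """Return entries of items_a with no entry of items_b within max_dist_m."""
          flagged = []
          for ax, ay in items_a:
              if not any(math.hypot(ax - bx, ay - by) <= max_dist_m for bx, by in items_b):
                  flagged.append((ax, ay))
          return flagged

      crossing_markings = [(10.0, 5.0), (120.0, 40.0), (300.0, -15.0)]
      crossing_signs = [(12.0, 7.0), (118.0, 44.0)]

      print("markings without a sign:", unmatched(crossing_markings, crossing_signs))
      print("signs without a marking:", unmatched(crossing_signs, crossing_markings))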

  5. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, a quality technician checks the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  6. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  7. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check components of the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  8. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, quality technicians check the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  9. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, a quality technician checks the control panel on hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  10. A system for intelligent teleoperation research

    NASA Technical Reports Server (NTRS)

    Orlando, N. E.

    1983-01-01

    The Automation Technology Branch of NASA Langley Research Center is developing a research capability in the field of artificial intelligence, particularly as applicable in teleoperator/robotics development for remote space operations. As a testbed for experimentation in these areas, a system concept has been developed and is being implemented. This system, termed DAISIE (Distributed Artificially Intelligent System for Interacting with the Environment), interfaces the key processes of perception, reasoning, and manipulation by linking hardware sensors and manipulators to a modular artificial intelligence (AI) software system in a hierarchical control structure. Verification experiments have been performed: one experiment used a blocksworld database and planner embedded in the DAISIE system to intelligently manipulate a simple physical environment; the other experiment implemented a joint-space collision avoidance algorithm. Continued system development is planned.

  11. Machine assisted histogram classification

    NASA Astrophysics Data System (ADS)

    Benyó, B.; Gaspar, C.; Somogyi, P.

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually using instruments, such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.
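
    A minimal sketch, not the LHCb tool, of how anomalous monitoring histograms can be flagged automatically: each histogram is compared with a known-good reference through a chi-square-like distance, and runs far from the reference are reported for expert inspection. The dead-region example and the flagging threshold below are synthetic.

      # Histogram anomaly flagging sketch: compare each run's (flattened) hitmap histogram
      # with a reference and flag runs whose chi-square-like distance is large.
      import numpy as np

      def chi2_distance(hist, reference):
          hist = hist / hist.sum()
          reference = reference / reference.sum()
          denom = hist + reference
          mask = denom > 0
          return 0.5 * np.sum((hist[mask] - reference[mask]) ** 2 / denom[mask])

      rng = np.random.default_rng(2)
      reference = rng.poisson(100, size=64).astype(float)     # reference hitmap (flattened)
      good_run = rng.poisson(100, size=64).astype(float)
      bad_run = good_run.copy()
      bad_run[20:28] = 0.0                                    # a dead region, e.g. a failing module

      for name, h in [("good run", good_run), ("bad run", bad_run)]:
          d = chi2_distance(h, reference)
          print(f"{name}: distance = {d:.4f}{'  <-- flag' if d > 0.02 else ''}")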

  12. ATM test and integration. [Skylab Apollo Telescope Mount

    NASA Technical Reports Server (NTRS)

    Moore, J. W.; Mitchell, J. R.

    1974-01-01

    The test and checkout philosophy of the test program for the Skylab ATM module and the overall test flow including in-process, post-manufacturing, vibration, thermal vacuum, and prelaunch checkout activities are described. Capabilities and limitations of the test complex and its use of automation are discussed. Experiences with the organizational principle of using a dedicated test team for all checkout activities are reported. Material on the development of the ATM subsystems, the experimental program and the requirements of the scientific community, and the integration and verification of the complex systems/subsystems of the ATM are presented. The performance of the ATM test program in such areas as alignment, systems and subsystems, contamination control, and experiment operation is evaluated. The conclusions and recommendations resulting from the ATM test program are enumerated.

  13. Inspection and Verification of Domain Models with PlanWorks and Aver

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; Frank, Jeremy; Iatauro, Michael; McGann, Conor

    2006-01-01

    When developing a domain model, it seems natural to bring the traditional informal tools of inspection and verification, debuggers and automated test suites, to bear upon the problems that will inevitably arise. Debuggers that allow inspection of registers and memory and stepwise execution have been a staple of software development of all sorts from the very beginning. Automated testing has repeatedly proven its considerable worth, to the extent that an entire design philosophy (Test Driven Development) has been developed around the writing of tests. Unfortunately, while not entirely without their uses, the limitations of these tools and the nature of the complexity of models and the underlying planning systems make the diagnosis of certain classes of problems and the verification of their solutions difficult or impossible. Debuggers provide a good local view of executing code, allowing a fine-grained look at algorithms and data. This view is, however, usually only at the level of the current scope in the implementation language, and the data-inspection capabilities of most debuggers usually consist of on-line print statements. More modern graphical debuggers offer a sort of tree view of data structures, but even this is too low-level and is often inappropriate for the kinds of structures created by planning systems. For instance, goal or constraint networks are at best awkward when visualized as trees, and any non-structural link between data structures, such as through a lookup table, isn't captured at all. Further, while debuggers have powerful breakpointing facilities that are suitable for finding specific algorithmic errors, they have little use in the diagnosis of modeling errors.

  14. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factor studies and simulation-based techniques will fall short in the face of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident since it exemplifies authority problems when receiving conflicting advice from human and automated systems.

  15. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
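
    A small, self-contained example of verification by the method of manufactured solutions mentioned above (illustrative only; Fluidity's test suite is far more extensive): a manufactured solution u(x) = sin(pi x) fixes the forcing for -u'' = f, a second-order finite-difference solver is run on successively finer grids, and the observed convergence rate is checked against the expected order.

      # Method of manufactured solutions: verify a 1-D Poisson solver against u(x) = sin(pi x),
      # for which the forcing is f(x) = pi^2 sin(pi x), and check second-order convergence.
      import numpy as np

      def solve_poisson_1d(n):
          """Solve -u'' = f on (0,1) with u(0)=u(1)=0 using n interior points."""
          h = 1.0 / (n + 1)
          x = np.linspace(h, 1.0 - h, n)
          f = np.pi ** 2 * np.sin(np.pi * x)                 # manufactured forcing
          A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
               - np.diag(np.ones(n - 1), -1)) / h ** 2
          u = np.linalg.solve(A, f)
          return x, u

      errors = []
      for n in (20, 40, 80):
          x, u = solve_poisson_1d(n)
          errors.append(np.max(np.abs(u - np.sin(np.pi * x))))

      rates = [np.log2(errors[k] / errors[k + 1]) for k in range(len(errors) - 1)]
      print("max errors:", errors)
      print("observed convergence rates (expect about 2):", rates)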

  16. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and are continuing to impact, cleaning and verification processing of components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-Dichloroethylene), and HFE 7100DE (HFE/1,2-Dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  17. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives on: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Because of their different stakeholders, these tasks vary widely in importance, lifespan and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.

  18. Java PathFinder User Guide

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    1999-01-01

    The JAVA PATHFINDER, JPF, is a translator from a subset of JAVA 1.0 to PROMELA, the modeling language of the SPIN model checker. The purpose of JPF is to establish a framework for verification and debugging of JAVA programs based on model checking. The main goal is to automate program verification such that a programmer can apply it in daily work, without needing a specialist to manually reformulate a program into a different notation in order to analyze it. The system is especially suited for analyzing multi-threaded JAVA applications, where normal testing usually falls short. The system can find deadlocks and violations of boolean assertions stated by the programmer in a special assertion language. This document explains how to use JPF.

  19. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu, J; Washington University in St Louis, St Louis, MO; Li, H. Harold

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, which is inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization, based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limit parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
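
    A minimal sketch of this kind of pipeline is shown below. It assumes scikit-image and SciPy, uses a stock test image rather than RT setup imagery, and replaces the interior-point optimization with a coarse grid search; the parameter names and values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import data, exposure

def enhance(image, sigma, kernel_size, clip_limit):
    """High-pass sharpening followed by CLAHE, mirroring the described pipeline."""
    img = image.astype(float) / image.max()
    highpass = img - gaussian_filter(img, sigma)        # subtract the Gaussian-smoothed image
    sharpened = np.clip(img + highpass, 0.0, 1.0)       # add the detail back, stay in [0, 1]
    return exposure.equalize_adapthist(sharpened, kernel_size=kernel_size,
                                       clip_limit=clip_limit)

def entropy(img, bins=256):
    """Shannon entropy of the intensity histogram (the optimization objective)."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

image = data.camera()                                   # stand-in for a 2D RT setup image
candidates = [(s, k, c) for s in (2, 4, 8) for k in (32, 64) for c in (0.01, 0.02, 0.05)]
best = max(candidates, key=lambda p: entropy(enhance(image, *p)))
print("best (sigma, kernel_size, clip_limit):", best)
```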

  20. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  1. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    PubMed Central

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volumetric-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  2. Payload and Components Real-Time Automated Test System (PACRATS), Data Acquisition of Leak Rate and Pressure Data Test Procedure

    NASA Technical Reports Server (NTRS)

    Rinehart, Maegan L.

    2011-01-01

    The purpose of this activity is to provide the Mechanical Components Test Facility (MCTF) with the capability to obtain electronic leak test and proof pressure data. Payload and Components Real-Time Automated Test System (PACRATS) data acquisition software will be utilized to display real-time data and will record leak rates and pressure/vacuum level(s) simultaneously. This added functionality will provide electronic leak test and pressure data at specified sampling frequencies. Electronically stored data will provide ES61 with increased data security, analysis, and accuracy. The tasks performed in this procedure are to verify PACRATS only, and are not intended to provide verification of MCTF equipment.

  3. The Environmental Control and Life Support System (ECLSS) advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

    The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.

  4. Model Based Verification of Cyber Range Event Environments

    DTIC Science & Technology

    2015-11-13


  5. Aqueous cleaning and verification processes for precision cleaning of small parts

    NASA Technical Reports Server (NTRS)

    Allen, Gale J.; Fishell, Kenneth A.

    1995-01-01

    The NASA Kennedy Space Center (KSC) Materials Science Laboratory (MSL) has developed a totally aqueous process for precision cleaning and verification of small components. In 1990 the Precision Cleaning Facility at KSC used approximately 228,000 kg (500,000 lbs) of chlorofluorocarbon (CFC) 113 in the cleaning operations. It is estimated that current CFC 113 usage has been reduced by 75 percent and it is projected that a 90 percent reduction will be achieved by the end of calendar year 1994. The cleaning process developed utilizes aqueous degreasers, aqueous surfactants, and ultrasonics in the cleaning operation and an aqueous surfactant, ultrasonics, and Total Organic Carbon Analyzer (TOCA) in the nonvolatile residue (NVR) and particulate analysis for verification of cleanliness. The cleaning and verification process is presented in its entirety, with comparison to the CFC 113 cleaning and verification process, including economic and labor costs/savings.

  6. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program, 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program, and 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  7. Simplifying operations with an uplink/downlink integration toolkit

    NASA Technical Reports Server (NTRS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
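
    The pairing of predicted uplink events with later downlinked values can be sketched roughly as below; the channel names, record layout, and one-way light time are hypothetical, and this is not the OEL toolkit's actual interface.

```python
# Hedged sketch: pair each predicted SOE command with the first telemetry verification
# point of the same identifier that arrives after the expected one-way light time, so
# uplink predicts and downlink actuals appear on one closed-loop row.
OWLT = 4100.0   # one-way light time in seconds (assumed value)

def close_the_loop(soe_events, telemetry):
    """soe_events: [(time, channel, predicted)]; telemetry: [(receipt_time, channel, value)]."""
    rows = []
    for t_cmd, channel, predicted in soe_events:
        actual = next((v for t_rx, ch, v in telemetry
                       if ch == channel and t_rx >= t_cmd + OWLT), None)
        rows.append((t_cmd, channel, predicted, actual))
    return rows

print(close_the_loop([(0.0, "CAT_BED_HTR", "ON")],
                     [(3000.0, "CAT_BED_HTR", "OFF"), (4500.0, "CAT_BED_HTR", "ON")]))
```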

  8. Simplifying operations with an uplink/downlink integration toolkit

    NASA Astrophysics Data System (ADS)

    Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine

    1994-11-01

    The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.

  9. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 2 2011-07-01 2011-07-01 false What is the Platform Verification Program? 250... Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  10. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  11. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  12. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  13. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  14. Assessing the viability of `over-the-loop' real-time short-to-medium range ensemble streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, E.; Mendoza, P. A.; Nijssen, B.; Newman, A. J.; Clark, M. P.; Arnold, J.; Nowak, K. C.

    2016-12-01

    Many, if not most, national operational short-to-medium range streamflow prediction systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow are automated, but others require the hands-on effort of an experienced human forecaster. This approach evolved out of the need to correct for deficiencies in the models and datasets that were available for forecasting, and often leads to skillful predictions despite the use of relatively simple, conceptual models. On the other hand, the process is not reproducible, which limits opportunities to assess and incorporate process variations, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast ensembles and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, the operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as such systems are being rolled out in major operational forecasting centers. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike. To address this need, the National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis, Research, and Prediction' (SHARP) to implement, assess and demonstrate real-time over-the-loop forecasts. We present early hindcast and verification results from SHARP for short-to-medium range streamflow forecasts in a number of US case study watersheds.

  15. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be the solution to such situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process, and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause extra communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of the malicious sensors when sensors in the network have five or more neighbors.
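
    The flavor of such a check can be illustrated with a generic distance-consistency test (an illustrative sketch only, not the MSRLV protocol): a verifier accepts a neighbor's claimed coordinates when the distance implied by the claim agrees with an independently measured range.

```python
# Generic illustration only (not the MSRLV protocol): a verifier accepts a claimed
# location if the distance implied by the claim matches an independently measured
# range (e.g. from time-of-flight ranging) to within a tolerance.
import math

def verify_claim(verifier_pos, claimed_pos, measured_range, tolerance=1.5):
    dx = claimed_pos[0] - verifier_pos[0]
    dy = claimed_pos[1] - verifier_pos[1]
    implied = math.hypot(dx, dy)
    return abs(implied - measured_range) <= tolerance

print(verify_claim((0.0, 0.0), (30.0, 40.0), measured_range=50.4))   # True: claim is plausible
print(verify_claim((0.0, 0.0), (30.0, 40.0), measured_range=80.0))   # False: likely a false claim
```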

  16. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimates in some situations. Location verification can be the solution to such situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process, and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause extra communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of the malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  17. Automated Verification of Spatial Resolution in Remotely Sensed Imagery

    NASA Technical Reports Server (NTRS)

    Davis, Bruce; Ryan, Robert; Holekamp, Kara; Vaughn, Ronald

    2011-01-01

    Image spatial resolution characteristics can vary widely among sources. In the case of aerial-based imaging systems, the image spatial resolution characteristics can even vary between acquisitions. In these systems, aircraft altitude, speed, and sensor look angle all affect image spatial resolution. Image spatial resolution needs to be verified with estimators that include the ground sample distance (GSD), the modulation transfer function (MTF), and the relative edge response (RER), all of which are key components of image quality, along with signal-to-noise ratio (SNR) and dynamic range. Knowledge of spatial resolution parameters is important to determine if features of interest are distinguishable in imagery or associated products, and to develop image restoration algorithms. An automated Spatial Resolution Verification Tool (SRVT) was developed to rapidly determine the spatial resolution characteristics of remotely sensed aerial and satellite imagery. Most current methods for assessing spatial resolution characteristics of imagery rely on pre-deployed engineered targets and are performed only at selected times within preselected scenes. The SRVT addresses these insufficiencies by finding uniform, high-contrast edges from urban scenes and then using these edges to determine standard estimators of spatial resolution, such as the MTF and the RER. The SRVT was developed using the MATLAB programming language and environment. This automated software algorithm assesses every image in an acquired data set, using edges found within each image, and in many cases eliminating the need for dedicated edge targets. The SRVT automatically identifies high-contrast, uniform edges and calculates the MTF and RER of each image, and when possible, within sections of an image, so that the variation of spatial resolution characteristics across the image can be analyzed. The automated algorithm is capable of quickly verifying the spatial resolution quality of all images within a data set, enabling the appropriate use of those images in a number of applications.
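
    A simplified stand-in for one of these estimators (not the SRVT itself) is sketched below: the relative edge response is taken from an image crop containing a near-vertical edge by averaging rows into an edge spread function and differencing it half a pixel either side of the edge location.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def relative_edge_response(edge_crop):
    """Estimate RER from an image crop containing a near-vertical dark-to-bright edge."""
    esf = edge_crop.mean(axis=0).astype(float)              # edge spread function (row average)
    esf = (esf - esf.min()) / (esf.max() - esf.min())        # normalize to [0, 1]
    lsf = np.gradient(esf)                                   # line spread function
    center = np.argmax(np.abs(lsf))                          # edge location (sub-pixel step omitted)
    x = np.arange(esf.size) - center
    # RER = ESF(+0.5) - ESF(-0.5): the rise across one pixel centered on the edge
    return np.interp(0.5, x, esf) - np.interp(-0.5, x, esf)

# Synthetic blurred edge: a step convolved with a small Gaussian stands in for an image crop.
step = np.tile(np.r_[np.zeros(16), np.ones(16)], (8, 1))
crop = gaussian_filter1d(step, sigma=1.2, axis=1)
print(round(relative_edge_response(crop), 3))                # sharper edges give values nearer 1
# The MTF could similarly be estimated from the Fourier transform of the (windowed) LSF.
```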

  18. Student-Teacher Linkage Verification: Model Process and Recommendations

    ERIC Educational Resources Information Center

    Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.

    2012-01-01

    As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…

  19. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  20. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
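
    The kind of computerized screening routine envisioned can be sketched as follows; the thresholds and criteria here are assumed for illustration rather than taken from the report.

```python
# A minimal sketch, under assumed screening criteria, of a computerized verification
# routine: flag out-of-range values, abrupt jumps, and flatlined sensors in a streamflow
# record before it reaches user-accessible files.
import numpy as np

def screen_streamflow(q, q_min=0.0, q_max=5000.0, max_jump=3.0, flatline_len=24):
    q = np.asarray(q, dtype=float)
    flags = {}
    flags["out_of_range"] = np.where((q < q_min) | (q > q_max))[0]
    ratio = q[1:] / np.where(q[:-1] > 0, q[:-1], np.nan)
    flags["abrupt_jump"] = np.where(ratio > max_jump)[0] + 1
    # flag indices where the value has not changed for flatline_len consecutive steps
    count, flat = 0, []
    for i, unchanged in enumerate(np.diff(q) == 0):
        count = count + 1 if unchanged else 0
        if count >= flatline_len:
            flat.append(i + 1)
    flags["flatline"] = np.array(flat)
    return flags

flags = screen_streamflow([10, 11, 10, 200, 10, 10, 10, 10])
print(flags["abrupt_jump"])    # index 3: a 20x jump between consecutive values
```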

  1. Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.

    PubMed

    Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil

    2015-07-17

    In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.

  2. Improved cyberinfrastructure for integrated hydrometeorological predictions within the fully-coupled WRF-Hydro modeling system

    NASA Astrophysics Data System (ADS)

    Gochis, David; Hooper, Rick; Parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh

    2014-05-01

    The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth and computational resource use. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, an emphasis on the implementation and use of community data standards for data exchange is made.

  3. A rule-based system for real-time analysis of control systems

    NASA Astrophysics Data System (ADS)

    Larson, Richard R.; Millard, D. Edward

    1992-10-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
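
    A minimal sketch of such a rule-driven monitor is shown below, assuming hypothetical telemetry channel names and limits; rules map the latest data frame to messages that are pushed onto a display stack when their conditions fire.

```python
# A minimal sketch of a rule-driven monitor (channel names and limits are hypothetical):
# each rule maps the latest telemetry frame to a message appended to a display stack.
RULES = [
    ("HYD PRESS LOW",          lambda f: f["hyd_press"] < 2800.0),
    ("EGT OVER LIMIT",         lambda f: f["egt"] > 950.0),
    ("AOA SENSOR MISCOMPARE",  lambda f: abs(f["aoa_l"] - f["aoa_r"]) > 2.0),
]

message_stack = []

def evaluate(frame):
    """Apply every rule to the latest frame; fired rules append a time-tagged message."""
    for text, condition in RULES:
        if condition(frame):
            message_stack.append((frame["time"], text))

evaluate({"time": 123.4, "hyd_press": 2650.0, "egt": 940.0, "aoa_l": 4.1, "aoa_r": 7.0})
print(message_stack)   # [(123.4, 'HYD PRESS LOW'), (123.4, 'AOA SENSOR MISCOMPARE')]
```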

  4. A rule-based system for real-time analysis of control systems

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Millard, D. Edward

    1992-01-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic for real-time data. This technique has been applied to ground verification and validation testing and to flight test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  5. Compromises produced by the dialectic between self-verification and self-enhancement.

    PubMed

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  6. Advanced Plant Habitat Flight Unit #1

    NASA Image and Video Library

    2017-07-24

    Inside a laboratory in the Space Station Processing Facility at NASA's Kennedy Space Center in Florida, LED plant growth lights are being checked out on the hardware for the Advanced Plant Habitat flight unit. The flight unit is an exact replica of the APH that was delivered to the International Space Station. Validation tests and post-delivery checkout were performed to prepare for space station in-orbit APH activities. The flight unit will be moved to the International Space Station Environmental Simulator to begin an experiment verification test for the science that will fly on the first mission, PH-01. Developed by NASA and ORBITEC of Madison, Wisconsin, the APH is the largest plant chamber built for the agency. It is a fully automated plant growth facility that will be used to conduct bioscience research on the space station.

  7. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

    Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, owing to the difficulty of retrospective collection of representative ground control data. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was made based upon comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique achieves classification results in closer agreement with the airphoto estimates than does stepwise discriminant analysis.

  8. Bay of Fundy verification of a system for multidate Landsat measurement of suspended sediment

    NASA Technical Reports Server (NTRS)

    Munday, J. C., Jr.; Afoldi, T. T.; Amos, C. L.

    1981-01-01

    A system for automated multidate Landsat CCT MSS measurement of suspended sediment concentration (S) has been implemented and verified on nine sets (108 points) of data from the Bay of Fundy, Canada. The system employs 'chromaticity analysis' to provide automatic pixel-by-pixel adjustment of atmospheric variations, permitting reference calibration data from one or several dates to be spatially and temporally extrapolated to other regions and to other dates. For verification, each data set was used in turn as test data against the remainder as a calibration set: the average absolute error was 44 percent of S over the range 1-1000 mg/l. The system can be used to measure chlorophyll (in the absence of atmospheric variations), Secchi disk depth, and turbidity.
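
    A minimal sketch of the normalization underlying chromaticity analysis is given below (illustrative only, not the published system): band values are divided by their sum over bands, which cancels multiplicative, common-mode brightness changes from pixel to pixel, so calibration relationships expressed in this space transfer more readily between regions and dates.

```python
# Illustrative sketch only: "chromaticity" coordinates are band values normalized by the
# sum over bands, which cancels multiplicative, common-mode brightness changes.
import numpy as np

def chromaticity(bands):
    """bands: (n_pixels, n_bands) MSS digital counts -> per-pixel chromaticity coordinates."""
    bands = np.asarray(bands, dtype=float)
    return bands / bands.sum(axis=1, keepdims=True)

# A uniform 20% brightness increase in every band leaves the coordinates unchanged.
pixel = np.array([[12.0, 30.0, 28.0, 10.0]])
print(np.allclose(chromaticity(pixel), chromaticity(1.2 * pixel)))   # True
```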

  9. An unattended verification station for UF6 cylinders: Field trial findings

    NASA Astrophysics Data System (ADS)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.

    2017-12-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  10. An automated system for pulmonary function testing

    NASA Technical Reports Server (NTRS)

    Mauldin, D. G.

    1974-01-01

    An experiment to quantitate pulmonary function was accepted for the space shuttle concept verification test. The single-breath maneuver and the nitrogen washout are combined to reduce the test time. Parameters are defined from the forced vital capacity maneuvers. A spirometer measures the breath volume and a magnetic sector mass spectrometer provides definition of gas composition. Mass spectrometer and spirometer data are analyzed by a PDP-81 digital computer.

  11. Regional Seismic Arrays and Nuclear Test Ban Verification

    DTIC Science & Technology

    1990-12-01

    Estimation has been difficult to automate, at least for regional and teleseismic signals; a neural network approach might be applicable here, making use of trained neural networks. Of the 95 events examined, 66 were selected for the classification study based on high signal-to-noise ratio.

  12. Motion Planning of Two Stacker Cranes in a Large-Scale Automated Storage/Retrieval System

    NASA Astrophysics Data System (ADS)

    Kung, Yiheng; Kobayashi, Yoshimasa; Higashi, Toshimitsu; Ota, Jun

    We propose a method for reducing the computational time of motion planning for stacker cranes. Most automated storage/retrieval systems (AS/RSs) are equipped with only one stacker crane. However, greater work efficiency is increasingly required in warehouses, for example by operating two stacker cranes. In this paper, a warehouse with two stacker cranes working simultaneously is considered. Unlike warehouses with only one crane, trajectory planning for two cranes is very difficult: since the cranes work together, a proper trajectory must be planned to avoid collision, and verifying collisions is complicated and requires a considerable amount of computational time. Because transport requests in AS/RSs occur at random, motion planning cannot be conducted in advance, so an appropriate trajectory must be planned within a restricted duration. We thereby address the problem of motion planning requiring extensive calculation time. As a solution, we propose a "free-step" to simplify the procedure of collision verification and reduce the computational time. We also propose a method to reschedule the order of collision verification in order to find an appropriate trajectory in less time. With the proposed method, the calculation time is reduced to less than 1/300 of that reported in previous research.
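
    A coarse version of the collision check at the heart of this problem can be sketched as follows (illustrative only, not the paper's free-step method): each crane's planned trajectory along the shared rail is sampled in time and a minimum separation is required at every step.

```python
# Illustrative sketch, not the paper's method: a coarse collision check for two cranes
# sharing one rail, sampling each crane's planned 1-D trajectory over time.
def position(traj, t):
    """Piecewise-linear interpolation of a trajectory given as (time, position) pairs."""
    for (t0, x0), (t1, x1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            return x0 + (x1 - x0) * (t - t0) / (t1 - t0)
    return traj[-1][1]

def collision_free(traj_a, traj_b, min_gap=2.0, dt=0.1):
    t_end = max(traj_a[-1][0], traj_b[-1][0])
    steps = int(t_end / dt) + 1
    return all(abs(position(traj_a, i * dt) - position(traj_b, i * dt)) >= min_gap
               for i in range(steps))

# Crane A moves 0 -> 30 m while crane B moves 60 -> 40 m; require at least 2 m of separation.
print(collision_free([(0.0, 0.0), (20.0, 30.0)], [(0.0, 60.0), (20.0, 40.0)]))   # True
```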

  13. Automated Storm Tracking and the Lightning Jump Algorithm Using GOES-R Geostationary Lightning Mapper (GLM) Proxy Data.

    PubMed

    Schultz, Elise V; Schultz, Christopher J; Carey, Lawrence D; Cecil, Daniel J; Bateman, Monte

    2016-01-01

    This study develops a fully automated lightning jump system encompassing objective storm tracking, Geostationary Lightning Mapper proxy data, and the lightning jump algorithm (LJA), which are important elements in the transition of the LJA concept from a research to an operational based algorithm. Storm cluster tracking is based on a product created from the combination of a radar parameter (vertically integrated liquid, VIL), and lightning information (flash rate density). Evaluations showed that the spatial scale of tracked features or storm clusters had a large impact on the lightning jump system performance, where increasing spatial scale size resulted in decreased dynamic range of the system's performance. This framework will also serve as a means to refine the LJA itself to enhance its operational applicability. Parameters within the system are isolated and the system's performance is evaluated with adjustments to parameter sensitivity. The system's performance is evaluated using the probability of detection (POD) and false alarm ratio (FAR) statistics. Of the algorithm parameters tested, sigma-level (metric of lightning jump strength) and flash rate threshold influenced the system's performance the most. Finally, verification methodologies are investigated. It is discovered that minor changes in verification methodology can dramatically impact the evaluation of the lightning jump system.
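
    The core of a sigma-level jump test can be sketched as below; the window length, time step, and threshold are illustrative assumptions rather than the operational settings.

```python
# Hedged sketch of a "sigma-level" lightning jump test: a jump is flagged when the current
# rate of change of flash rate exceeds the recent mean by sigma_threshold standard deviations.
# All thresholds and window lengths here are illustrative assumptions, not the paper's values.
import numpy as np

def lightning_jump(flash_rates, dt_minutes=2.0, sigma_threshold=2.0, history=5):
    """flash_rates: total flash rate per interval for a tracked storm cluster (most recent last)."""
    dfrdt = np.diff(np.asarray(flash_rates, dtype=float)) / dt_minutes
    if dfrdt.size < history + 1:
        return False, 0.0
    past, current = dfrdt[-(history + 1):-1], dfrdt[-1]
    sigma = (current - past.mean()) / (past.std() + 1e-9)
    return sigma >= sigma_threshold, sigma

flagged, sigma = lightning_jump([4, 5, 6, 6, 7, 8, 20])
print(flagged, round(sigma, 1))   # True 28.0
```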

  14. Automated Storm Tracking and the Lightning Jump Algorithm Using GOES-R Geostationary Lightning Mapper (GLM) Proxy Data

    NASA Technical Reports Server (NTRS)

    Schultz, Elise; Schultz, Christopher Joseph; Carey, Lawrence D.; Cecil, Daniel J.; Bateman, Monte

    2016-01-01

    This study develops a fully automated lightning jump system encompassing objective storm tracking, Geostationary Lightning Mapper proxy data, and the lightning jump algorithm (LJA), which are important elements in the transition of the LJA concept from a research to an operational based algorithm. Storm cluster tracking is based on a product created from the combination of a radar parameter (vertically integrated liquid, VIL), and lightning information (flash rate density). Evaluations showed that the spatial scale of tracked features or storm clusters had a large impact on the lightning jump system performance, where increasing spatial scale size resulted in decreased dynamic range of the system's performance. This framework will also serve as a means to refine the LJA itself to enhance its operational applicability. Parameters within the system are isolated and the system's performance is evaluated with adjustments to parameter sensitivity. The system's performance is evaluated using the probability of detection (POD) and false alarm ratio (FAR) statistics. Of the algorithm parameters tested, sigma-level (metric of lightning jump strength) and flash rate threshold influenced the system's performance the most. Finally, verification methodologies are investigated. It is discovered that minor changes in verification methodology can dramatically impact the evaluation of the lightning jump system.

  15. Automated Storm Tracking and the Lightning Jump Algorithm Using GOES-R Geostationary Lightning Mapper (GLM) Proxy Data

    PubMed Central

    SCHULTZ, ELISE V.; SCHULTZ, CHRISTOPHER J.; CAREY, LAWRENCE D.; CECIL, DANIEL J.; BATEMAN, MONTE

    2017-01-01

    This study develops a fully automated lightning jump system encompassing objective storm tracking, Geostationary Lightning Mapper proxy data, and the lightning jump algorithm (LJA), which are important elements in the transition of the LJA concept from a research to an operational based algorithm. Storm cluster tracking is based on a product created from the combination of a radar parameter (vertically integrated liquid, VIL), and lightning information (flash rate density). Evaluations showed that the spatial scale of tracked features or storm clusters had a large impact on the lightning jump system performance, where increasing spatial scale size resulted in decreased dynamic range of the system’s performance. This framework will also serve as a means to refine the LJA itself to enhance its operational applicability. Parameters within the system are isolated and the system’s performance is evaluated with adjustments to parameter sensitivity. The system’s performance is evaluated using the probability of detection (POD) and false alarm ratio (FAR) statistics. Of the algorithm parameters tested, sigma-level (metric of lightning jump strength) and flash rate threshold influenced the system’s performance the most. Finally, verification methodologies are investigated. It is discovered that minor changes in verification methodology can dramatically impact the evaluation of the lightning jump system. PMID:29303164

  16. Passive coherent location system simulation and evaluation

    NASA Astrophysics Data System (ADS)

    Slezák, Libor; Kvasnička, Michael; Pelant, Martin; Vávra, Jiří; Plšek, Radek

    2006-02-01

    Passive Coherent Location (PCL) is becoming an important and promising approach to the passive location of non-cooperative and stealth targets. It works with sources of illumination of opportunity. PCL is intended to be a part of the mobile Air Command and Control System (ACCS) as a Deployable ACCS Component (DAC). The company ERA has been working on a PCL system parameter verification program, through development of a complete PCL simulator, since 2003. The Czech DoD participates financially in this program. Moving-target scenarios, RCS calculation by the method of moments, ground clutter scattering, and the signal processing method (the bottleneck of PCL) are available in the simulator tool to date. The digital signal processing (DSP) algorithms are exercised both on simulated data and on real data measured by the NATO C3 Agency in their experiment in The Hague. The Institute of Information Theory and Automation of the Academy of Sciences of the Czech Republic takes part in the implementation of the DSP algorithms in FPGA. The paper describes the simulator and signal processing structure, and presents results on both simulated and measured data.

  17. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  18. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated that NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to the ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met. The Program levies a minimum requirement set (down from 1000s to 100s) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification process is being followed. NASA participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  19. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications

    USGS Publications Warehouse

    Zlotnik, V.A.; McGuire, V.L.

    1998-01-01

    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
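
    An automated curve-matching step of the kind described can be sketched with a generic damped-oscillation fit (a hedged illustration, not the authors' modified Springer-Gelhar identification code); the fitted damping and frequency would then feed the conductivity estimate.

```python
# Hedged sketch: fit a damped-oscillation model to a synthetic oscillatory slug-test
# response with an automated curve match. The model form and values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def response(t, h0, damping, omega):
    """Normalized head response for an underdamped (oscillatory) slug test."""
    return h0 * np.exp(-damping * t) * np.cos(omega * t)

# Synthetic "measured" response standing in for a double-packer slug test record.
t = np.linspace(0.0, 20.0, 400)
rng = np.random.default_rng(0)
observed = response(t, 0.30, 0.25, 1.8) + rng.normal(0.0, 0.005, t.size)

# Initial guesses are taken near the expected scale of the response (an assumption).
params, _ = curve_fit(response, t, observed, p0=(0.25, 0.2, 1.7))
print(dict(zip(("h0", "damping", "omega"), np.round(params, 3))))
```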

  20. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  1. National Scale Monitoring Reporting and Verification of Deforestation and Forest Degradation in Guyana

    NASA Astrophysics Data System (ADS)

    Bholanath, P.; Cort, K.

    2015-04-01

    Monitoring deforestation and forest degradation at the national scale has been identified as a national priority under Guyana's REDD+ Programme. Based on Guyana's MRV (Monitoring Reporting and Verification) System Roadmap developed in 2009, Guyana sought to establish a comprehensive, national system to monitor, report and verify forest carbon emissions resulting from deforestation and forest degradation in Guyana. To date, four national annual assessments have been conducted: 2010, 2011, 2012 and 2013. Monitoring of forest change in 2010 was completed with medium resolution imagery, mainly Landsat 5. In 2011, the assessment was conducted using a combination of Landsat (5 and 7) and, for the first time, 5 m high resolution imagery, with RapidEye coverage for approximately half of Guyana where the majority of land use changes were taking place. Forest change in 2013 was determined using high resolution imagery for the whole of Guyana. The current method is an automation-assisted process of careful, systematic manual interpretation of satellite imagery to identify deforestation based on different drivers of change. The minimum mapping unit (MMU) for deforestation is 1 ha (Guyana's forest definition), with a country-specific MMU of 0.25 ha for degradation. The total forested area of Guyana is estimated as 18.39 million hectares (ha). In 2012, as planned, Guyana's forest area was re-evaluated using RapidEye 5 m imagery. Deforestation in 2013 is estimated at 12,733 ha, which equates to a total deforestation rate of 0.068%. Significant progress was made in 2012 and 2013 in mapping forest degradation. The area of forest degradation as measured by interpretation of 5 m RapidEye satellite imagery in 2013 was 4,352 ha. All results are subject to accuracy assessment and independent third party verification.

  2. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
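    For readers unfamiliar with SMT solving, the fragment below shows the basic check-satisfiability workflow using Z3's publicly available Python bindings (the z3-solver package). The toy constraints mix arithmetic and array theory, two of the theories mentioned above; they are illustrative only and unrelated to the verification systems named in the abstract.

```python
from z3 import Ints, Array, IntSort, Select, Store, Solver, sat

x, y, i = Ints("x y i")
a = Array("a", IntSort(), IntSort())

s = Solver()
# A small mix of arithmetic and array theory.
s.add(x > 2, y < 10, x + 2 * y == 7)
s.add(Select(Store(a, i, x), i) == x)  # an instance of the read-over-write axiom

if s.check() == sat:
    print(s.model())      # a satisfying assignment, e.g. values for x and y
else:
    print("unsatisfiable")
```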

  3. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
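    As a rough illustration of the grouping idea (not the authors' algorithm), the sketch below clusters rules that share working-memory facts into connected components using a union-find structure; the rule and fact names are invented.

```python
from collections import defaultdict

# Hypothetical rules mapped to the facts they read or write.
rules = {
    "check_pressure": {"tank_pressure", "valve_state"},
    "open_valve":     {"valve_state", "valve_cmd"},
    "log_telemetry":  {"telemetry_buffer"},
    "flush_buffer":   {"telemetry_buffer", "downlink_ready"},
}

parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

# Rules that touch a common fact end up in the same group ("firewalled" unit).
fact_to_rules = defaultdict(list)
for rule, facts in rules.items():
    for fact in facts:
        fact_to_rules[fact].append(rule)
for users in fact_to_rules.values():
    for other in users[1:]:
        union(users[0], other)

groups = defaultdict(list)
for rule in rules:
    groups[find(rule)].append(rule)
print(list(groups.values()))
# expected: [['check_pressure', 'open_valve'], ['log_telemetry', 'flush_buffer']]
```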

  4. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  5. Design and dosimetry of a few leaf electron collimator for energy modulated electron therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Yahya, Khalid; Verhaegen, Frank; Seuntjens, Jan

    2007-12-15

    Despite the capability of energy modulated electron therapy (EMET) to achieve highly conformal dose distributions in superficial targets, it has not been widely implemented due to problems inherent in electron beam radiotherapy, such as planning, dosimetry accuracy, and verification, as well as a lack of systems for automated delivery. In previous work we proposed a novel technique to deliver EMET using an automated 'few leaf electron collimator' (FLEC) that consists of four motor-driven leaves fitted into a standard clinical electron beam applicator. Integrated with a Monte Carlo based optimization algorithm that utilizes patient-specific dose kernels, treatment delivery was incorporated within the linear accelerator operation. The FLEC was envisioned to work as an accessory tool added to the clinical accelerator. In this article the design and construction of the FLEC prototype that matches our compact design goals are presented. It is controlled using an in-house developed EMET controller. The structure of the software and the hardware characteristics of the EMET controller are demonstrated. Using a parallel plate ionization chamber, output measurements were obtained to validate the Monte Carlo calculations for a range of fields with different energies and sizes. Further verification was also performed by comparing 1-D and 2-D dose distributions using energy-independent radiochromic films. Comparisons between Monte Carlo calculations and measurements of complex intensity map deliveries show an overall agreement to within ±3%. This work confirms our design objectives for the FLEC, which allow for automated delivery of EMET. Furthermore, the Monte Carlo dose calculation engine required for EMET planning was validated. The result supports the potential of the prototype FLEC for the planning and delivery of EMET.

  6. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    PubMed Central

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer. Conclusions We developed a semi-automated process for determining a list of putative biomarkers for breast and lung cancer. New knowledge is presented in the form of biomarker lists; ranked, newly discovered biomarker-disease-biofluid relationships; and biomarker specificity across biofluids. PMID:25379168
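    A toy version of the frequency-based ranking step described above can be written with a two-proportion z-score comparing how often a candidate term occurs in the positive versus negative abstract sets. The term names and hit counts below are invented; only the set sizes echo the abstract.

```python
import math

def zscore(pos_hits, pos_total, neg_hits, neg_total):
    """Two-proportion z-score: is the term enriched in positive abstracts?"""
    p1 = pos_hits / pos_total
    p2 = neg_hits / neg_total
    p = (pos_hits + neg_hits) / (pos_total + neg_total)
    se = math.sqrt(p * (1 - p) * (1 / pos_total + 1 / neg_total))
    return (p1 - p2) / se if se > 0 else 0.0

# Set sizes from the abstract (breast cancer); candidate counts are invented.
pos_total, neg_total = 34_296, 2_653_396
candidates = {"HER2": (410, 1_200), "ALB": (55, 9_800), "TP53": (620, 4_100)}

ranked = sorted(
    ((zscore(ph, pos_total, nh, neg_total), term) for term, (ph, nh) in candidates.items()),
    reverse=True,
)
for z, term in ranked:
    print(f"{term}: z = {z:.1f}")
```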

  7. The Mediation of Mothers’ Self-Fulfilling Effects on Their Children’s Alcohol Use: Self-Verification, Informational Conformity and Modeling Processes

    PubMed Central

    Madon, Stephanie; Guyll, Max; Buller, Ashley A.; Scherr, Kyle C.; Willard, Jennifer; Spoth, Richard

    2010-01-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother–child dyads (N1 = 487; N2 = 287). Children's alcohol use was the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. PMID:18665708

  8. The mediation of mothers' self-fulfilling effects on their children's alcohol use: self-verification, informational conformity, and modeling processes.

    PubMed

    Madon, Stephanie; Guyll, Max; Buller, Ashley A; Scherr, Kyle C; Willard, Jennifer; Spoth, Richard

    2008-08-01

    This research examined whether self-fulfilling prophecy effects are mediated by self-verification, informational conformity, and modeling processes. The authors examined these mediational processes across multiple time frames with longitudinal data obtained from two samples of mother-child dyads (N1 = 486; N2 = 287), with children's alcohol use as the outcome variable. The results provided consistent support for the mediational process of self-verification. In both samples and across several years of adolescence, there was a significant indirect effect of mothers' beliefs on children's alcohol use through children's self-assessed likelihood of drinking alcohol in the future. Comparatively less support was found for informational conformity and modeling processes as mediators of mothers' self-fulfilling effects. The potential for self-fulfilling prophecies to produce long-lasting changes in targets' behavior via self-verification processes is discussed. (c) 2008 APA, all rights reserved

  9. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    DTIC Science & Technology

    2012-07-16

    Hypervisors are a popular mechanism for implementing software virtualization. Since hypervisors execute at a very high privilege level, they must be secure. A fundamental security... using the CBMC model checker. CBMC verified XMHF's implementation, about 4700 lines of C code, in about 80 seconds using less than 2GB of RAM.

  10. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  11. Human-Agent Teaming for Multi-Robot Control: A Literature Review

    DTIC Science & Technology

    2013-02-01

    neurophysiological devices are becoming more cost effective and less invasive, future systems will most likely take advantage of this technology to monitor...Parasuraman et al., 1993). It has also been reported that both the cost of automation errors and the cost of verification affect humans’ reliance on...decision aids, and the effects are also moderated by age (Ezer et al., 2008). Generally, reliance is reduced as the cost of error increases and it

  12. System diagnostic builder

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Burke, Roger

    1992-01-01

    The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using some number of different data sets, such as a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
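    The abstract's use of fuzzy sets for data points that are not clearly one classification or another can be illustrated with a triangular membership function. This is a generic sketch under assumed class boundaries, not the SDB's trained rules.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical classifications of a telemetry attribute, e.g. a rate error.
classes = {
    "nominal":  lambda v: triangular(v, -1.0, 0.0, 1.0),
    "degraded": lambda v: triangular(v, 0.5, 1.5, 2.5),
    "failed":   lambda v: triangular(v, 2.0, 3.5, 5.0),
}

value = 1.2
memberships = {name: f(value) for name, f in classes.items()}
best = max(memberships, key=memberships.get)
print(memberships, "->", best)   # the evaluator would report the strongest class
```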

  13. Image quality specification and maintenance for airborne SAR

    NASA Astrophysics Data System (ADS)

    Clinard, Mark S.

    2004-08-01

    Specification, verification, and maintenance of image quality over the lifecycle of an operational airborne SAR begin with the specification for the system itself. Verification of image quality-oriented specification compliance can be enhanced by including a specification requirement that a vendor provide appropriate imagery at the various phases of the system life cycle. The nature and content of the imagery appropriate for each stage of the process depends on the nature of the test, the economics of collection, and the availability of techniques to extract the desired information from the data. At the earliest lifecycle stages, Concept and Technology Development (CTD) and System Development and Demonstration (SDD), the test set could include simulated imagery to demonstrate the mathematical and engineering concepts being implemented thus allowing demonstration of compliance, in part, through simulation. For Initial Operational Test and Evaluation (IOT&E), imagery collected from precisely instrumented test ranges and targets of opportunity consisting of a priori or a posteriori ground-truthed cultural and natural features are of value to the analysis of product quality compliance. Regular monitoring of image quality is possible using operational imagery and automated metrics; more precise measurements can be performed with imagery of instrumented scenes, when available. A survey of image quality measurement techniques is presented along with a discussion of the challenges of managing an airborne SAR program with the scarce resources of time, money, and ground-truthed data. Recommendations are provided that should allow an improvement in the product quality specification and maintenance process with a minimal increase in resource demands on the customer, the vendor, the operational personnel, and the asset itself.

  14. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data generally is generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications for the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there is no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We would like to show the scheme of scribe frame data verification using DRC that we tried to apply. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to the conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.

  15. Verification Test of Automated Robotic Assembly of Space Truss Structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  16. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formalized description method for the protocol. Combined with the model checker SPIN, the method can conveniently verify the properties of the protocol. By applying several model-simplification strategies, this paper models several protocols efficiently and reduces the state space of the model. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective, and it is useful for other authentication protocols.

  17. Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets

    NASA Technical Reports Server (NTRS)

    Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.

    1978-01-01

    A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to automatically adjust the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
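    The parameter-adjustment idea, using measured-minus-predicted response differences to tune key model parameters, can be sketched with a generic least-squares fit. The single-degree-of-freedom damped oscillator below merely stands in for the hydraulic-mechanical model, and all values are synthetic.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

def predicted_response(params, t):
    """First-order form x' = f(x) for a damped oscillator (illustrative model only)."""
    k, c = params
    def rhs(x, _t):
        return [x[1], -k * x[0] - c * x[1]]
    return odeint(rhs, [1.0, 0.0], t)[:, 0]

# Synthetic "measured" data generated from known parameters, for demonstration.
t = np.linspace(0.0, 10.0, 200)
measured = predicted_response([4.0, 0.3], t) + 0.01 * np.random.default_rng(1).standard_normal(t.size)

# Adjust the key parameters so as to minimize response differences.
result = least_squares(lambda p: predicted_response(p, t) - measured, x0=[1.0, 1.0])
print("estimated stiffness and damping:", result.x)
```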

  18. Model Checking for Verification of Interactive Health IT Systems

    PubMed Central

    Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui

    2015-01-01

    Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We demonstrate the method on a patient contact system, showing that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166

  19. Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks

    PubMed Central

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    Background The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. Methodology/Principal Findings To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~50%) and latency of the optimized engineered gene networks. Conclusions/Significance Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems. PMID:21850228

  20. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    PubMed

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.

  1. Verification of false-positive blood culture results generated by the BACTEC 9000 series by eubacterial 16S rDNA and panfungal 18S rDNA directed polymerase chain reaction (PCR).

    PubMed

    Daxboeck, Florian; Dornbusch, Hans Jürgen; Krause, Robert; Assadian, Ojan; Wenisch, Christoph

    2004-01-01

    A small but significant proportion of blood cultures processed by the BACTEC 9000 series systems is signaled positive, while subsequent Gram's stain and culture on solid media yield no pathogens. In this study, 15 "false-positive" vials (7 aerobes, 8 anaerobes) from 15 patients were investigated for the presence of bacteria and fungi by eubacterial 16S rDNA and panfungal 18S rDNA amplification, respectively. All samples turned out negative by both methods. Most patients (7) had neutropenia, which does not support the theory that high leukocyte counts enhance the generation of false-positive results. In conclusion, the results of this study indicate that false-positive results generated by the BACTEC 9000 series are inherent to the automated detection and not due to the growth of fastidious organisms.

  2. Pattern matching techniques for correcting low-confidence OCR words in a known context

    NASA Astrophysics Data System (ADS)

    Ford, Glenn; Hauser, Susan E.; Le, Daniel X.; Thoma, George R.

    2000-12-01

    A commercial OCR system is a key component of a system developed at the National Library of Medicine for the automated extraction of bibliographic fields from biomedical journals. This 5-engine OCR system, while exhibiting high performance overall, does not reliably convert very small characters, especially those that are in italics. As a result, the 'affiliations' field, which typically contains such characters in most journals, is not captured accurately and requires a disproportionately high manual input. To correct this problem, dictionaries have been created from words occurring in this field (e.g., university, department, street addresses, names of cities, etc.) from 230,000 articles already processed. The OCR output corresponding to the affiliation field is then matched against these dictionary entries by approximate string-matching techniques, and the ranked matches are presented to operators for verification. This paper outlines the techniques employed and the results of a comparative evaluation.
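    A minimal sketch of dictionary-based correction of a low-confidence OCR word, using Python's standard difflib for approximate matching and ranking; the dictionary entries and OCR strings are invented, and the production system's matching algorithm may well differ.

```python
import difflib

# Hypothetical affiliation-field dictionary built from previously processed articles.
dictionary = ["University", "Department", "Laboratory", "Institute", "Bethesda", "Maryland"]

def correct_word(ocr_word, cutoff=0.6, n=3):
    """Return ranked candidate corrections for a low-confidence OCR word."""
    return difflib.get_close_matches(ocr_word, dictionary, n=n, cutoff=cutoff)

print(correct_word("Universitv"))   # e.g. ['University']
print(correct_word("Depariment"))   # e.g. ['Department']
```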

  3. Use of Unstructured Event-Based Reports for Global Infectious Disease Surveillance

    PubMed Central

    Blench, Michael; Tolentino, Herman; Freifeld, Clark C.; Mandl, Kenneth D.; Mawudeku, Abla; Eysenbach, Gunther; Brownstein, John S.

    2009-01-01

    Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting. PMID:19402953

  4. Formal Multilevel Hierarchical Verification of Synchronous MOS VLSI Circuits.

    DTIC Science & Technology

    1987-06-01

    (Table-of-contents fragments: 12.4 Capacitance Coupling; 12.5 Multiple Abstraction Functions.) ...depend on whether it is performing flat verification or hierarchical verification. The primary operations of Silica Pithecus when performing flat... signals never arise. The primary operation of Silica Pithecus when performing hierarchical verification is processing constraints to show they hold

  5. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE PAGES

    Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...

    2017-08-26

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  6. An unattended verification station for UF 6 cylinders: Field trial findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. E.; Miller, K. A.; McDonald, B. S.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.

  7. Process improvement of pap smear tracking in a women's medicine center clinic in residency training.

    PubMed

    Calhoun, Byron C; Goode, Jeff; Simmons, Kathy

    2011-11-01

    Application of Six-Sigma methodology and Change Acceleration Process (CAP)/Work Out (WO) tools to track pap smear results in an outpatient clinic in a hospital-based residency-training program. Observational study of impact of changes obtained through application of Six-Sigma principles in clinic process with particular attention to prevention of sentinel events. Using cohort analysis and applying Six-Sigma principles to an interactive electronic medical record Soarian workflow engine, we designed a system of timely accession and reporting of pap smear and pathology results. We compared manual processes from January 1, 2007 to February 28, 2008 to automated processes from March 1, 2008 to December 31, 2009. Using the Six-Sigma principles, CAP/WO tools, including "voice of the customer" and team focused approach, no outlier events went untracked. Applying the Soarian workflow engine to track prescribed 7 day turnaround time for completion, we identified 148 pap results in 3,936, 3 non-gynecological results in 15, and 41 surgical results in 246. We applied Six-Sigma principles to an outpatient clinic facilitating an interdisciplinary team approach to improve the clinic's reporting system. Through focused problem assessment, verification of process, and validation of outcomes, we improved patient care for pap smears and critical pathology. © 2011 National Association for Healthcare Quality.

  8. Automation of the Environmental Control and Life Support System

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, J. Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective includes capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: 1) analyze and document the baselined ECLS system, 2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline, and 3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary in achieving minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle in which automated knowledge acquisition, engineering, verification, and testing tools will be used in the development of the software; in this way we prepare for Phases II and III, the development and integration phases, respectively. These tools can capture ECLSS development knowledge for future use, support development of more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.

  9. Data Quality Verification at STScI - Automated Assessment and Your Data

    NASA Astrophysics Data System (ADS)

    Dempsey, R.; Swade, D.; Scott, J.; Hamilton, F.; Holm, A.

    1996-12-01

    As satellite based observatories improve their ability to deliver wider varieties and more complex types of scientific data, so too does the process of analyzing and reducing these data. It becomes correspondingly imperative that Guest Observers or Archival Researchers have access to an accurate, consistent, and easily understandable summary of the quality of their data. Previously, at the STScI, an astronomer would display and examine the quality and scientific usefulness of every single observation obtained with HST. Recently, this process has undergone a major reorganization at the Institute. A major part of the new process is that the majority of data are assessed automatically with little or no human intervention. As part of routine processing in the OSS--PODPS Unified System (OPUS), the Observatory Monitoring System (OMS) observation logs, the science processing trailer file (also known as the TRL file), and the science data headers are inspected by an automated tool, AUTO_DQ. AUTO_DQ then determines if any anomalous events occurred during the observation or through processing and calibration of the data that affect the procedural quality of the data. The results are placed directly into the Procedural Data Quality (PDQ) file as a string of predefined data quality keywords and comments. These in turn are used by the Contact Scientist (CS) to check the scientific usefulness of the observations. In this manner, the telemetry stream is checked for known problems such as losses of lock, re-centerings, or degraded guiding, for example, while missing data or calibration errors are also easily flagged. If the problem is serious, the data are then queued for manual inspection by an astronomer. The success of every target acquisition is verified manually. If serious failures are confirmed, the PI and the scheduling staff are notified so that options concerning rescheduling the observations can be explored.
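    The automated assessment step can be pictured as a scan of observation logs against known anomaly signatures, each mapped to a quality keyword. The signatures and keyword vocabulary below are purely illustrative and are not the actual AUTO_DQ rules or PDQ keyword set.

```python
# Illustrative anomaly signatures mapped to data-quality keywords (invented).
SIGNATURES = {
    "loss of lock":     "GSFAIL",
    "recentering":      "RECENTER",
    "degraded guiding": "GUIDE_DEGRADED",
    "missing packets":  "DATALOSS",
}

def assess(log_text: str) -> list[str]:
    """Return quality keywords triggered by anomalies found in an observation log."""
    text = log_text.lower()
    return [kw for phrase, kw in SIGNATURES.items() if phrase in text]

log = "Guide star acquisition OK. Recentering event at 12:03. Missing packets in segment 4."
flags = assess(log)
print(flags or ["OK"])   # such flags would be written to a quality summary file
```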

  10. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), Triple Modular Redundancy (TMR), verification, trust, reliability.
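    To make the property being verified concrete, the sketch below exhaustively checks that a bitwise majority voter masks any single corrupted replica; this illustrates what correct TMR insertion must guarantee, not the authors' search-detect-and-verify tool.

```python
def voter(a: int, b: int, c: int) -> int:
    """Bitwise majority of three replicated module outputs."""
    return (a & b) | (a & c) | (b & c)

def masks_single_faults(width: int = 2) -> bool:
    """Exhaustively verify that corrupting any single replica never changes the voted output."""
    for value in range(2 ** width):            # the correct module output
        for faulty in range(2 ** width):       # an arbitrary corrupted replica value
            for which in range(3):             # which of the three replicas is faulty
                replicas = [value, value, value]
                replicas[which] = faulty
                if voter(*replicas) != value:
                    return False
    return True

print(masks_single_faults())   # True: a single faulty replica is always outvoted
```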

  11. How smart is your BEOL? productivity improvement through intelligent automation

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony

    2017-07-01

    The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps, which are inspection, disposition, photomask repair, and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced operators are and how well the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow the customers to use tailored solutions. To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to necessary measurement and production steps. At the same time the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources due to the presence of process steps that are very crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist, and a quantification of the benefits to a mask shop with full automation, by the use of a back end of line model.

  12. 49 CFR 40.135 - What does the MRO tell the employee at the beginning of the verification interview?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beginning of the verification interview? 40.135 Section 40.135 Transportation Office of the Secretary of... verification interview? (a) As the MRO, you must tell the employee that the laboratory has determined that the... finding of adulteration or substitution. (b) You must explain the verification interview process to the...

  13. Nodal network generator for CAVE3

    NASA Technical Reports Server (NTRS)

    Palmieri, J. V.; Rathjen, K. A.

    1982-01-01

    A new extension of CAVE3 code was developed that automates the creation of a finite difference math model in digital form ready for input to the CAVE3 code. The new software, Nodal Network Generator, is broken into two segments. One segment generates the model geometry using a Tektronix Tablet Digitizer and the other generates the actual finite difference model and allows for graphic verification using Tektronix 4014 Graphic Scope. Use of the Nodal Network Generator is described.

  14. Comparative verification between GEM model and official aviation terminal forecasts

    NASA Technical Reports Server (NTRS)

    Miller, Robert G.

    1988-01-01

    The Generalized Exponential Markov (GEM) model uses the local standard airways observation (SAO) to predict hour-by-hour the following elements: temperature, pressure, dew point depression, first and second cloud-layer height and amount, ceiling, total cloud amount, visibility, wind, and present weather conditions. GEM is superior to persistence at all projections for all elements in a large independent sample. A minute-by-minute GEM forecasting system utilizing the Automated Weather Observation System (AWOS) is under development.

  15. Automated radiotherapy treatment plan integrity verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang Deshan; Moore, Kevin L.

    2012-03-15

    Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
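    A generic flavor of such rule-based plan checks, with the results rendered to a small HTML report, is sketched below. The plan fields, limits, and rules are invented and are unrelated to the PINNACLE scripts and PERL routines described above.

```python
# Hypothetical plan data as it might be collected at run time.
plan = {"prescription_dose_gy": 60.0, "fractions": 30, "couch_angle_deg": 0.0}

# Each rule returns (passed, detail); the limits here are illustrative only.
rules = [
    ("Dose per fraction <= 3 Gy",
     lambda p: (p["prescription_dose_gy"] / p["fractions"] <= 3.0,
                f"{p['prescription_dose_gy'] / p['fractions']:.2f} Gy/fraction")),
    ("Couch angle is zero for non-SRS plans",
     lambda p: (p["couch_angle_deg"] == 0.0, f"couch = {p['couch_angle_deg']} deg")),
]

rows = []
for name, rule in rules:
    passed, detail = rule(plan)
    rows.append(f"<tr><td>{name}</td><td>{'PASS' if passed else 'FAIL'}</td><td>{detail}</td></tr>")

html = "<table border='1'><tr><th>Check</th><th>Result</th><th>Detail</th></tr>" + "".join(rows) + "</table>"
with open("plan_qc_report.html", "w") as fh:
    fh.write(html)
print("report written,", sum("PASS" in r for r in rows), "checks passing")
```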

  16. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  17. Low Cost Structures, but How Much are we Paying for Them?

    NASA Astrophysics Data System (ADS)

    Gomez Molinero, Vincent

    2014-06-01

    Based on more than 37 years developing spacecraft structures, both for launchers (starting with Ariane-1 up to the most modern ones) and for satellites of any type, a critical review of the current trends, aimed especially at low cost solutions, will be presented. Airbus Defence and Space (previously CASA Espacio) has been developing structures for launchers and satellites for more than four decades. All types of spacecraft structures, primary and secondary ones, high stability ones and special critical cases like antenna reflectors, high stiffness structures and load carrying ones, have been developed using different types of materials and structural constructions. Although our main expertise is concentrated on composite structures, we have also developed many types of metallic ones when that was the best solution, not necessarily based on purely technical reasons. From that perspective and experience, this paper reviews the current trend of imposing low cost as the main requirement for the development of satellites and launchers, and its intrinsic characteristic of being a never-ending process: the spacecraft structures are never sufficiently cheap. The main ways used today to justify low cost spacecraft structures will be reviewed, trying to understand their rationale and some prejudices always present when the trade-off studies are performed. Some of the reviewed cost-killing factors (a non-exhaustive list) are: material type (e.g., metallic vs. composite), low cost materials in general, manufacturing process (e.g., autoclave vs. out-of-autoclave curing), automation in manufacturing, automation in assembly, automation in inspection and verification, lean manufacturing techniques, and standardization. Some insight into how to solve this problem without losing our distinctive nature (we develop high performance systems, many of them unique prototypes, intended to work in environments that are not perfectly known, and in some cases highly unknown) will be provided from the author's point of view.

  18. Generic Protocol for the Verification of Ballast Water Treatment Technology. Version 5.1

    DTIC Science & Technology

    2010-09-01

    (Table-of-contents fragments: the Protocol; 1.4 Verification Testing Process; Volumes, Containers and Processing; Table 10. Recommendation for Water.) ...or persistent distortion of a measurement process that causes errors in one direction. Challenge Water: Water supplied to a treatment system under

  19. An improved automated procedure for informal and temporary dwellings detection and enumeration, using mathematical morphology operators on VHR satellite data

    NASA Astrophysics Data System (ADS)

    Jenerowicz, Małgorzata; Kemper, Thomas

    2016-10-01

    Every year thousands of people are displaced by conflicts or natural disasters and often gather in large camps. Knowing how many people have gathered is crucial for an efficient relief operation. However, it is often difficult to collect exact information on the total number of the population. This paper presents an improved morphological methodology for the estimation of dwelling structures located in several Internally Displaced Persons (IDP) camps, based on Very High Resolution (VHR) multispectral satellite imagery with pixel sizes of 1 meter or less, including GeoEye-1, WorldView-2, QuickBird-2, Ikonos-2, Pléiades-A and Pléiades-B. The main topic of this paper is the enhancement of the approach through the selection of the feature extraction algorithm and the improvement and automation of pre-processing and results verification. For the extraction of informal and temporary dwellings, high data quality has to be ensured; the pre-processing has therefore been extended to include input data hierarchy level assignment and data fusion method selection and evaluation. The feature extraction algorithm follows the procedure presented in Jenerowicz, M., Kemper, T., 2011. Optical data are analysed in a cyclic approach comprising image segmentation and geometrical, textural and spectral class modeling aiming at camp area identification. The successive steps of morphological processing have been combined in one stand-alone application for automatic dwelling detection and enumeration. Actively implemented, these approaches can provide reliable and consistent results, independent of the imaging satellite type and different study site locations, providing decision support in emergency response for the humanitarian community, such as the United Nations, the European Union and Non-Governmental relief organizations.
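    The core morphological step, cleaning a binary detection mask with an opening and then counting the remaining connected structures, can be sketched with scipy.ndimage. The synthetic mask stands in for a classified VHR scene; the real workflow adds the segmentation and the geometrical, textural and spectral modeling described above.

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic binary mask: True where pixels were classified as dwelling material.
rng = np.random.default_rng(0)
mask = rng.random((200, 200)) > 0.985          # isolated speckle noise
mask[20:26, 30:37] = True                      # a few block-shaped "dwellings"
mask[100:105, 120:126] = True
mask[150:157, 40:46] = True

# Morphological opening removes structures smaller than the structuring element.
structure = np.ones((3, 3), dtype=bool)
opened = ndi.binary_opening(mask, structure=structure)

# Label and count the remaining connected structures.
labels, n_structures = ndi.label(opened)
sizes = ndi.sum_labels(opened, labels, index=range(1, n_structures + 1))
print(f"{n_structures} candidate dwellings, pixel areas: {sizes.astype(int)}")
```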

  20. Separating stages of arithmetic verification: An ERP study with a novel paradigm.

    PubMed

    Avancini, Chiara; Soltész, Fruzsina; Szűcs, Dénes

    2015-08-01

    In studies of arithmetic verification, participants typically encounter two operands and they carry out an operation on these (e.g. adding them). Operands are followed by a proposed answer and participants decide whether this answer is correct or incorrect. However, interpretation of results is difficult because multiple parallel, temporally overlapping numerical and non-numerical processes of the human brain may contribute to task execution. In order to overcome this problem here we used a novel paradigm specifically designed to tease apart the overlapping cognitive processes active during arithmetic verification. Specifically, we aimed to separate effects related to detection of arithmetic correctness, detection of the violation of strategic expectations, detection of physical stimulus properties mismatch and numerical magnitude comparison (numerical distance effects). Arithmetic correctness, physical stimulus properties and magnitude information were not task-relevant properties of the stimuli. We distinguished between a series of temporally highly overlapping cognitive processes which in turn elicited overlapping ERP effects with distinct scalp topographies. We suggest that arithmetic verification relies on two major temporal phases which include parallel running processes. Our paradigm offers a new method for investigating specific arithmetic verification processes in detail. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Monitoring/Verification using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan Weeks, Kevin Kyle, Manuel Manard

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  2. Monitoring/Verification Using DMS: TATP Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Kyle; Stephan Weeks

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. The feasibility of developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling approach to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. GC is the leading analytical method for the separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-rad ionization source, for peroxide-based explosive measurements.

  3. Usefulness of biological fingerprint in magnetic resonance imaging for patient verification.

    PubMed

    Ueda, Yasuyuki; Morishita, Junji; Kudomi, Shohei; Ueda, Katsuhiko

    2016-09-01

    The purpose of our study is to investigate the feasibility of automated patient verification using multi-planar reconstruction (MPR) images generated from three-dimensional magnetic resonance (MR) imaging of the brain. Several anatomy-related MPR images generated from the three-dimensional fast scout scan of each MR examination were used as biological fingerprint images in this study. The database for this study consisted of 730 temporal pairs of MR examinations of the brain. We calculated the correlation value between current and prior biological fingerprint images of the same patient, and also for all combinations of two images from different patients, to evaluate the effectiveness of our method for patient verification. The best performance of our system was as follows: a half-total error rate of 1.59 % with a false acceptance rate of 0.023 % and a false rejection rate of 3.15 %, an equal error rate of 1.37 %, and a rank-one identification rate of 98.6 %. Our method makes it possible to verify the identity of the patient using only existing medical images, without additional equipment. Our method will also contribute to the management of patient misidentification errors caused by human error.
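
    A minimal sketch of the correlation-based matching step described above, assuming two registered MPR slices of equal shape as NumPy arrays; the decision threshold is an illustrative assumption, to be tuned against the false acceptance and false rejection rates reported in such a study.

        # Minimal sketch: verify patient identity by correlating current and prior images.
        # The 0.85 threshold is an illustrative assumption, not the study's operating point.
        import numpy as np

        def correlation_value(current, prior):
            # Pearson correlation between two images of equal shape.
            a = current.astype(float).ravel() - current.mean()
            b = prior.astype(float).ravel() - prior.mean()
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def same_patient(current, prior, threshold=0.85):
            return correlation_value(current, prior) >= threshold

        rng = np.random.default_rng(0)
        prior = rng.random((64, 64))
        current = prior + 0.05 * rng.random((64, 64))   # same patient, slight change
        print(same_patient(current, prior))              # True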

  4. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  5. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the site-specific application programs, run timers, read inputs, drive outputs, perform self... validation process is to determine “whether the correct product was built.” Verification means the process of... established at the start of that phase. The goal of the verification process is to determine “whether the...

  6. 48 CFR 4.1300 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATIVE MATTERS Personal Identity Verification 4.1300 Scope of subpart. This subpart provides policy and procedures associated with Personal Identity Verification as required by— (a) Federal Information Processing Standards Publication (FIPS PUB) Number 201, “Personal Identity Verification of Federal Employees and...

  7. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  8. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.

  9. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and temporal-logic statements to be checked and verified regarding these specifications, making the verification process faster and cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.

  10. Software for Better Documentation of Other Software

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.

  11. An Efficient Method for Verifying Gyrokinetic Microstability Codes

    NASA Astrophysics Data System (ADS)

    Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.

    2009-11-01

    Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
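
    A minimal sketch of the automated benchmarking loop described above. The helper functions below are stand-in stubs, not the authors' GYRO/GS2 translation or submission routines; a real workflow would read TRANSP/ONETWO analyses, write code-specific input files and submit batch jobs in their place.

        # Minimal sketch: run each analysis case through several codes and tabulate results.
        import csv

        def extract_parameters(case):
            """Stub standing in for reading a TRANSP/ONETWO analysis for one case."""
            return {"case": case}

        def run_code(code, params):
            """Stub standing in for writing code-specific input, running it, and parsing a growth rate."""
            return 0.0

        def benchmark(cases, codes=("GYRO", "GS2")):
            rows = []
            for case in cases:                        # e.g. (discharge, time, radius) tuples
                params = extract_parameters(case)
                row = {"case": str(case)}
                for code in codes:
                    row[code] = run_code(code, params)
                rows.append(row)
            with open("comparison.csv", "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=["case", *codes])
                writer.writeheader()
                writer.writerows(rows)
            return rows

        print(benchmark([("DIII-D discharge", 2.5, 0.6)]))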

  12. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  13. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  14. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
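
    The comparison step of such a QA test can be reduced to an error-norm check against the closed-form solution. The sketch below is an illustrative stand-in, not PFLOTRAN's test harness; the tolerance and the analytical profile are placeholder assumptions.

        # Minimal sketch: pass/fail code-verification check against an analytical solution.
        import numpy as np

        def verify_against_analytical(numerical, analytical, rel_tol=1e-3):
            """Return (passed, error) based on a relative L2 error norm."""
            num = np.asarray(numerical, dtype=float)
            ana = np.asarray(analytical, dtype=float)
            err = np.linalg.norm(num - ana) / (np.linalg.norm(ana) + 1e-30)
            return err <= rel_tol, err

        x = np.linspace(0.0, 1.0, 101)
        analytical = np.exp(-x)           # stand-in closed-form profile
        numerical = analytical + 1e-5     # stand-in simulator output
        print(verify_against_analytical(numerical, analytical))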

  15. Automated Discovery of Simulation Between Programs

    DTIC Science & Technology

    2014-10-18

    Excerpted fragments from the report: simulation relations enable the refinement step of SimAbs. SimAbs and AE-VAL were implemented on top of the UFO framework [1, 15] and the Z3 SMT solver [8], respectively, and SimAbs was evaluated on the Software Verification Competition (SVCOMP’14) benchmarks.

  16. JOVIAL J73 Automated Verification System - Study Phase

    DTIC Science & Technology

    1980-08-01

    Excerpted fragments from the report: capabilities for the tool and its high-level design are described, and future capabilities are identified. The table of contents covers implemented test tools, a functional description of J73AVS (summary of capabilities and J73AVS operation), and the design of J73AVS. Both JOVIAL languages are primarily designed for command and control system programming and are especially well suited to large systems requiring ...

  17. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  18. Combining real-time monitoring and knowledge-based analysis in MARVEL

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.

    1993-01-01

    Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.

  19. A front-end automation tool supporting design, verification and reuse of SOC.

    PubMed

    Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing

    2004-09-01

    This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors show that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and significantly reduce the bug rate. This tool can be used to enhance the reusability of an intellectual property model and to facilitate porting a design to different platforms.

  20. A Perspective on Computational Human Performance Models as Design Tools

    NASA Technical Reports Server (NTRS)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  1. Rapid-viability PCR method for detection of live, virulent Bacillus anthracis in environmental samples.

    PubMed

    Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R

    2011-09-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
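
    The decision logic of RV-PCR can be summarized as a change-in-cycle-threshold test. The sketch below illustrates the idea only; the delta-Ct cutoff is a placeholder assumption, not the validated criterion from the study.

        # Minimal sketch: infer viability from the drop in PCR cycle threshold after incubation.
        def rv_pcr_live(ct_before, ct_after, delta_ct_cutoff=6.0):
            # Live, growing organisms increase target DNA, so Ct after incubation drops.
            return (ct_before - ct_after) >= delta_ct_cutoff

        print(rv_pcr_live(ct_before=35.2, ct_after=24.8))   # True: large Ct drop suggests growth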

  2. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewall, active directory, and IDS are some examples of these components. Enforcement of network security policies to low level security mechanisms faces some essential difficulties. Consistency, verification, and maintenance are the major ones among these difficulties. One approach to overcome these difficulties is to automate the process of translation of a high level security policy into low level security mechanisms. This paper introduces a framework of an automation process that translates a high level security policy into low level security mechanisms. The framework is described in terms of three phases: in the first phase all network assets are categorized according to their roles in network security and the relations between them are identified to constitute the network security model. This proposed model is based on organization based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies, such as auditing policy. In addition, the proposed model enables matching of each rule of the high level security policy with the corresponding rules of the low level security policy. In the second phase of the proposed framework, the high level security policy is mapped into the network security model. The second phase can be considered a translation of the high level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.

  3. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  4. Oak ridge national laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostick, Debra A.; Hexel, Cole R.; Ticknor, Brian W.

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs. automated chemical separations showed a significant decrease in hands-on time from 9.8 hours to 35 minutes for seven samples, respectively. This documented time savings and reduced labor translates to a significant cost savings per sample. Overall, the system will enable faster sample reporting times at reduced costs by limiting personnel hours dedicated to the chemical separation.

  5. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied for arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally-occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations and experiments, using fission-spectrum neutron sources to assess neutron transmission through composite low-Z attenuators.
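
    The transmission idea underlying HRFNS follows the usual exponential attenuation law, T(E) = exp(-n · sigma(E) · t). The single-energy-group sketch below is illustrative only; the number density, cross section and thickness are placeholder values, not data from the paper.

        # Minimal sketch: neutron transmission through a slab for one energy group.
        import math

        def transmission(number_density_per_cm3, sigma_barns, thickness_cm):
            sigma_cm2 = sigma_barns * 1e-24     # 1 barn = 1e-24 cm^2
            return math.exp(-number_density_per_cm3 * sigma_cm2 * thickness_cm)

        # Illustrative carbon-like attenuator, 2-barn cross section, 5 cm thick
        print(transmission(number_density_per_cm3=1.1e23, sigma_barns=2.0, thickness_cm=5.0))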

  6. Automated Test Environment for a Real-Time Control System

    NASA Technical Reports Server (NTRS)

    Hall, Ronald O.

    1994-01-01

    An automated environment with hardware-in-the-loop has been developed by Rocketdyne Huntsville for test of a real-time control system. The target system of application is the man-rated real-time system which controls the Space Shuttle Main Engines (SSME). The primary use of the environment is software verification and validation, but it is also useful for evaluation and analysis of SSME avionics hardware and mathematical engine models. It provides a test bed for the integration of software and hardware. The principles and skills upon which it operates may be applied to other target systems, such as those requiring hardware-in-the-loop simulation and control system development. Potential applications are in problem domains demanding highly reliable software systems requiring testing to formal requirements and verifying successful transition to/from off-nominal system states.

  7. An Overview of SAL

    NASA Technical Reports Server (NTRS)

    Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.

    2000-01-01

    To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PVS, and some preliminary experience of their use.

  8. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them with autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
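
    The criteria listed above can be expressed as simple release rules. The sketch below shows one illustrative autoverification rule combining verification limits and a delta check; the limits and delta tolerance are placeholder assumptions, not values from the survey.

        # Minimal sketch: release a result automatically only if all rules pass.
        VERIFICATION_LIMITS = {          # analyte: (low, high); units are illustrative
            "glucose": (2.0, 25.0),      # mmol/L
            "potassium": (2.5, 6.5),     # mmol/L
        }

        def autoverify(analyte, value, previous=None, delta_fraction=0.5):
            low, high = VERIFICATION_LIMITS[analyte]
            if not (low <= value <= high):
                return False             # outside verification limits: hold for manual review
            if previous is not None and previous > 0:
                if abs(value - previous) / previous > delta_fraction:
                    return False         # delta check failed: hold for manual review
            return True

        print(autoverify("potassium", 4.1, previous=4.0))   # True: released automatically
        print(autoverify("potassium", 6.9))                  # False: manual verification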

  9. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.

  10. Automatic specification of reliability models for fault-tolerant computers

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1993-01-01

    The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.

  11. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
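
    In the same spirit as the rule systems above (though not the actual RuleR or LogScope syntax), a single trace rule can be checked with a few lines of code: here, every COMMAND event must eventually be followed by a matching SUCCESS event before the trace ends.

        # Minimal sketch: offline check of one temporal rule over an event trace.
        def check_command_success(trace):
            """Return the commands that were never acknowledged with SUCCESS."""
            pending = set()
            for event, name in trace:
                if event == "COMMAND":
                    pending.add(name)
                elif event == "SUCCESS":
                    pending.discard(name)
            return sorted(pending)

        log = [("COMMAND", "open_valve"), ("SUCCESS", "open_valve"), ("COMMAND", "start_pump")]
        print(check_command_success(log))   # ['start_pump'] -> rule violated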

  12. Generating Phenotypical Erroneous Human Behavior to Evaluate Human-automation Interaction Using Model Checking

    PubMed Central

    Bolton, Matthew L.; Bass, Ellen J.; Siminiceanu, Radu I.

    2012-01-01

    Breakdowns in complex systems often occur as a result of system elements interacting in unanticipated ways. In systems with human operators, human-automation interaction associated with both normative and erroneous human behavior can contribute to such failures. Model-driven design and analysis techniques provide engineers with formal methods tools and techniques capable of evaluating how human behavior can contribute to system failures. This paper presents a novel method for automatically generating task analytic models encompassing both normative and erroneous human behavior from normative task models. The generated erroneous behavior is capable of replicating Hollnagel’s zero-order phenotypes of erroneous action for omissions, jumps, repetitions, and intrusions. Multiple phenotypical acts can occur in sequence, thus allowing for the generation of higher order phenotypes. The task behavior model pattern capable of generating erroneous behavior can be integrated into a formal system model so that system safety properties can be formally verified with a model checker. This allows analysts to prove that a human-automation interactive system (as represented by the model) will or will not satisfy safety properties with both normative and generated erroneous human behavior. We present benchmarks related to the size of the statespace and verification time of models to show how the erroneous human behavior generation process scales. We demonstrate the method with a case study: the operation of a radiation therapy machine. A potential problem resulting from a generated erroneous human action is discovered. A design intervention is presented which prevents this problem from occurring. We discuss how our method could be used to evaluate larger applications and recommend future paths of development. PMID:23105914
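
    The zero-order phenotypes named above can be generated mechanically from a normative action sequence. The sketch below is an illustrative stand-in, not the authors' formal task-model transformation; the action names are hypothetical.

        # Minimal sketch: generate erroneous variants of a normative action sequence.
        def omissions(actions):
            return [actions[:i] + actions[i + 1:] for i in range(len(actions))]

        def repetitions(actions):
            return [actions[:i + 1] + actions[i:] for i in range(len(actions))]

        def intrusions(actions, foreign):
            return [actions[:i] + [foreign] + actions[i:] for i in range(len(actions) + 1)]

        normative = ["select_dose", "confirm_dose", "start_beam"]
        print(omissions(normative)[0])     # ['confirm_dose', 'start_beam']
        print(repetitions(normative)[2])   # last action repeated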

  13. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solution (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of solvers for the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check the individual portions of the code. Example checks start with a simple case of unidirectional advection, move to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are also designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into ADR solver capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
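
    For the mesh-convergence analyses mentioned above, the observed order of accuracy can be computed from error norms on successive grid refinements. The sketch below is a generic illustration with made-up numbers, not part of the described test suite.

        # Minimal sketch: observed order of accuracy from two grid levels.
        import math

        def observed_order(h_coarse, h_fine, err_coarse, err_fine):
            return math.log(err_coarse / err_fine) / math.log(h_coarse / h_fine)

        # Halving the grid spacing reduces the error roughly fourfold: ~second order.
        print(observed_order(0.1, 0.05, 4.0e-3, 1.02e-3))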

  14. Response to "Improving Patient Safety With Error Identification in Chemotherapy Orders by Verification Nurses".

    PubMed

    Zhu, Ling-Ling; Lv, Na; Zhou, Quan

    2016-12-01

    We read, with great interest, the study by Baldwin and Rodriguez (2016), which described the role of the verification nurse and details the verification process in identifying errors related to chemotherapy orders. We strongly agree with their findings that a verification nurse, collaborating closely with the prescribing physician, pharmacist, and treating nurse, can better identify errors and maintain safety during chemotherapy administration.

  15. Process Document for the joint ETV/NOWATECH verification of the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    Nordic Water Technology Verification Center’s (NOWATECH) DHI Water Monitoring Center (DHI WMC), a pilot Environmental Technology Verification (ETV) program in the European Union, and the United States Environmental Protection Agency ETV (US EPA ETV) program’s Advanced Monitoring ...

  16. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  17. The All Sky Automated Survey

    NASA Astrophysics Data System (ADS)

    Pojmański, G.

    2004-10-01

    The All Sky Automated Survey is a low cost project, the ultimate goal of which is detection and investigation of any kind of photometric variability present all over the sky. The current system consists of 4 instruments covering 36x36, 9x9 (2 units) and 2x2 degrees, equipped with 2Kx2K CCDs, V, R, I standard filters and custom made automated mounts. All are working at Las Campanas Observatory, Chile in fully automated mode. In the ASAS-3 phase of the project we have been taking data at a rate of 1 measurement per 1-3 days for all available objects brighter than V=14, located south of δ=+28 deg. So far over 2 TB of images has been collected and analyzed, leading to a photometric light curve catalog of over 10 million sources. A preliminary search for variability revealed over 40,000 bright, variable sources (over 75 % of them not previously known). Direct access to the data is available over the Internet: http://www.astrouw.edu.pl/~gp/asas. At present the ASAS Alert System is being tested. Events such as outbursts of CVs or novae, eclipses, etc. are reported within a few minutes after first detection. Due to the large number of artifacts in these data, raw events require verification, which can take up to 24 hours.

  18. AutoMOPS- B2B and B2C in mask making: Mask manufacturing performance and customer satisfaction improvement through better information flow management using generic models and standardized languages

    NASA Astrophysics Data System (ADS)

    Filies, Olaf; de Ridder, Luc; Rodriguez, Ben; Kujiken, Aart

    2002-03-01

    Semiconductor manufacturing has become a global business in which companies of different sizes unite in virtual enterprises to meet new opportunities. Mask manufacturing is therefore a key business, but mask ordering is a complex process and is always critical for design-to-market time: mask complexity and the customer base are increasing, while a wide variety of different mask order forms are used that are frequently faulty and very seldom complete. This effectively blocks agile manufacturing and can tie wafer fabs to a single mask supplier. The goal of the project is elimination of the order verification step through paperless, electronically linked information sharing and exchange between chip design, mask production and production stages, which will allow automation of mask preparation. To cover these new techniques and their specifications, as well as the common ones, with automated tools, a generic meta-model is generated based on the current standards for mask specifications; drawing on the requirements of the involved partners (Alcatel Microelectronics, Altis, Compugraphics, Infineon, Nimble, Sigma-C), the project works out a pre-normative standard. The paper presents the current status of the work. This work is partly funded by the Commission of the European Union under the Fifth Framework project IST-1999-10332 AutoMOPS.

  19. Automated management of radioactive sources in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.

    2014-09-01

    To use radioactive substances, a facility has to register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their use is the National Center of Radiation Protection (NCRP). This paper describes a system that automates the registration and licensing process of the NCRP. To provide 24×7 access to all NCRP customers, the system is developed as a web-based application offering online registration, license requests and renewals, request status checks, and access to historical data and reports; these and other features are provided as electronic services accessible over the internet. The system was also designed to streamline and optimize internal NCRP operations, in addition to improving ease of access for its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to a manual payment option, the system is integrated with SADAD (an online payment system), avoiding the lengthy and cumbersome procedures associated with the manual payment mechanism. Using the SADAD option, the license fee can be paid over the internet, at an ATM, or at a branch of any designated bank; payment is instantly notified to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie; Orton, Christopher; Schwantes, Jon

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in used nuclear fuel reprocessing facilities to support process verification and validation. The MIP Monitor applies multivariate analysis to gamma spectroscopy of reprocessing streams in order to detect small changes in the gamma spectrum, which may indicate changes in process conditions. This research extends the MIP Monitor by characterizing a used fuel sample after initial dissolution according to the type of reactor of origin (pressurized or boiling water reactor), initial enrichment, burn up, and cooling time. Simulated gamma spectra were used to develop and test three fuel characterization algorithms. The classification and estimation models employed are based on the partial least squares regression (PLS) algorithm. A PLS discriminant analysis model was developed which perfectly classified reactor type. Locally weighted PLS models were fitted on-the-fly to estimate continuous fuel characteristics. Burn up was predicted within 0.1% root mean squared percent error (RMSPE) and both cooling time and initial enrichment within approximately 2% RMSPE. This automated fuel characterization can be used to independently verify operator declarations of used fuel characteristics and inform the MIP Monitor anomaly detection routines at later stages of the fuel reprocessing stream to improve sensitivity to changes in operational parameters and material diversions.
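
    As an illustration of the PLS-based characterization idea (not the authors' locally weighted models or simulated spectra), the sketch below fits a PLS discriminant-analysis model for reactor type and a PLS regression for burnup on random stand-in data using scikit-learn.

        # Minimal sketch: PLS classification and regression on stand-in "spectra".
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        X = rng.random((60, 200))                 # 60 samples x 200 channels (stand-in spectra)
        reactor = rng.integers(0, 2, 60)          # 0 = BWR, 1 = PWR (stand-in labels)
        burnup = 20.0 + 30.0 * rng.random(60)     # stand-in burnup values (GWd/tU)

        # PLS discriminant analysis: regress a 0/1 label and threshold the prediction.
        plsda = PLSRegression(n_components=5).fit(X, reactor.astype(float))
        pred_type = (plsda.predict(X).ravel() > 0.5).astype(int)

        # PLS regression for a continuous characteristic.
        plsr = PLSRegression(n_components=5).fit(X, burnup)
        pred_burnup = plsr.predict(X).ravel()
        print(pred_type[:5], pred_burnup[:3])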

  1. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  2. 49 CFR 40.327 - When must the MRO report medical information gathered in the verification process?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... results and medical information you learned as part of the verification process to third parties without... the course of an accident investigation. (c) If the law of a foreign country (e.g., Canada) prohibits...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT FOR AMMONIA RECOVERY PROCESS

    EPA Science Inventory

    This Technology Verification report describes the nature and scope of an environmental evaluation of ThermoEnergy Corporation’s Ammonia Recovery Process (ARP) system. The information contained in this report represents data that were collected over a 3-month pilot study. The ti...

  4. Evaluation of STAT medication ordering process in a community hospital.

    PubMed

    Abdelaziz, Hani; Richardson, Sandra; Walsh, Kim; Nodzon, Jessica; Schwartz, Barbara

    2016-01-01

    In most health care facilities, problems related to delays in STAT medication order processing time are of common concern. The purpose of this study was to evaluate processing time for STAT orders at Kimball Medical Center. All STAT orders were reviewed to determine processing time; order processing time was also stratified by order entry type (physician-entered (PE) orders vs. non-physician-entered (NPE) orders). Collected data included medication ordered, indication, time ordered, time verified by pharmacist, time sent from pharmacy, and time charted as given to the patient. A total of 502 STAT orders were reviewed and 389 orders were included for analysis. Overall, the median processing time was 29 minutes (IQR 16-63; p<0.0001). The time needed to process NPE orders was significantly less than that needed for PE orders (median 27 vs. 34 minutes; p=0.026). Among NPE orders, the median total time required to process STAT orders for medications available in the Automated Dispensing Devices (ADM) was within 30 minutes, while that required to process orders for medications not available in the ADM was significantly greater than 30 minutes. For PE orders, the median total time required to process orders for medications available in the ADM (i.e., not requiring pharmacy involvement) was significantly greater than 30 minutes (median 34 minutes; p<0.001). We conclude that STAT order processing time may be improved by increasing the availability of medications in the ADM and by pharmacy involvement in the verification process.
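
    The stratified timing summary described above can be reproduced in outline with a few lines of pandas; the column names and example rows below are hypothetical, not the study data.

        # Stratified timing summary with hypothetical columns and rows.
        import pandas as pd

        orders = pd.DataFrame({
            "entry_type":    ["PE", "NPE", "NPE", "PE", "NPE"],
            "in_adm":        [True, True, False, False, True],
            "minutes_total": [34, 22, 47, 61, 28],
        })

        summary = (orders
                   .groupby(["entry_type", "in_adm"])["minutes_total"]
                   .describe(percentiles=[0.25, 0.5, 0.75])[["25%", "50%", "75%"]])
        print(summary)   # median (50%) and IQR bounds per stratum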

  5. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  6. 24 CFR 3286.207 - Process for obtaining installation license.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... installation license must submit verification of the experience required in § 3286.205(a). This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification...

  7. 24 CFR 3286.307 - Process for obtaining trainer's qualification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... verification of the experience required in § 3286.305. This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification at any time. The applicant must...

  8. 7 CFR 1980.353 - Filing and processing applications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... subject to the availability of funds. (15) A copy of a valid verification of income for each adult member... method of verifying information. Verifications must pass directly from the source of information to the Lender and shall not pass through the hands of a third party or applicant. (1) Income verification...

  9. Automation and control of off-planet oxygen production processes

    NASA Technical Reports Server (NTRS)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.

    1990-01-01

    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  10. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs, and should be reviewed by both the supplier and the customer. All involved need to agree upon formal requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods that are used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The option of keeping the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, is also discussed.

  11. Neural network face recognition using wavelets

    NASA Astrophysics Data System (ADS)

    Karunaratne, Passant V.; Jouny, Ismail I.

    1997-04-01

    The recognition of human faces is a phenomenon that has been mastered by the human visual system and that has been researched extensively in the domain of computer neural networks and image processing. This research is involved in the study of neural networks and wavelet image processing techniques in the application of human face recognition. The objective of the system is to acquire a digitized still image of a human face, carry out pre-processing on the image as required, and then, given a prior database of images of possible individuals, be able to recognize the individual in the image. The pre-processing segment of the system includes several procedures, namely image compression, denoising, and feature extraction. The image processing is carried out using Daubechies wavelets. Once the images have been passed through the wavelet-based image processor they can be efficiently analyzed by means of a neural network. A back-propagation neural network is used for the recognition segment of the system. The main constraints of the system concern the characteristics of the images being processed. The system should be able to carry out effective recognition of human faces irrespective of the individual's facial expression, presence of extraneous objects such as head-gear or spectacles, and face/head orientation. A potential application of this face recognition system would be as a secondary verification method in an automated teller machine.
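
    A minimal sketch of the wavelet-then-network pipeline described above follows, using PyWavelets for the Daubechies decomposition and a small back-propagation network from scikit-learn; the image size, wavelet level, network shape, and synthetic faces are assumptions.

        # Daubechies wavelet features followed by a back-propagation classifier;
        # image size, wavelet level, and network shape are assumptions.
        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier

        def wavelet_features(image, wavelet="db4", level=3):
            """Return the coarse approximation coefficients as a flat vector."""
            coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
            return coeffs[0].ravel()

        rng = np.random.default_rng(1)
        faces = rng.random((60, 64, 64))      # stand-in for 60 digitized face images
        labels = np.repeat(np.arange(10), 6)  # 10 hypothetical individuals

        X = np.array([wavelet_features(img) for img in faces])
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, labels)
        print("training accuracy:", clf.score(X, labels))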

  12. A Novel Patient Recruitment Strategy: Patient Selection Directly from the Community through Linkage to Clinical Data.

    PubMed

    Zimmerman, Lindsay P; Goel, Satyender; Sathar, Shazia; Gladfelter, Charon E; Onate, Alejandra; Kane, Lindsey L; Sital, Shelly; Phua, Jasmin; Davis, Paris; Margellos-Anast, Helen; Meltzer, David O; Polonsky, Tamar S; Shah, Raj C; Trick, William E; Ahmad, Faraz S; Kho, Abel N

    2018-01-01

    This article presents and describes our methods in developing a novel strategy for recruitment of underrepresented, community-based participants, for pragmatic research studies leveraging routinely collected electronic health record (EHR) data. We designed a new approach for recruiting eligible patients from the community, while also leveraging affiliated health systems to extract clinical data for community participants. The strategy involves methods for data collection, linkage, and tracking. In this workflow, potential participants are identified in the community and surveyed regarding eligibility. These data are then encrypted and deidentified via a hashing algorithm for linkage of the community participant back to a record at a clinical site. The linkage allows for eligibility verification and automated follow-up. Longitudinal data are collected by querying the EHR data and surveying the community participant directly. We discuss this strategy within the context of two national research projects, a clinical trial and an observational cohort study. The community-based recruitment strategy is a novel, low-touch, clinical trial enrollment method to engage a diverse set of participants. Direct outreach to community participants, while utilizing EHR data for clinical information and follow-up, allows for efficient recruitment and follow-up strategies. This new strategy for recruitment links data reported from community participants to clinical data in the EHR and allows for eligibility verification and automated follow-up. The workflow has the potential to improve recruitment efficiency and engage traditionally underrepresented individuals in research. Schattauer GmbH Stuttgart.
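
    One way to realize the deidentified linkage step described above is a salted one-way hash over normalized identifiers, sketched below; the field choices, normalization rules, and salt handling are illustrative assumptions rather than the project's actual algorithm.

        # Salted one-way hash linkage over normalized identifiers (illustrative).
        import hashlib

        SHARED_SALT = b"site-agreed-secret"   # exchanged out of band in practice

        def linkage_token(first, last, dob):
            normalized = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
            return hashlib.sha256(SHARED_SALT + normalized.encode("utf-8")).hexdigest()

        # The community survey site and the clinical site compute tokens
        # independently; equal tokens indicate the same participant.
        community_token = linkage_token("Ana ", "Diaz", "1970-03-02")
        ehr_token = linkage_token("ana", "DIAZ ", "1970-03-02")
        print(community_token == ehr_token)   # True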

  13. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.

  14. Verification of a rapid mooring and foundation design tool

    DOE PAGES

    Weller, Sam D.; Hardwick, Jon; Gomez, Steven; ...

    2018-02-15

    Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost-effective. In the initial stages of mooring and foundation development a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e. bathymetry, environmental conditions) and device (i.e. mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems. It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.

  15. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S; Dolly, S; Cai, B

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed with C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations through the integration of graphics engines and high dots per inch (DPI) settings on modern high resolution monitors. The industry-standard software design pattern, the Model-View-ViewModel (MVVM) pattern, is chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patients' database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification. Structured with the MVVM pattern, it is highly maintainable and extensible, and supports smooth connections with other clinical software tools.
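
    As a rough illustration of the supervised contour-error detection idea (ACE itself is implemented in C#/.NET with Accord.Net), the Python sketch below derives simple shape features from binary structure masks and trains a classifier on contours labeled acceptable or erroneous; the feature set and the synthetic data are assumptions.

        # Shape features from binary structure masks plus a supervised classifier;
        # the feature set and the synthetic masks/labels are assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def contour_features(mask):
            """Tiny feature set: voxel count and bounding-box extents."""
            voxels = int(mask.sum())
            idx = np.argwhere(mask)
            extents = idx.max(axis=0) - idx.min(axis=0) + 1 if voxels else np.zeros(3)
            return np.array([voxels, *extents], dtype=float)

        rng = np.random.default_rng(2)
        masks = rng.random((40, 16, 16, 16)) > 0.7    # stand-in 3D structure masks
        labels = rng.integers(0, 2, 40)               # 1 = suspected contouring error

        X = np.array([contour_features(m) for m in masks])
        clf = RandomForestClassifier(n_estimators=100).fit(X, labels)
        print("flagged as error:", clf.predict(X[:5]))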

  16. Verification of a rapid mooring and foundation design tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weller, Sam D.; Hardwick, Jon; Gomez, Steven

    Marine renewable energy devices require mooring and foundation systems that are suitable in terms of device operation and are also robust and cost-effective. In the initial stages of mooring and foundation development a large number of possible configuration permutations exist. Filtering of unsuitable designs is possible using information specific to the deployment site (i.e. bathymetry, environmental conditions) and device (i.e. mooring and/or foundation system role and cable connection requirements). The identification of a final solution requires detailed analysis, which includes load cases based on extreme environmental statistics following certification guidance processes. Static and/or quasi-static modelling of the mooring and/or foundation system serves as an intermediate design filtering stage enabling dynamic time-domain analysis to be focused on a small number of potential configurations. Mooring and foundation design is therefore reliant on logical decision making throughout this stage-gate process. The open-source DTOcean (Optimal Design Tools for Ocean Energy Arrays) Tool includes a mooring and foundation module, which automates the configuration selection process for fixed and floating wave and tidal energy devices. As far as the authors are aware, this is one of the first tools to be developed for the purpose of identifying potential solutions during the initial stages of marine renewable energy design. While the mooring and foundation module does not replace a full design assessment, it provides, in addition to suitable configuration solutions, assessments in terms of reliability, economics and environmental impact. This article provides insight into the solution identification approach used by the module and features the verification of both the mooring system calculations and the foundation design using commercial software. Several case studies are investigated: a floating wave energy converter and several anchoring systems. It is demonstrated that the mooring and foundation module is able to provide device and/or site developers with rapid mooring and foundation design solutions to appropriate design criteria.

  17. A Methodology for Evaluating Artifacts Produced by a Formal Verification Process

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette

    2011-01-01

    The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.

  18. Self-Verification of Ability through Biased Performance Memory.

    ERIC Educational Resources Information Center

    Karabenick, Stuart A.; LeBlanc, Daniel

    Evidence points to a pervasive tendency for persons to behave to maintain their existing cognitive structures. One strategy by which this self-verification is made more probable involves information processing. Through attention, encoding and retrieval, and the interpretation of events, persons process information so that self-confirmatory…

  19. Self-verification motives at the collective level of self-definition.

    PubMed

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  20. Expertise Development With Different Types of Automation: A Function of Different Cognitive Abilities.

    PubMed

    Jipp, Meike

    2016-02-01

    I explored whether different cognitive abilities (information-processing ability, working-memory capacity) are needed for expertise development when different types of automation (information vs. decision automation) are employed. It is well documented that expertise development and the employment of automation lead to improved performance. Here, it is argued that a learner's ability to reason about an activity may be hindered by the employment of information automation. Additional feedback needs to be processed, thus increasing the load on working memory and decelerating expertise development. By contrast, the employment of decision automation may stimulate reasoning, increase the initial load on information-processing ability, and accelerate expertise development. Authors of past research have not investigated the interrelations between automation assistance, individual differences, and expertise development. Sixty-one naive learners controlled simulated air traffic with two types of automation: information automation and decision automation. Their performance was captured across 16 trials. Well-established tests were used to assess information-processing ability and working-memory capacity. As expected, learners' performance benefited from expertise development and decision automation. Furthermore, individual differences moderated the effect of the type of automation on expertise development: The employment of only information automation increased the load on working memory during later expertise development. The employment of decision automation initially increased the need to process information. These findings highlight the importance of considering individual differences and expertise development when investigating human-automation interaction. The results are relevant for selecting automation configurations for expertise development. © 2015, Human Factors and Ergonomics Society.

  1. Students' Use of Technological Tools for Verification Purposes in Geometry Problem Solving

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Dagdilelis, Vassilios

    2008-01-01

    Despite its importance in mathematical problem solving, verification receives rather little attention by the students in classrooms, especially at the primary school level. Under the hypotheses that (a) non-standard tasks create a feeling of uncertainty that stimulates the students to proceed to verification processes and (b) computational…

  2. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting cleaning/verification media.

  3. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  4. An Overview of NASA's IM&S Verification and Validation Process Plan and Specification for Space Exploration

    NASA Technical Reports Server (NTRS)

    Gravitz, Robert M.; Hale, Joseph

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will discuss NASA's approach for verification and validation (V&V) of its models or simulations supporting space exploration. This presentation will describe NASA's V&V process and the associated M&S verification and validation (V&V) activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.

  5. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential insertion of errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated in the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter-weight approach to formal verification, in which the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
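
    The monitoring idea can be illustrated with a small offline checker that scans a sampled analog trace against a timed property; the property, thresholds, and trace below are assumptions chosen only to show the mechanism.

        # Offline trace monitor for a timed property: "whenever v(t) exceeds
        # VMAX it must fall back below within DEADLINE seconds" (values assumed).
        VMAX, DEADLINE = 1.8, 0.005   # volts, seconds

        def monitor(trace):
            """trace: (t, v) samples in increasing t; returns violation start times."""
            violations, over_since = [], None
            for t, v in trace:
                if v > VMAX:
                    if over_since is None:
                        over_since = t
                    elif t - over_since > DEADLINE:
                        violations.append(over_since)
                        over_since = None
                else:
                    over_since = None
            return violations

        # 1 kHz samples with an 8 ms overshoot starting at t = 3 ms.
        trace = [(i * 1e-3, 1.9 if 3 <= i <= 10 else 1.0) for i in range(20)]
        print(monitor(trace))   # reports the start time of the too-long overshoot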

  6. Improvement and automation of a real-time PCR assay for vaginal fluids.

    PubMed

    De Vittori, E; Giampaoli, S; Barni, F; Baldi, M; Berti, A; Ripani, L; Romano Spica, V

    2016-05-01

    The identification of vaginal fluids is crucial in forensic science. Several molecular protocols based on PCR amplification of mfDNA (microflora DNA) specific for vaginal bacteria are now available. Unfortunately, mfDNA extraction and PCR reactions require manual optimization of several steps. The aim of the present study was to verify a partially automated workflow for vaginal fluid identification using two instruments widely available in forensic laboratories: the EZ1 Advanced robot and the Rotor Gene Q 5Plex HRM. Moreover, taking advantage of the 5-plex thermocycler technology, the performance of the ForFluid kit was improved by expanding the mfDNA characterization panel with a new bacterial target for vaginal fluids and with an internal positive control (IPC) to monitor PCR inhibition. The results underlined the feasibility of semi-automated mfDNA extraction using a BioRobot and demonstrated the analytical improvements of the kit. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
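
    A toy version of the configuration invariant checks mentioned above might look like the following; the mission-leg fields and limits are hypothetical and do not reflect the actual Autosub 6000 script format.

        # Toy pre-mission invariant checks; field names and limits are hypothetical.
        MAX_DEPTH_M, MAX_LEG_HOURS = 6000, 24

        def check_mission(legs):
            """legs: list of dicts with 'depth_m' and 'hours'; returns violations."""
            errors = []
            for i, leg in enumerate(legs):
                if leg["depth_m"] > MAX_DEPTH_M:
                    errors.append(f"leg {i}: depth {leg['depth_m']} m exceeds limit")
                if leg["hours"] > MAX_LEG_HOURS:
                    errors.append(f"leg {i}: duration {leg['hours']} h exceeds limit")
            return errors

        print(check_mission([{"depth_m": 4500, "hours": 6},
                             {"depth_m": 6500, "hours": 3}]))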

  8. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  9. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  10. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  11. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  12. 21 CFR 120.25 - Process verification for certain processors.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...

  13. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  14. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  15. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  16. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  17. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  18. Verification Processes in Recognition Memory: The Role of Natural Language Mediators

    ERIC Educational Resources Information Center

    Marshall, Philip H.; Smith, Randolph A. S.

    1977-01-01

    The existence of verification processes in recognition memory was confirmed in the context of Adams' (Adams & Bray, 1970) closed-loop theory. Subjects' recognition was tested following a learning session. The expectation was that data would reveal consistent internal relationships supporting the position that natural language mediation plays…

  19. The Automation of Reserve Processing.

    ERIC Educational Resources Information Center

    Self, James

    1985-01-01

    Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.…

  20. Arc Second Alignment of International X-Ray Observatory Mirror Segments in a Fixed Structure

    NASA Technical Reports Server (NTRS)

    Evans, Tyler C.; Chan, Kai-Wing

    2009-01-01

    The optics for the International X-Ray Observatory (IXO) require alignment and integration of about fourteen thousand thin mirror segments to achieve the mission goal of 3.0 square meters of effective area at 1.25 keV with an angular resolution of five arc seconds. These mirror segments are 0.4 mm thick, and 200 to 400 mm in size, which makes it hard not to impart distortion at the sub-arc-second level. This paper outlines the precise alignment, verification testing, and permanent bonding techniques developed at NASA's Goddard Space Flight Center (GSFC). These techniques are used to overcome the challenge of transferring thin mirror segments from a temporary mount to a fixed structure with arc second alignment and minimal figure distortion. Recent advances in technology development in addition to the automation of several processes have produced significant results. This paper will highlight the recent advances in alignment, testing, and permanent bonding techniques as well as the results they have produced.

  1. Arc-Second Alignment of International X-Ray Observatory Mirror Segments in a Fixed Structure

    NASA Technical Reports Server (NTRS)

    Evans, Tyler C.; Chan, Kai-Wing; Saha, Timo T.

    2010-01-01

    The optics for the International X-Ray Observatory (IXO) require alignment and integration of about fourteen thousand thin mirror segments to achieve the mission goal of 3.0 square meters of effective area at 1.25 keV with an angular resolution of five arc-seconds. These mirror segments are 0.4 mm thick, and 200 to 400 mm in size, which makes it hard not to impart distortion at the sub-arc-second level. This paper outlines the precise alignment, verification testing, and permanent bonding techniques developed at NASA's Goddard Space Flight Center (GSFC). These techniques are used to overcome the challenge of transferring thin mirror segments from a temporary mount to a fixed structure with arc-second alignment and minimal figure distortion. Recent advances in technology development in addition to the automation of several processes have produced significant results. This paper will highlight the recent advances in alignment, testing, and permanent bonding techniques as well as the results they have produced.

  2. Evaluating a variety of text-mined features for automatic protein function prediction with GOstruct.

    PubMed

    Funk, Christopher S; Kahanda, Indika; Ben-Hur, Asa; Verspoor, Karin M

    2015-01-01

    Most computational methods that predict protein function do not take advantage of the large amount of information contained in the biomedical literature. In this work we evaluate both ontology term co-mention and bag-of-words features mined from the biomedical literature and analyze their impact in the context of a structured output support vector machine model, GOstruct. We find that even simple literature based features are useful for predicting human protein function (F-max: Molecular Function =0.408, Biological Process =0.461, Cellular Component =0.608). One advantage of using literature features is their ability to offer easy verification of automated predictions. We find through manual inspection of misclassifications that some false positive predictions could be biologically valid predictions based upon support extracted from the literature. Additionally, we present a "medium-throughput" pipeline that was used to annotate a large subset of co-mentions; we suggest that this strategy could help to speed up the rate at which proteins are curated.
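
    A pared-down sketch of the bag-of-words branch follows: literature text associated with each protein becomes word-count features and a linear SVM predicts a single GO term as a binary label; GOstruct itself uses a structured-output SVM over many terms, so this single-term version and the toy sentences are simplifications.

        # Word-count features from literature text plus a linear SVM for one GO
        # term; the sentences and labels are toy examples.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.svm import LinearSVC

        docs = [
            "kinase activity phosphorylates substrate in signal transduction",
            "dna binding transcription factor regulates gene expression",
            "membrane transporter moves ions across the plasma membrane",
            "protein kinase cascade activates downstream signaling",
        ]
        has_kinase_activity = [1, 0, 0, 1]   # toy labels for a single GO term

        X = CountVectorizer().fit_transform(docs)
        clf = LinearSVC().fit(X, has_kinase_activity)
        print(clf.predict(X))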

  3. Neural Network Target Identification System for False Alarm Reduction

    NASA Technical Reports Server (NTRS)

    Ye, David; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin

    2009-01-01

    A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feed-forward back-propagation neural network (NN) is then trained to classify each feature vector and remove false positives. This paper discusses the testing of system performance and the parameter optimization process that adapts the system to various targets and datasets. The test results show that the system was successful in substantially reducing the false positive rate when tested on a sonar image dataset.
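
    The verification stage described above can be sketched as follows: each candidate region of interest is reduced to a short feature vector by coarse binning and a feed-forward network separates targets from false alarms; the ROI size, bin count, and synthetic data are assumptions.

        # Coarse binning of each ROI into a short feature vector, then a
        # feed-forward network separates targets from false alarms (sizes assumed).
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def binned_features(roi, bins=4):
            """Average an ROI over a bins x bins grid to form a feature vector."""
            h, w = roi.shape
            trimmed = roi[:h - h % bins, :w - w % bins]
            blocks = trimmed.reshape(bins, trimmed.shape[0] // bins,
                                     bins, trimmed.shape[1] // bins)
            return blocks.mean(axis=(1, 3)).ravel()

        rng = np.random.default_rng(3)
        rois = rng.random((100, 32, 32))        # stand-in ROIs from the detector
        is_target = rng.integers(0, 2, 100)     # 1 = real target, 0 = false alarm

        X = np.array([binned_features(r) for r in rois])
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, is_target)
        rejected = int(((clf.predict(X) == 0) & (is_target == 0)).sum())
        print("false alarms correctly rejected:", rejected)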

  4. Advanced Cell Classifier: User-Friendly Machine-Learning-Based Software for Discovering Phenotypes in High-Content Imaging Data.

    PubMed

    Piccinini, Filippo; Balassa, Tamas; Szkalisity, Abel; Molnar, Csaba; Paavolainen, Lassi; Kujala, Kaisa; Buzas, Krisztina; Sarazova, Marie; Pietiainen, Vilja; Kutay, Ulrike; Smith, Kevin; Horvath, Peter

    2017-06-28

    High-content, imaging-based screens now routinely generate data on a scale that precludes manual verification and interrogation. Software applying machine learning has become an essential tool to automate analysis, but these methods require annotated examples to learn from. Efficiently exploring large datasets to find relevant examples remains a challenging bottleneck. Here, we present Advanced Cell Classifier (ACC), a graphical software package for phenotypic analysis that addresses these difficulties. ACC applies machine-learning and image-analysis methods to high-content data generated by large-scale, cell-based experiments. It features methods to mine microscopic image data, discover new phenotypes, and improve recognition performance. We demonstrate that these features substantially expedite the training process, successfully uncover rare phenotypes, and improve the accuracy of the analysis. ACC is extensively documented, designed to be user-friendly for researchers without machine-learning expertise, and distributed as a free open-source tool at www.cellclassifier.org. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Electronic Data Interchange in Procurement

    DTIC Science & Technology

    1990-04-01

    contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated

  6. Automated Test for NASA CFS

    NASA Technical Reports Server (NTRS)

    McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy

    2015-01-01

    The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.

  7. Automated Verification of Design Patterns with LePUS3

    NASA Technical Reports Server (NTRS)

    Nicholson, Jonathan; Gasparis, Epameinondas; Eden, Ammon H.; Kazman, Rick

    2009-01-01

    Specification and [visual] modelling languages are expected to combine strong abstraction mechanisms with rigour, scalability, and parsimony. LePUS3 is a visual, object-oriented design description language axiomatized in a decidable subset of the first-order predicate logic. We demonstrate how LePUS3 is used to formally specify a structural design pattern and prove (verify) whether any Java 1.4 program satisfies that specification. We also show how LePUS3 specifications (charts) are composed and how they are verified fully automatically in the Two-Tier Programming Toolkit.

  8. Optical benchmarking of security document readers for automated border control

    NASA Astrophysics Data System (ADS)

    Valentín, Kristián; Wild, Peter; Štolc, Svorad; Daubner, Franz; Clabian, Markus

    2016-10-01

    Authentication and optical verification of travel documents upon crossing borders is of utmost importance for national security. Understanding the workflow and different approaches to ICAO 9303 travel document scanning in passport readers, as well as highlighting normalization issues and designing new methods to achieve better harmonization across inspection devices, are key steps for the development of more effective and efficient next-generation passport inspection. This paper presents a survey of state-of-the-art document inspection systems, showcasing results of a document reader challenge investigating 9 devices with regard to their optical characteristics.

  9. U. S. GEOLOGICAL SURVEY LAND REMOTE SENSING ACTIVITIES.

    USGS Publications Warehouse

    Frederick, Doyle G.

    1983-01-01

    USGS uses all types of remotely sensed data, in combination with other sources of data, to support geologic analyses, hydrologic assessments, land cover mapping, image mapping, and applications research. Survey scientists use all types of remotely sensed data with ground verifications and digital topographic and cartographic data. A considerable amount of research is being done by Survey scientists on developing automated geographic information systems that can handle a wide variety of digital data. The Survey is also investigating the use of microprocessor computer systems for accessing, displaying, and analyzing digital data.

  10. INDEPENDENT VERIFICATION SURVEY REPORT FOR ZONE 1 OF THE EAST TENNESSEE TECHNOLOGY PARK IN OAK RIDGE, TENNESSEE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  11. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinary accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for a material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  12. Rapid-Viability PCR Method for Detection of Live, Virulent Bacillus anthracis in Environmental Samples ▿

    PubMed Central

    Létant, Sonia E.; Murphy, Gloria A.; Alfaro, Teneile M.; Avila, Julie R.; Kane, Staci R.; Raber, Ellen; Bunt, Thomas M.; Shah, Sanjiv R.

    2011-01-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples. PMID:21764960
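
    The rapid-viability decision rule can be illustrated with a toy function that scores a sample as live when the cycle threshold drops sufficiently after incubation; the cutoff of 6 cycles and the example Ct values are assumptions, not the published criteria.

        # Toy viability decision: live organisms are indicated when the Ct value
        # drops enough after incubation; the 6-cycle cutoff is an assumption.
        DELTA_CT_CUTOFF = 6.0

        def is_viable(ct_before, ct_after):
            """Ct near 45 is commonly recorded when no amplification is observed."""
            return (ct_before - ct_after) >= DELTA_CT_CUTOFF

        print(is_viable(ct_before=45.0, ct_after=31.2))   # True: strong growth signal
        print(is_viable(ct_before=33.0, ct_after=32.1))   # False: no meaningful change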

  13. Expert system decision support for low-cost launch vehicle operations

    NASA Technical Reports Server (NTRS)

    Szatkowski, G. P.; Levin, Barry E.

    1991-01-01

    Progress in assessing the feasibility, benefits, and risks associated with AI expert systems applied to low cost expendable launch vehicle systems is described. Part one identified potential application areas in vehicle operations and on-board functions, assessed measures of cost benefit, and identified key technologies to aid in the implementation of decision support systems in this environment. Part two of the program began the development of prototypes to demonstrate real-time vehicle checkout with controller and diagnostic/analysis intelligent systems and to gather true measures of cost savings vs. conventional software, verification and validation requirements, and maintainability improvement. The main objective of the expert advanced development projects was to provide a robust intelligent system for control/analysis that must be performed within a specified real-time window in order to meet the demands of the given application. The efforts to develop the two prototypes are described. Prime emphasis was on a controller expert system to show real-time performance in a cryogenic propellant loading application and safety validation implementation of this system experimentally, using commercial-off-the-shelf software tools and object-oriented programming techniques. This smart ground support equipment prototype is based in C with embedded expert system rules written in the CLIPS protocol. The relational database, ORACLE, provides non-real-time data support. The second demonstration develops the vehicle/ground intelligent automation concept, from phase one, to show cooperation between multiple expert systems. This automated test conductor (ATC) prototype utilizes a knowledge-bus approach for intelligent information processing by use of virtual sensors and blackboards to solve complex problems. It incorporates distributed processing of real-time data and object-oriented techniques for command, configuration control, and auto-code generation.

  14. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  15. SU-F-T-65: Automatic Treatment Planning for High-Dose Rate (HDR) Brachytherapy with a Vaginal Cylinder Applicator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Y; Tan, J; Jiang, S

    Purpose: High dose rate (HDR) brachytherapy treatment planning is conventionally performed in a manual fashion. Yet it is highly desirable to perform computerized automated planning to improve treatment planning efficiency, eliminate human errors, and reduce plan quality variation. The goal of this research is to develop an automatic treatment planning tool for HDR brachytherapy with a cylinder applicator for vaginal cancer. Methods: After inserting the cylinder applicator into the patient, a CT scan was acquired and loaded into in-house developed treatment planning software. The cylinder applicator was automatically segmented using image-processing techniques. The CTV was generated based on user-specified treatment depth and length. Locations of relevant points (apex point, prescription point, and vaginal surface point), central applicator channel coordinates, and dwell positions were determined according to their geometric relations with the applicator. Dwell time was computed through an inverse optimization process. The planning information was written into DICOM-RT plan and structure files to transfer the automatically generated plan to a commercial treatment planning system for plan verification and delivery. Results: We tested the system retrospectively in nine patients treated with a vaginal cylinder applicator. These cases were selected with different treatment prescriptions, lengths, depths, and cylinder diameters to represent a large patient population. Our system was able to generate treatment plans for these cases with clinically acceptable quality. Computation time varied from 3–6 min. Conclusion: We have developed a system to perform automated treatment planning for HDR brachytherapy with a cylinder applicator. Such a novel system has greatly improved treatment planning efficiency and reduced plan quality variation. It also serves as a testbed to demonstrate the feasibility of automatic HDR treatment planning for more complicated cases.
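
    The abstract describes dwell-time computation as an inverse optimization problem but gives no formulation; the sketch below illustrates one common textbook-style approach (non-negative least squares against a simplified inverse-square dose kernel). The geometry, prescription values, and kernel are invented for illustration and are not the authors' algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# Illustrative geometry (cm): dwell positions along the cylinder axis and
# prescription points at an assumed depth from the applicator surface.
dwell_z = np.arange(0.0, 5.0, 0.5)      # 10 dwell positions
point_z = np.arange(0.25, 5.0, 0.5)     # prescription points
radial_offset = 1.5                     # assumed cylinder radius + treatment depth

def dose_kernel(dwell, point, offset):
    """Simplified point-source kernel ~ 1/r^2 (ignores anisotropy and units)."""
    r2 = (dwell - point) ** 2 + offset ** 2
    return 1.0 / r2

# Dose-rate matrix: dose at point i per unit dwell time at position j.
A = np.array([[dose_kernel(dz, pz, radial_offset) for dz in dwell_z]
              for pz in point_z])

prescription = np.full(len(point_z), 7.0)   # e.g. 7 Gy to each point (illustrative)

# Non-negative least squares gives dwell times that best match the prescription.
dwell_times, residual = nnls(A, prescription)
print("dwell times (arbitrary units):", np.round(dwell_times, 2))
print("residual:", round(residual, 4))
```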

  16. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the required extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  17. Enriching regulatory networks by bootstrap learning using optimised GO-based gene similarity and gene links mined from PubMed abstracts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.

    2011-02-18

    Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
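
    As a schematic illustration of the bootstrapping idea (iteratively adding gene-gene links whose similarity exceeds a threshold that is re-derived from the current network), here is a minimal Python sketch; the similarity scores, threshold rule, and gene names are placeholders rather than the GO-based XOA measure used in the paper.

```python
# Minimal sketch of iterative network expansion by a similarity threshold.
# The similarity scores below are placeholders, not GO-based values.

seed_network = {("geneA", "geneB"), ("geneB", "geneC")}
candidate_similarity = {
    ("geneA", "geneC"): 0.82,
    ("geneC", "geneD"): 0.74,
    ("geneA", "geneD"): 0.40,
}

def optimize_threshold(network, scores):
    """Toy 'optimization': mean similarity of candidates touching the network."""
    members = {gene for edge in network for gene in edge}
    touching = [s for pair, s in scores.items() if any(g in members for g in pair)]
    return sum(touching) / len(touching) if touching else 1.0

network = set(seed_network)
for iteration in range(5):
    threshold = optimize_threshold(network, candidate_similarity)
    new_links = {pair for pair, s in candidate_similarity.items()
                 if s >= threshold and pair not in network}
    if not new_links:
        break
    network |= new_links
    print(f"iteration {iteration}: threshold={threshold:.2f}, added={sorted(new_links)}")

print("final network:", sorted(network))
```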

  18. Jeagle: a JAVA Runtime Verification Tool

    NASA Technical Reports Server (NTRS)

    DAmorim, Marcelo; Havelund, Klaus

    2005-01-01

    We introduce the temporal logic Jeagle and its supporting tool for runtime verification of Java programs. A monitor for a Jeagle formula checks if a finite trace of program events satisfies the formula. Jeagle is a programming-oriented extension of the powerful rule-based Eagle logic, which has been shown to be capable of defining and implementing a range of finite-trace monitoring logics, including future and past time temporal logic, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace. Jeagle extends Eagle with constructs for capturing parameterized program events such as method calls and method returns. Parameters can be the objects that methods are called upon, arguments to methods, and return values. Jeagle allows one to refer to these in formulas. The tool performs automated program instrumentation using AspectJ. We show the transformational semantics of Jeagle.
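
    Jeagle's own syntax is not shown in the abstract; as a generic illustration of state-by-state monitoring of parameterized call/return events without storing the trace, the Python sketch below checks a made-up property - every monitored call eventually returns, and the return value is non-negative - over a stream of events. The event names and the property are hypothetical, not Jeagle formulas.

```python
# Generic runtime monitor for parameterized call/return events.
# Processes one event at a time; only pending calls are kept, not the trace.

class CallReturnMonitor:
    def __init__(self):
        self.pending = {}      # (method, call_id) -> arguments
        self.violations = []

    def on_event(self, kind, method, call_id, payload=None):
        if kind == "call":
            self.pending[(method, call_id)] = payload
        elif kind == "return":
            self.pending.pop((method, call_id), None)
            if isinstance(payload, (int, float)) and payload < 0:
                self.violations.append(f"{method}#{call_id} returned {payload} < 0")

    def at_end(self):
        for (method, call_id) in self.pending:
            self.violations.append(f"{method}#{call_id} never returned")
        return self.violations


if __name__ == "__main__":
    mon = CallReturnMonitor()
    trace = [
        ("call", "withdraw", 1, {"amount": 50}),
        ("return", "withdraw", 1, 50),
        ("call", "withdraw", 2, {"amount": 80}),
        ("return", "withdraw", 2, -1),
        ("call", "deposit", 3, {"amount": 20}),   # never returns in this trace
    ]
    for event in trace:
        mon.on_event(*event)
    print(mon.at_end())
```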

  19. A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks.

    PubMed

    Baig, Ahmed Fraz; Hassan, Khwaja Mansoor Ul; Ghani, Anwar; Chaudhry, Shehzad Ashraf; Khan, Imran; Ashraf, Muhammad Usman

    2018-01-01

    Global Mobility Networks (GLOMONETs) in wireless communication permit global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols.
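
    The abstract does not reproduce the protocol's messages; purely as a generic illustration of the kind of lightweight primitives such schemes typically build on (a hashed pseudonym for anonymity plus an HMAC over a fresh nonce, combining a password factor with a stored secret), here is a Python sketch. It is not the scheme proposed in the paper and it omits the roaming/foreign-agent structure entirely; all values are placeholders.

```python
import hashlib
import hmac
import secrets

# Toy two-factor material: a long-term secret (e.g. stored on a smart card)
# and a password. Both values are illustrative placeholders.
card_secret = b"card-secret-material"
password = b"user-password"

def two_factor_key(secret, pwd):
    """Derive a key from both factors (illustrative derivation only)."""
    return hashlib.sha256(secret + hashlib.sha256(pwd).digest()).digest()

def pseudonym(identity, nonce):
    """Fresh pseudonym so the real identity never travels in the clear."""
    return hashlib.sha256(identity + nonce).hexdigest()

# --- user side ---------------------------------------------------------
identity = b"alice@home-network"
key = two_factor_key(card_secret, password)
user_nonce = secrets.token_bytes(16)
alias = pseudonym(identity, user_nonce)
tag = hmac.new(key, alias.encode() + user_nonce, hashlib.sha256).hexdigest()

# --- server side (shares the same two-factor key in this toy model) ----
expected = hmac.new(key, alias.encode() + user_nonce, hashlib.sha256).hexdigest()
print("authenticated:", hmac.compare_digest(tag, expected))
```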

  20. A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks

    PubMed Central

    2018-01-01

    Global Mobility Networks (GLOMONETs) in wireless communication permit global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity and local password verification, provides inefficient user authentication, and is vulnerable to replay and DoS attacks. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols. PMID:29702675

  1. Towards a Compositional SPIN

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Giannakopoulou, Dimitra

    2006-01-01

    This paper discusses our initial experience with introducing automated assume-guarantee verification based on learning in the SPIN tool. We believe that compositional verification techniques such as assume-guarantee reasoning could complement the state-reduction techniques that SPIN already supports, thus increasing the size of systems that SPIN can handle. We present a "light-weight" approach to evaluating the benefits of learning-based assume-guarantee reasoning in the context of SPIN: we turn our previous implementation of learning for the LTSA tool into a main program that externally invokes SPIN to provide the model checking-related answers. Despite its performance overheads (which mandate a future implementation within SPIN itself), this approach provides accurate information about the savings in memory. We have experimented with several versions of learning-based assume guarantee reasoning, including a novel heuristic introduced here for generating component assumptions when their environment is unavailable. We illustrate the benefits of learning-based assume-guarantee reasoning in SPIN through the example of a resource arbiter for a spacecraft. Keywords: assume-guarantee reasoning, model checking, learning.

  2. Why Verifying Diagnostic Decisions with a Checklist Can Help: Insights from Eye Tracking

    ERIC Educational Resources Information Center

    Sibbald, Matthew; de Bruin, Anique B. H.; Yu, Eric; van Merrienboer, Jeroen J. G.

    2015-01-01

    Making a diagnosis involves ratifying or verifying a proposed answer. Formalizing this verification process with checklists, which highlight key variables involved in the diagnostic decision, is often advocated. However, the mechanisms by which a checklist might allow clinicians to improve their verification process have not been well studied. We…

  3. Pilot Guidelines for Improving Instructional Materials Through the Process of Learner Verification and Revision.

    ERIC Educational Resources Information Center

    Educational Products Information Exchange Inst., Stony Brook, NY.

    Learner Verification and Revision (LVR) Process of Instructional Materials is an ongoing effort for the improvement of instructional materials based on systematic feedback from learners who have used the materials. This evaluation gives publishers a method of identifying instructional strengths and weaknesses of a product and provides an…

  4. 77 FR 28401 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... and natural gas resources in a manner that is consistent with the need to make such resources... to prevent or minimize the likelihood of blowouts, loss of well control, fires, spillages, physical... the environment or to property, or endanger life or health.'' BSEE's Legacy Data Verification Process...

  5. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  6. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  7. Computer-aided design for metabolic engineering.

    PubMed

    Fernández-Castané, Alfred; Fehér, Tamás; Carbonell, Pablo; Pauthenier, Cyrille; Faulon, Jean-Loup

    2014-12-20

    The development and application of biotechnology-based strategies has had a great socio-economic impact and is likely to play a crucial role in the foundation of more sustainable and efficient industrial processes. Within biotechnology, metabolic engineering aims at the directed improvement of cellular properties, often with the goal of synthesizing a target chemical compound. The use of computer-aided design (CAD) tools, along with the continuously emerging advanced genetic engineering techniques, has allowed metabolic engineering to broaden and streamline the process of heterologous compound production. In this work, we review the CAD tools available for metabolic engineering with an emphasis on retrosynthesis methodologies. Recent advances in genetic engineering strategies for pathway implementation and optimization are also reviewed, as well as a range of bioanalytical tools to validate in silico predictions. A case study applying retrosynthesis is presented as an experimental verification of the output from Retropath, the first complete automated computational pipeline applicable to metabolic engineering. Applying this CAD pipeline, together with genetic reassembly and optimization of culture conditions, led to improved production of the plant flavonoid pinocembrin. Coupling CAD tools with advanced genetic engineering strategies and bioprocess optimization is crucial for enhanced product yields and will be of great value for the development of non-natural products through sustainable biotechnological processes. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

    Verification and validation is an important part of software development and accounts for a significant share of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs so that the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing the measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  9. Development of microtitre plates for electrokinetic assays

    NASA Astrophysics Data System (ADS)

    Burt, J. P. H.; Goater, A. D.; Menachery, A.; Pethig, R.; Rizvi, N. H.

    2007-02-01

    Electrokinetic processes have wide ranging applications in microsystems technology. Their optimum performance at micro and nano dimensions allows their use both as characterization and diagnostic tools and as a means of general particle manipulation. Within analytical studies, measurement of the electrokinesis of biological cells has the sensitivity and selectivity to distinguish subtle differences between cell types and cells undergoing changes and is gaining acceptance as a diagnostic tool in high throughput screening for drug discovery applications. In this work the development and manufacture of an electrokinetic-based microtitre plate is described. The plate is intended to be compatible with automated sample loading and handling systems. Manufacturing of the microtitre plate, which employs indium tin oxide microelectrodes, has been entirely undertaken using excimer and ultra-fast pulsed laser micromachining due to its flexibility in materials processing and accuracy in microstructuring. Laser micromachining has the ability to rapidly realize iterations in device prototype design while also having the capability to be scaled up for large scale manufacture. Device verification is achieved by the measurement of the electrorotation and dielectrophoretic properties of yeast cells while the flexibility of the developed microtitre plate is demonstrated by the selective separation of live yeast from polystyrene microbeads.

  10. Adaptive Algorithms for Automated Processing of Document Images

    DTIC Science & Technology

    2011-01-01

    Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy dissertation, 2011.

  11. Automated processing of endoscopic surgical instruments.

    PubMed

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  12. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
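
    The abstract describes the installation check as running 25 sample problems and comparing each new output file with a stored reference; a minimal Python sketch of that kind of comparison harness is shown below. The directory layout, file names, and the line filter are assumptions, and real MCNP outputs would need run-specific fields such as timestamps masked before comparing.

```python
from pathlib import Path

def significant_lines(path, skip_prefixes=("date", "time", "version")):
    """Yield lines worth comparing; skip lines that legitimately differ per run."""
    for line in Path(path).read_text().splitlines():
        if not line.strip().lower().startswith(skip_prefixes):
            yield line.rstrip()

def compare_outputs(new_dir, ref_dir, n_problems=25):
    """Return the sample problems whose outputs differ from the reference set."""
    failures = []
    for i in range(1, n_problems + 1):
        new = list(significant_lines(Path(new_dir) / f"sample{i:02d}.out"))
        ref = list(significant_lines(Path(ref_dir) / f"sample{i:02d}.out"))
        if new != ref:
            failures.append(i)
    return failures

if __name__ == "__main__":
    diffs = compare_outputs("new_outputs", "reference_outputs")
    print("verification errors in problems:", diffs or "none")
```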

  13. Proof-of-concept automation of propellant processing

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  14. Comparison of two automated instruments for Epstein-Barr virus serology in a large adult hospital and implementation of an Epstein-Barr virus nuclear antigen-based testing algorithm.

    PubMed

    Al Sidairi, Hilal; Binkhamis, Khalifa; Jackson, Colleen; Roberts, Catherine; Heinstein, Charles; MacDonald, Jimmy; Needle, Robert; Hatchette, Todd F; LeBlanc, Jason J

    2017-11-01

    Serology remains the mainstay for diagnosis of Epstein-Barr virus (EBV) infection. This study compared two automated platforms (BioPlex 2200 and Architect i2000SR) to test three EBV serological markers: viral capsid antigen (VCA) immunoglobulins of class M (IgM), VCA immunoglobulins of class G (IgG) and EBV nuclear antigen-1 (EBNA-1) IgG. Using sera from 65 patients at various stages of EBV disease, BioPlex demonstrated near-perfect agreement for all EBV markers compared to a consensus reference. The agreement for Architect was near-perfect for VCA IgG and EBNA-1 IgG, and substantial for VCA IgM despite five equivocal results. Since the majority of testing in our hospital was from adults with EBNA-1 IgG positive results, post-implementation analysis of an EBNA-based algorithm showed advantages over parallel testing of the three serologic markers. This small verification demonstrated that both automated systems for EBV serology had good performance for all EBV markers, and an EBNA-based testing algorithm is ideal for an adult hospital.
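
    The exact reflex rules of the implemented EBNA-based algorithm are not spelled out in the abstract; as a simplified, hypothetical illustration (not clinical guidance, and not necessarily the exact algorithm used at the study site), the sketch below tests EBNA-1 IgG first and only reflexes to VCA IgM/IgG when EBNA-1 IgG is negative, following the common textbook interpretation of the three markers.

```python
def ebv_reflex_interpretation(ebna1_igg, vca_igm=None, vca_igg=None):
    """Simplified EBNA-first reflex interpretation (illustrative only).

    Each argument is True (reactive), False (non-reactive), or None (not yet tested).
    """
    if ebna1_igg:
        return "EBNA-1 IgG positive: consistent with past EBV infection"
    if vca_igm is None or vca_igg is None:
        return "EBNA-1 IgG negative: reflex to VCA IgM and VCA IgG"
    if vca_igm:
        return "VCA IgM positive, EBNA-1 IgG negative: suggests acute/recent infection"
    if vca_igg:
        return "VCA IgG positive only: possible recent or past infection; consider repeat testing"
    return "All markers negative: no serologic evidence of EBV infection (susceptible)"


if __name__ == "__main__":
    print(ebv_reflex_interpretation(ebna1_igg=True))
    print(ebv_reflex_interpretation(ebna1_igg=False))
    print(ebv_reflex_interpretation(ebna1_igg=False, vca_igm=True, vca_igg=True))
```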

  15. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
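
    Bandera's abstraction declarations are not reproduced in the abstract; the classic example from abstract interpretation on which such data abstraction is based is a signs abstraction of integers, sketched below in Python. The token names and the operation table are a generic textbook version, not Bandera syntax.

```python
# Classic 'signs' data abstraction: each concrete int maps to NEG, ZERO, or POS,
# and abstract addition returns the set of possible abstract results.

NEG, ZERO, POS = "NEG", "ZERO", "POS"

def alpha(n):
    """Abstraction function: concrete integer -> abstract token."""
    return ZERO if n == 0 else (POS if n > 0 else NEG)

ADD = {  # abstract addition; a set means the result is not uniquely determined
    (NEG, NEG): {NEG},   (NEG, ZERO): {NEG},   (NEG, POS): {NEG, ZERO, POS},
    (ZERO, NEG): {NEG},  (ZERO, ZERO): {ZERO}, (ZERO, POS): {POS},
    (POS, NEG): {NEG, ZERO, POS}, (POS, ZERO): {POS}, (POS, POS): {POS},
}

def abstract_add(a, b):
    return ADD[(a, b)]

if __name__ == "__main__":
    # Soundness spot-check: the abstraction of x+y is always contained
    # in the abstract result of alpha(x) ADD alpha(y).
    for x, y in [(-3, 5), (0, 7), (-2, -2), (4, -4)]:
        assert alpha(x + y) in abstract_add(alpha(x), alpha(y))
    print("abstract_add(POS, NEG) =", abstract_add(POS, NEG))
```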

  16. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.

  17. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  18. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure the pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-confirm the contour image versus the target layout, post-OPC verification solutions continue to be developed - contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors by excluding false errors is the most important requirement for an accurate and fast verification process - it saves not only review time and engineering resources, but also overall wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for the case of CC checks. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through the optimization of biasing rules for different pitches and shapes of metal lines, we could obtain more accurate and efficient verification results and decrease the time needed to review and find real errors. In this paper, we present a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.
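
    The paper's actual bias table is not given in the abstract; the sketch below illustrates the general idea of applying a space-dependent final-CD bias to drawn metal widths before checking contact/via coverage. The space breakpoints, bias values, and enclosure margin are invented placeholders, not the authors' rules.

```python
# Space-dependent CD bias rule for metal lines (placeholder numbers, in nm).
# Dense lines tend to etch differently than isolated ones, so the bias applied
# to the drawn width depends on the space to the nearest neighbor.
BIAS_TABLE = [
    (0,    80,  -6),   # space 0-80 nm   -> shrink width by 6 nm
    (80,   200, -3),   # space 80-200 nm -> shrink by 3 nm
    (200,  500,  0),   # medium pitch    -> no bias
    (500,  None, +4),  # isolated lines  -> widen by 4 nm
]

def biased_width(drawn_width_nm, space_to_neighbor_nm):
    """Return the estimated final metal width after applying the rule-based bias."""
    for low, high, bias in BIAS_TABLE:
        if space_to_neighbor_nm >= low and (high is None or space_to_neighbor_nm < high):
            return drawn_width_nm + bias
    return drawn_width_nm

def via_coverage_ok(via_size_nm, drawn_width_nm, space_nm, min_enclosure_nm=5):
    """Check that the biased metal still encloses the via by the required margin."""
    final = biased_width(drawn_width_nm, space_nm)
    return (final - via_size_nm) / 2.0 >= min_enclosure_nm

if __name__ == "__main__":
    print(biased_width(60, 70))          # dense line: 54 nm after bias
    print(via_coverage_ok(40, 60, 70))   # 7 nm enclosure per side -> True
```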

  19. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    PubMed

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of their impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this on the operation of the scheme and on its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.

  20. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in the health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that are mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
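
    The paper's graph formalism and analysis rules are not detailed in the abstract; as a minimal sketch of the kind of checks such a model enables, the Python below represents an SOP as a directed graph of steps and flags steps that are unreachable from the start or have no onward instruction (candidate omissions). The step names are hypothetical.

```python
from collections import deque

# Hypothetical SOP model: each step lists the steps that may follow it.
sop = {
    "assess_patient":   ["prepare_dose"],
    "prepare_dose":     ["administer_dose"],
    "administer_dose":  ["observe_reaction"],
    "observe_reaction": [],                  # no onward instruction: possible omission
    "document_consent": ["prepare_dose"],    # never referenced: unreachable
}
START = "assess_patient"

def reachable(graph, start):
    """Breadth-first search over the step graph from the starting step."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

if __name__ == "__main__":
    seen = reachable(sop, START)
    unreachable = sorted(set(sop) - seen)
    dead_ends = sorted(s for s in seen if not sop.get(s))
    print("unreachable steps (never triggered by the text):", unreachable)
    print("steps with no onward instruction (check for omissions):", dead_ends)
```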

  1. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detection instrument with a remote-control interface was studied and an automatic verification system for it was developed. By using extensible markup language (XML) to build the protocol instruction set and the data-analysis method database in the system software, a controllable design is realized that copes with the diversity of undisclosed device interfaces and protocols. A dynamic error compensation method based on cascading a signal generator with a fixed attenuator is proposed; it performs the role the fixed attenuator plays in traditional verification and improves the accuracy of the verification results. The operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while removing the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.

  2. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the commercial provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; a description of the RBA process and its products; and a description of the Shared Assurance Model.

  3. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um and 90nm, are used in the investigation. Although it has been shown that our OPC technology is robust for most cases, the variety of tape-outs with complicated design styles and technologies makes it difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could be costly in manufacturing - reticle, wafer processing, and, more importantly, production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of the process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development and post-OPC verification after production release of the OPC. Lastly, we discuss the differences between the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: 1) Accuracy: superior inspection algorithms, down to 1nm accuracy, with the new "pattern-based" approach; 2) High-speed performance: pattern-centric algorithms give the best full-chip inspection efficiency; 3) Powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow down to unique patterns/cells.

  4. Evaluation of STAT medication ordering process in a community hospital

    PubMed Central

    Walsh, Kim; Schwartz, Barbara

    Background: In most health care facilities, problems related to delays in STAT medication order processing time are of common concern. Objective: The purpose of this study was to evaluate processing time for STAT orders at Kimball Medical Center. Methods: All STAT orders were reviewed to determine processing time; order processing time was also stratified by physician order entry (physician-entered (PE) orders vs. non-physician-entered (NPE) orders). Collected data included medication ordered, indication, time ordered, time verified by pharmacist, time sent from pharmacy, and time charted as given to the patient. Results: A total of 502 STAT orders were reviewed and 389 orders were included for analysis. Overall, the median time was 29 minutes (IQR 16–63; p<0.0001). The time needed to process NPE orders was significantly less than that needed for PE orders (median 27 vs. 34 minutes; p=0.026). In terms of NPE orders, the median total time required to process STAT orders for medications available in the Automated Dispensing Devices (ADM) was within 30 minutes, while that required to process orders for medications not available in the ADM was significantly greater than 30 minutes. For PE orders, the median total time required to process orders for medications available in the ADM (i.e., not requiring pharmacy involvement) was significantly greater than 30 minutes [median time = 34 minutes (p<0.001)]. Conclusion: We conclude that STAT order processing time may be improved by increasing the availability of medications in the ADM and by pharmacy involvement in the verification process. PMID:27382418

  5. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    Since the Verification and Test Programme represents a significant part of the spacecraft development life cycle in terms of cost and time, this discussion very often has the objective of optimizing the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce a project's schedule and cost is generating a dialectic process inside the project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10: model philosophy, verification and test programme definition). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).

  6. Automated Chromium Plating Line for Gun Barrels

    DTIC Science & Technology

    1979-09-01

    …consistent pretreatments and bath dwell times. Some of the advantages of automated processing include increased productivity (an average of 20%) due to… when automated processing procedures are used. The current method of applying chromium electrodeposits to gun tubes is a manual, batch operation… currently practiced with rotary swaged gun tubes would substantially reduce the difficulties in automated processing.

  7. Characterisation

    DTIC Science & Technology

    2007-03-01

    Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. (Remaining record content is diagram residue listing commercialisation stages: concept/idea, proof-of-principle, trial samples, engineering verification samples, design verification samples.)

  8. Integration of mask and silicon metrology in DFM

    NASA Astrophysics Data System (ADS)

    Matsuoka, Ryoichi; Mito, Hiroaki; Sugiyama, Akiyuki; Toyoda, Yasutaka

    2009-03-01

    We have developed a highly integrated method of mask and silicon metrology. The method adopts a metrology management system based on DBM (Design Based Metrology), using the highly accurate contouring created by the edge detection algorithms of mask CD-SEM and silicon CD-SEM. We verified high accuracy, stability and reproducibility in the integration experiments; the accuracy is comparable with that of mask and silicon CD-SEM metrology. In this report, we introduce the experimental results and the application. As semiconductor design rules continue to shrink, OPC (Optical Proximity Correction) becomes aggressively dense within RET (Resolution Enhancement Technology). However, from the viewpoint of DFM (Design for Manufacturability), the cost of data processing for advanced MDP (Mask Data Preparation) and mask production is a problem. This trade-off between RET and mask production is a big issue in the semiconductor market, especially in the mask business. Looking at the silicon device production process, information sharing is not completely organized between the design and production sections. Design data created with OPC and MDP should be linked to process control in production, but design data and process control data are optimized independently. Thus, we provide a DFM solution: advanced integration of mask metrology and silicon metrology. The system we propose here is composed of the following. 1) Design-based recipe creation: patterns for metrology are specified on the design data. This step is fully automated since it is interfaced with hot-spot coordinate information detected by various verification methods. 2) Design-based image acquisition: images of mask and silicon are acquired automatically by a recipe based on the pattern design of the CD-SEM. This is a robust automated step because a wide range of design data is used for the image acquisition. 3) Contour profiling and GDS data generation: an image profiling process is applied to the acquired image based on the profiling method of the field-proven CD metrology algorithm. The detected edges are then converted to GDSII format, which is a standard format for design data, and utilized for various DFM systems such as simulation. Namely, by integrating the pattern shapes of mask and silicon formed during the manufacturing process into GDSII format, it becomes possible to bridge highly accurate pattern profile information over to the design field of various EDA systems. These steps are fully integrated with the design data and automated. Bi-directional cross-probing between mask data and process control data is enabled by linking them. This method is therefore a solution for total optimization that covers design, MDP, mask production and silicon device production, and is regarded as a strategic DFM approach in semiconductor metrology.

  9. Automated management of radioactive sources in Saudi Arabia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.

    2014-09-30

    To use radioactive substances, any facility has to register and obtain a license from the relevant authority of the country in which the facility is operating. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and providing licenses to organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the registration and licensing process of the National Center of Radiation Protection. To provide 24×7 access to all NCRP customers, the system is developed as a web-based application through which users can register online, request a license, renew a license, check request status, and view historical data and reports; these and other features are provided as electronic services accessible via the internet. The system was also designed to streamline and optimize the internal operations of NCRP, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to the manual payment option, the system is integrated with SADAD (an online payment system), which avoids the lengthy and cumbersome procedures associated with the manual payment mechanism. Using the SADAD payment option, the license fee can be paid through the internet, an ATM machine, or a branch of any designated bank; payment is instantly notified to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.

  10. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.

  11. Automation in School Library Media Centers.

    ERIC Educational Resources Information Center

    Driver, Russell W.; Driver, Mary Anne

    1982-01-01

    Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)

  12. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    PubMed

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
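
    As a rough stand-in for the Passing-Bablok comparison described above (which uses a shifted median of pairwise slopes), the sketch below computes the simpler Theil-Sen median slope and intercept for two measurement series. The data are invented, and a real Passing-Bablok implementation additionally handles ties and the slope offset needed for its confidence intervals.

```python
import statistics

def theil_sen(x, y):
    """Median pairwise slope and corresponding intercept (Theil-Sen estimator)."""
    slopes = []
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if x[j] != x[i]:
                slopes.append((y[j] - y[i]) / (x[j] - x[i]))
    slope = statistics.median(slopes)
    intercept = statistics.median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

if __name__ == "__main__":
    # Invented paired results, e.g. manual vs. automated centrifugation (same assay, same units).
    manual    = [28.1, 30.5, 32.0, 35.4, 40.2, 45.9, 51.3]
    automated = [27.8, 30.9, 31.6, 35.0, 40.8, 45.1, 52.0]
    slope, intercept = theil_sen(manual, automated)
    print(f"slope={slope:.3f}, intercept={intercept:.3f}")
```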

  13. A Secure, Intelligent, and Smart-Sensing Approach for Industrial System Automation and Transmission over Unsecured Wireless Networks

    PubMed Central

    Shahzad, Aamir; Lee, Malrey; Xiong, Neal Naixue; Jeong, Gisung; Lee, Young-Keun; Choi, Jae-Young; Mahesar, Abdul Wheed; Ahmad, Iftikhar

    2016-01-01

    In industrial Supervisory Control and Data Acquisition (SCADA) systems, the pseudo-transport layer of the Distributed Network Protocol (DNP3) performs the functions of the transport layer and network layer of the Open Systems Interconnection (OSI) model. This study used a simulated water pumping system design in which the network nodes are directly and wirelessly connected with sensors and are monitored by the main controller, as part of the wireless SCADA system. This study also focuses on the security issues inherent in the pseudo-transport layer of the DNP3 protocol. During disassembly and reassembly processes, the pseudo-transport layer keeps track of the byte sequence. However, no mechanism is available that can verify the message or maintain the integrity of the bytes received from or transmitted to the data link layer, or of the send/response traffic between the main controller and the sensors. To keep track of the bytes properly and sequentially, a mechanism is required that can perform verification while bytes are received from or transmitted to the lower layer of the DNP3 protocol, or during the send/response exchange with field sensors. For security and byte verification purposes, a mechanism is proposed for the pseudo-transport layer that employs a cryptographic algorithm. A dynamic choice security buffer (SB) is designed and employed during the security development. To achieve the desired goals of the proposed study, a pseudo-transport layer stack model is designed using the DNP3 protocol open library, and the security is deployed and tested without changing the original design. PMID:26950129
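
    The paper's security buffer design is not reproduced in the abstract; purely as a generic illustration of byte-sequence verification for fragments passed between layers (not the authors' mechanism and not DNP3 Secure Authentication), the Python sketch below prepends a sequence number and appends an HMAC to each fragment and verifies both on receipt. The key and payload are placeholders.

```python
import hashlib
import hmac
import struct

SHARED_KEY = b"illustrative-shared-key"   # placeholder key, not a real deployment value

def protect_fragment(seq, payload, key=SHARED_KEY):
    """Prepend a sequence number and append an HMAC tag over (seq || payload)."""
    header = struct.pack(">I", seq)
    tag = hmac.new(key, header + payload, hashlib.sha256).digest()
    return header + payload + tag

def verify_fragment(blob, expected_seq, key=SHARED_KEY):
    """Return the payload if the sequence number and HMAC check out, else None."""
    header, payload, tag = blob[:4], blob[4:-32], blob[-32:]
    if struct.unpack(">I", header)[0] != expected_seq:
        return None
    good = hmac.compare_digest(tag, hmac.new(key, header + payload, hashlib.sha256).digest())
    return payload if good else None

if __name__ == "__main__":
    frame = protect_fragment(7, b"pump_status=ON;pressure=2.4bar")
    print(verify_fragment(frame, expected_seq=7))        # payload returned
    tampered = frame[:10] + b"X" + frame[11:]
    print(verify_fragment(tampered, expected_seq=7))     # None: integrity failure
```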

  14. A Secure, Intelligent, and Smart-Sensing Approach for Industrial System Automation and Transmission over Unsecured Wireless Networks.

    PubMed

    Shahzad, Aamir; Lee, Malrey; Xiong, Neal Naixue; Jeong, Gisung; Lee, Young-Keun; Choi, Jae-Young; Mahesar, Abdul Wheed; Ahmad, Iftikhar

    2016-03-03

    In industrial Supervisory Control and Data Acquisition (SCADA) systems, the pseudo-transport layer of the Distributed Network Protocol (DNP3) performs the functions of the transport layer and network layer of the Open Systems Interconnection (OSI) model. This study used a simulated water pumping system design in which the network nodes are directly and wirelessly connected with sensors and are monitored by the main controller, as part of the wireless SCADA system. This study also focuses on the security issues inherent in the pseudo-transport layer of the DNP3 protocol. During disassembly and reassembly processes, the pseudo-transport layer keeps track of the byte sequence. However, no mechanism is available that can verify the message or maintain the integrity of the bytes received from or transmitted to the data link layer, or of the send/response traffic between the main controller and the sensors. To keep track of the bytes properly and sequentially, a mechanism is required that can perform verification while bytes are received from or transmitted to the lower layer of the DNP3 protocol, or during the send/response exchange with field sensors. For security and byte verification purposes, a mechanism is proposed for the pseudo-transport layer that employs a cryptographic algorithm. A dynamic choice security buffer (SB) is designed and employed during the security development. To achieve the desired goals of the proposed study, a pseudo-transport layer stack model is designed using the DNP3 protocol open library, and the security is deployed and tested without changing the original design.

  15. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Taylor, W. E.; Schwartz, F. M.

    1975-01-01

    A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate the functions most critical to the question of whether the fabrication process can be automated. The study and conceptual design established the technical feasibility of automating the solar cell manufacturing process to produce low-cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon per year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm per side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
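
    The throughput and cost figures quoted above are internally consistent, which a few lines of arithmetic confirm: total annual cost divided by total output power reproduces the stated cost per watt, and total power divided by cell count gives the implied output per cell. A short check using only the abstract's numbers:

      # Consistency check of the quoted production estimates (values from the abstract).
      cells_per_year = 4_747_000      # hexagonal cells, 38 mm per side
      total_kw = 3_373                # total annual output, kilowatts
      cost_per_cell = 0.866           # dollars

      total_cost = cells_per_year * cost_per_cell         # about $4.11 million
      cost_per_watt = total_cost / (total_kw * 1_000)     # about $1.22, matching the abstract
      watts_per_cell = total_kw * 1_000 / cells_per_year  # about 0.71 W per cell (implied)

      print(f"cost per watt: ${cost_per_watt:.2f}, output per cell: {watts_per_cell:.2f} W")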

  16. Automated Space Processing Payloads Study. Volume 1: Executive Summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An investigation is described that examined the extent to which experiment hardware and operational requirements can be met by automatic control and material-handling devices; payload and system concepts that make extensive use of automation technology are defined. Topics covered include experiment requirements and hardware data, capabilities and characteristics of industrial automation equipment and controls, payload grouping, automated payload conceptual design, space processing payload preliminary design, automated space processing payloads for early shuttle missions, and cost and scheduling.

  17. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  18. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.
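
    The abstract frames error management as a sequence of detection, understanding, and correction, shaped by automation, person, task, and emergent variables. A minimal sketch of that loop follows; the class and field names are hypothetical and stand in for the authors' framework rather than reproducing it.

      # Minimal sketch of the detect -> understand -> correct loop described in the
      # abstract. Names and fields are illustrative assumptions, not the authors' model.
      from dataclasses import dataclass, field
      from enum import Enum, auto

      class Stage(Enum):
          DETECT = auto()
          UNDERSTAND = auto()
          CORRECT = auto()

      @dataclass
      class AutomationError:
          description: str
          automation_vars: dict = field(default_factory=dict)  # e.g. reliability, feedback
          person_vars: dict = field(default_factory=dict)      # e.g. trust, workload
          task_vars: dict = field(default_factory=dict)        # e.g. time pressure
          stages_completed: list = field(default_factory=list)

      def manage(error: AutomationError) -> AutomationError:
          """Walk an automation error through the three stages in order."""
          for stage in (Stage.DETECT, Stage.UNDERSTAND, Stage.CORRECT):
              # A fuller model would consult the variable categories at each stage;
              # here we only record that the stage was reached.
              error.stages_completed.append(stage)
          return error

      handled = manage(AutomationError("route misplanned by automation",
                                       automation_vars={"reliability": 0.9}))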

  19. 49 CFR 40.327 - When must the MRO report medical information gathered in the verification process?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 CFR Vol. 1, Transportation (2011-10-01): When must the MRO report medical information gathered in the verification process? 40.327 Section 40.327, Transportation, Office of the Secretary of Transportation, PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS, Confidentiality and Release of Information § 40.327 When must...

  20. The use of robots for arms control treaty verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalowski, S.J.

    1991-01-01

    Many aspects of the superpower relationship now present a new set of challenges and opportunities, including the vital area of arms control. This report addresses one such possibility: the use of robots for the verification of arms control treaties. The central idea of this report is far from commonly accepted; in fact, it was encountered only once in the bibliographic review phase of the project. Nonetheless, the incentive for using robots is simple and coincides with that of industrial applications: to replace or supplement human activity in the performance of tasks for which human participation is unnecessary, undesirable, impossible, too dangerous, or too expensive. As in industry, robots should replace workers (in this case, arms control inspectors) only when questions of efficiency, reliability, safety, security, and cost-effectiveness have been answered satisfactorily. In writing this report, it is not our purpose to strongly advocate the application of robots in verification. Rather, we wish to explore the significant aspects, pro and con, of applying experience from the field of flexible automation to the complex task of assuring arms control treaty compliance. We want to establish a framework for further discussion of this topic and to define criteria for evaluating future proposals. The author's expertise is in robots, not arms control. His practical experience has been in developing systems for use in the rehabilitation of severely disabled persons (such as quadriplegics), who can use robots for assistance during activities of everyday living as well as in vocational applications. This creates a special interest in implementations that, in some way, include a human operator in the control scheme of the robot. As we hope to show in this report, such interactive systems offer the greatest promise of making a contribution to the challenging problems of treaty verification. 15 refs.
