Sample records for timed model checking

  1. The Priority Inversion Problem and Real-Time Symbolic Model Checking

    DTIC Science & Technology

    1993-04-23

    ... real-time systems unpredictable in subtle ways. This makes it more difficult to implement and debug such systems. Our work discusses this problem and presents one possible solution. The solution is formalized and verified using temporal logic model checking techniques. To perform the verification, the BDD-based symbolic model checking algorithm given in previous work was extended to handle real-time properties using the bounded until operator. We believe that this algorithm, which is based on discrete time, is able to handle many real-time properties ...
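
    Since this record leans on the bounded until operator, a small sketch may help make it concrete: over discrete time, E[phi U<=k psi] is computable by k rounds of a backward fixpoint iteration. Plain Python sets stand in here for the paper's BDDs, and the chain model is invented for illustration.

        def eb_until(states, post, phi, psi, k):
            """States satisfying E[phi U<=k psi]: a path on which phi holds
            until psi, with psi reached within k discrete steps."""
            sat = {s for s in states if psi(s)}        # 0-step witnesses
            for _ in range(k):                         # one round per time unit
                pre = {s for s in states if phi(s) and post[s] & sat}
                if pre <= sat:                         # early fixpoint
                    break
                sat |= pre
            return sat

        # 4-state chain 0 -> 1 -> 2 -> 3: state 3 is reachable from 0 in three
        # steps, so the bounded-until property holds at 0 for k=3 but not k=2.
        post = {0: {1}, 1: {2}, 2: {3}, 3: set()}
        states = list(post)
        print(0 in eb_until(states, post, lambda s: s < 3, lambda s: s == 3, 2))  # False
        print(0 in eb_until(states, post, lambda s: s < 3, lambda s: s == 3, 3))  # True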

  2. Compositional schedulability analysis of real-time actor-based systems.

    PubMed

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
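
    The schedulability question posed here (does every message meet its deadline under the chosen per-actor policy?) can be illustrated with a single-actor, integer-time EDF simulation. This is only an intuitive stand-in for the paper's automata-theoretic analysis; the job sets are invented.

        import heapq

        def edf_schedulable(jobs):
            """jobs: list of (release, wcet, absolute_deadline) in integer time.
            Returns True iff EDF meets every deadline on one processor."""
            jobs = sorted(jobs)                        # by release time
            ready, t, i = [], 0, 0
            while i < len(jobs) or ready:
                if not ready:                          # idle until next release
                    t = max(t, jobs[i][0])
                while i < len(jobs) and jobs[i][0] <= t:
                    _, c, d = jobs[i]; heapq.heappush(ready, (d, c)); i += 1
                d, c = heapq.heappop(ready)            # earliest deadline first
                run = c if i == len(jobs) else min(c, max(jobs[i][0] - t, 1))
                t += run
                if run < c:                            # preempted: requeue rest
                    heapq.heappush(ready, (d, c - run))
                elif t > d:
                    return False                       # deadline miss
            return True

        print(edf_schedulable([(0, 2, 4), (0, 1, 2), (1, 1, 5)]))  # True
        print(edf_schedulable([(0, 3, 3), (0, 2, 4)]))             # False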

  3. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.
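
    The core loop the guidebook describes, exhaustive exploration of thread interleavings, fits in a few lines of explicit-state search. The representation below (threads as lists of atomic functions on a hashable state) is invented for the example; it finds the classic lost-update bug.

        from collections import deque

        def explore(threads, init, invariant):
            """BFS over every interleaving of atomic thread steps.
            threads: per-thread lists of functions state -> state (hashable);
            invariant: predicate over (state, program counters)."""
            start = (init, tuple(0 for _ in threads))
            frontier, seen = deque([(start, [])]), {start}
            while frontier:
                (state, pcs), trace = frontier.popleft()
                if not invariant(state, pcs):
                    return trace                          # counterexample
                for tid, pc in enumerate(pcs):
                    if pc < len(threads[tid]):            # thread tid can step
                        nxt = (threads[tid][pc](state),
                               pcs[:tid] + (pc + 1,) + pcs[tid + 1:])
                        if nxt not in seen:
                            seen.add(nxt)
                            frontier.append((nxt, trace + [tid]))
            return None

        # Two threads each do read-increment-write on x: state = (x, (r0, r1)).
        def mk_thread(t):
            def read(s):
                x, r = s
                return (x, r[:t] + (x,) + r[t + 1:])
            def write(s):
                x, r = s
                return (r[t] + 1, r)
            return [read, write]

        # "x == 2 once both threads finish" fails on the interleaving where
        # both threads read before either writes.
        done = lambda pcs: all(pc == 2 for pc in pcs)
        trace = explore([mk_thread(0), mk_thread(1)], (0, (0, 0)),
                        lambda s, pcs: not done(pcs) or s[0] == 2)
        print(trace)   # e.g. [0, 1, 0, 1]: the lost-update schedule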

  4. 75 FR 27406 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-17

    ... BD- 100 Time Limits/Maintenance Checks. The actions described in this service information are... Challenger 300 BD-100 Time Limits/Maintenance Checks. (1) For the new tasks identified in Bombardier TR 5-2... Requirements,'' in Part 2 of Chapter 5 of Bombardier Challenger 300 BD-100 Time Limits/ Maintenance Checks...

  5. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking

    PubMed Central

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function, it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation, we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case-study-specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178

  6. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    PubMed

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However, for gaining a systems-level understanding of how biological organisms function, it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation, we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case-study-specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.

  7. 76 FR 53348 - Airworthiness Directives; BAE SYSTEMS (Operations) Limited Model BAe 146 Airplanes and Model Avro...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...

  8. 77 FR 20520 - Airworthiness Directives; Bombardier, Inc. Model BD-100-1A10 (Challenger 300) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-05

    ... Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual. For this task, the initial compliance..., of Part 2, of the Bombardier Challenger 300 BD-100 Time Limits/Maintenance Checks Manual, the general.../Maintenance Checks Manual, provided that the relevant information in the general revision is identical to that...

  9. Finding Feasible Abstract Counter-Examples

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Dwyer, Matthew B.; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    A strength of model checking is its ability to automate the detection of subtle system errors and produce traces that exhibit those errors. Given the high computational cost of model checking, most researchers advocate the use of aggressive property-preserving abstractions. Unfortunately, the more aggressively a system is abstracted, the more infeasible behavior it will have. Thus, while abstraction enables efficient model checking, it also threatens the usefulness of model checking as a defect detection tool, since it may be difficult to determine whether a counter-example is feasible and hence worth developer time to analyze. We have explored several strategies for addressing this problem by extending an explicit-state model checker, Java PathFinder (JPF), to search for and analyze counter-examples in the presence of abstractions. We demonstrate that these techniques effectively preserve the defect detection ability of model checking in the presence of aggressive abstraction by applying them to check properties of several abstracted multi-threaded Java programs. These new capabilities are not specific to JPF and can be easily adapted to other model checking frameworks; we describe how this was done for the Bandera toolset.

  10. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
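
    The reduction underlying this satisfiability-checking approach is worth spelling out. Writing M_univ for the universal model over the atoms of a formula phi (states are the subsets of atoms, every transition is present, each state is labeled by itself), every infinite word over those labels is a trace of M_univ, hence

        \[
        \varphi \text{ is satisfiable} \iff M_{\mathrm{univ}} \not\models \neg\varphi .
        \]

    A counterexample trace returned by the model checker for \(\neg\varphi\) is then exactly a witness (a model) of \(\varphi\).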

  11. Toward improved design of check dam systems: A case study in the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua

    2018-04-01

    Check dams are one of the most common strategies for controlling sediment transport in erosion prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components, namely (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to Shejiagou catchment, a 4.26 km² area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive some insights on the effect of key decision variables, such as the number, size, and site location of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.

  12. Analyzing Phylogenetic Trees with Timed and Probabilistic Model Checking: The Lactose Persistence Case Study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-12-01

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
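
    The quantitative layer added here ultimately rests on transient (time-bounded) probabilities of a continuous-time Markov chain, the quantity behind a query such as "probability of reaching the phenotype within time t". Below is a minimal uniformization sketch; the two-state chain and its rate are invented stand-ins (think of a lineage acquiring lactase persistence at rate 0.1 per unit time).

        import numpy as np

        def transient(Q, p0, t, kmax=200):
            """p(t) = p0 exp(Qt) for CTMC generator Q, by uniformization
            (truncated at kmax Poisson terms)."""
            Q = np.asarray(Q, dtype=float)
            q = max((-np.diag(Q)).max(), 1e-12)     # uniformization rate
            P = np.eye(len(Q)) + Q / q              # embedded DTMC
            v = np.asarray(p0, dtype=float)
            w = np.exp(-q * t)                      # Poisson(qt) weight, k = 0
            out = w * v
            for k in range(1, kmax):
                v = v @ P
                w *= q * t / k
                out = out + w * v
            return out

        # Absorbing 2-state chain, rate 0.1: P(in state 1 by t=10) = 1 - e^-1.
        Q = [[-0.1, 0.1],
             [ 0.0, 0.0]]
        print(transient(Q, [1.0, 0.0], 10.0))       # approx [0.368, 0.632]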

  13. Analyzing phylogenetic trees with timed and probabilistic model checking: the lactose persistence case study.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2014-10-23

    Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.

  14. Posterior Predictive Checks for Conditional Independence between Response Time and Accuracy

    ERIC Educational Resources Information Center

    Bolsinova, Maria; Tijmstra, Jesper

    2016-01-01

    Conditional independence (CI) between response time and response accuracy is a fundamental assumption of many joint models for time and accuracy used in educational measurement. In this study, posterior predictive checks (PPCs) are proposed for testing this assumption. These PPCs are based on three discrepancy measures reflecting different…
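
    The generic shape of a posterior predictive check is compact enough to sketch: draw parameters from the posterior, simulate replicated data, and compare a discrepancy measure on replicated versus observed data. The normal-mean model below is an invented stand-in for the joint response time and accuracy models the study targets.

        import numpy as np

        rng = np.random.default_rng(0)
        y = rng.normal(1.0, 1.0, size=100)                 # "observed" data

        # Posterior for the mean under a flat prior, known sd=1: N(ybar, 1/n)
        draws = rng.normal(y.mean(), 1 / np.sqrt(len(y)), size=2000)

        def discrepancy(data):                             # e.g. a tail statistic
            return np.mean(data > 2.0)

        p_rep = [discrepancy(rng.normal(mu, 1.0, size=len(y))) for mu in draws]
        ppp = np.mean(np.asarray(p_rep) >= discrepancy(y)) # posterior pred. p-value
        print(round(ppp, 2))   # values near 0 or 1 flag misfit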

  15. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  16. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.
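
    The time-series side of the equivalence invoked above is a log-linear (Poisson) regression of daily counts on exposure plus time-stratum indicators, after which routine residual diagnostics supply the model-checking step. The sketch assumes statsmodels is available; the data and coefficients are simulated, not taken from the Chicago study.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        days = 730
        stratum = np.arange(days) // 28                     # month-like strata
        pm = rng.gamma(4.0, 5.0, size=days)                 # exposure series
        season = 0.1 * np.sin(2 * np.pi * np.arange(days) / 365)
        y = rng.poisson(np.exp(0.5 + 0.004 * pm + season))  # daily counts

        onehot = (stratum[:, None] == np.unique(stratum)[None, :]).astype(float)
        X = sm.add_constant(np.column_stack([pm, onehot[:, 1:]]))
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

        print(fit.params[1])                                # log-rate per unit exposure
        r = fit.resid_deviance                              # model-checking step:
        print(np.corrcoef(r[:-1], r[1:])[0, 1])             # leftover serial structure?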

  17. A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Tamai, Tetsuo

    2009-01-01

    Since the complexity of software systems continues to grow, most engineers face two serious problems: the state space explosion problem and the problem of how to debug systems. In this paper, we propose a game-theoretic approach to full branching time model checking on three-valued semantics. The three-valued models and logics provide successful abstraction that overcomes the state space explosion problem. The game-style model checking that generates counter-examples can guide refinement or identify validated formulas, which solves the system debugging problem. Furthermore, the output of our game-style method gives significant information to engineers in detecting where errors have occurred and what the causes of the errors are.

  18. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the ... deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems

  19. Model Checking the Remote Agent Planner

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This work tackles the problem of using Model Checking for the purpose of verifying the HSTS (Scheduling Testbed System) planning system. HSTS is the planner and scheduler of the remote agent autonomous control system deployed in Deep Space One (DS1). Model Checking allows for the verification of domain models as well as planning entries. We have chosen the real-time model checker UPPAAL for this work. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a sketch for the mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify.

  20. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  1. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  2. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  3. 75 FR 43801 - Airworthiness Directives; Eurocopter France (ECF) Model EC225LP Helicopters

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... time. Also, we use inspect rather than check when referring to an action required by a mechanic as... the various levels of government. Therefore, I certify this AD: 1. Is not a ``significant regulatory... compliance time. Also, we use inspect rather than check when referring to an action required by a mechanic as...

  4. Sediment trapping efficiency of adjustable check dam in laboratory and field experiment

    NASA Astrophysics Data System (ADS)

    Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui

    2014-05-01

    Check dams are constructed in mountain areas to block debris flows, but they fill up after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which offers the advantages of fast construction and easy removal or adjustment: transverse beams can be removed to drain sediment off and maintain channel continuity. We constructed an adjustable steel slit check dam on the Landow torrent, Huisun Experiment Forest station, as the prototype to compare with a model in the laboratory. In the laboratory experiments, Froude number similarity was used to design the dam model. The main comparisons focused on the types of sediment trapping and removal, sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different kinds of sediment removal and differences in removal rate and particle size distribution. The sediment discharge of the check dam with beams is about 40%~80% of that of the check dam without beams. Furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, we used time-lapse photography to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik brought rainfall of 600 mm in eight hours and induced a debris flow in the Landow torrent. The time-lapse images demonstrated that after several sediment transport events the adjustable steel slit check dam was buried by debris flow. The laboratory and field experiments show that: (1) the adjustable check dam could trap boulders, stop woody debris flows, and flush fine sediment out to supply the needs of the downstream river; (2) the sediment trapping efficiency of the adjustable check dam with transverse beams was significantly improved; and (3) the check dam without transverse beams can release the sediment and maintain ecosystem continuity.

  5. 76 FR 477 - Airworthiness Directives; Bombardier, Inc. Model CL-600-2A12 (CL-601) and CL-600-2B16 (CL-601-3A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... to these aircraft if Bombardier Service Bulletin (SB) 601-0590 [Scheduled Maintenance Instructions... information: Challenger 601 Time Limits/Maintenance Checks, PSP 601-5, Revision 38, dated June 19, 2009. Challenger 601 Time Limits/Maintenance Checks, PSP 601A-5, Revision 34, dated June 19, 2009. Challenger 604...

  6. Bounded Parametric Model Checking for Elementary Net Systems

    NASA Astrophysics Data System (ADS)

    Knapik, Michał; Szreter, Maciej; Penczek, Wojciech

    Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to the verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL, a parametric extension of the existential version of CTL. To this aim we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
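
    For orientation, the non-parametric core of such a SAT translation is the standard BMC unrolling: for a reachability property EF bad over a model with initial predicate I and transition relation T, the bound-k formula is

        \[
        [\![ M, \mathsf{EF}\, \mathit{bad} ]\!]_k \;=\; I(s_0) \;\wedge\; \bigwedge_{i=0}^{k-1} T(s_i, s_{i+1}) \;\wedge\; \bigvee_{i=0}^{k} \mathit{bad}(s_i),
        \]

    which is satisfiable iff a witness of length at most k exists. Roughly speaking, the parametric extension keeps numeric bounds inside the formula as unknowns to be solved for, rather than as fixed constants.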

  7. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.

  8. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation

    PubMed Central

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The model checking results agree well with the simulation results, which fully supports the proposed framework. PMID:26713449
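
    A sample path of the SEIR chain analysed above can be generated with Gillespie's direct method over the underlying continuous-time Markov chain; the SEIDQR(S/I) variant adds delayed and quarantined compartments on top of this skeleton. The rates and population size below are invented for illustration.

        import numpy as np

        def seir_ssa(N=1000, beta=0.4, sigma=0.2, gamma=0.1, t_end=200, seed=0):
            """One stochastic SEIR trajectory (Gillespie direct method)."""
            rng = np.random.default_rng(seed)
            S, E, I, R, t = N - 1, 1, 0, 0, 0.0
            while t < t_end and (E or I):
                rates = [beta * S * I / N, sigma * E, gamma * I]
                total = sum(rates)
                if total == 0:
                    break
                t += rng.exponential(1 / total)       # time to next event
                r = rng.uniform(0, total)             # pick the event
                if r < rates[0]:
                    S, E = S - 1, E + 1               # infection
                elif r < rates[0] + rates[1]:
                    E, I = E - 1, I + 1               # incubation ends
                else:
                    I, R = I - 1, R + 1               # recovery
            return S, E, I, R

        print(seir_ssa())   # final compartment sizes of one sample path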

  9. Petri Net and Probabilistic Model Checking Based Approach for the Modelling, Simulation and Verification of Internet Worm Propagation.

    PubMed

    Razzaq, Misbah; Ahmad, Jamil

    2015-01-01

    Internet worms are analogous to biological viruses since they can infect a host and have the ability to propagate through a chosen medium. To prevent the spread of a worm, or to grasp how to regulate a prevailing worm, compartmental models are commonly used as a means to examine and understand the patterns and mechanisms of a worm spread. However, one of the greatest challenges is to produce methods to verify and validate the behavioural properties of a compartmental model. This is why in this study we suggest a framework based on Petri Nets and Model Checking through which we can meticulously examine and validate these models. We investigate the Susceptible-Exposed-Infectious-Recovered (SEIR) model and propose a new model, Susceptible-Exposed-Infectious-Recovered-Delayed-Quarantined (Susceptible/Recovered) (SEIDQR(S/I)), along with a hybrid quarantine strategy, which is then constructed and analysed using Stochastic Petri Nets and Continuous Time Markov Chains. The analysis shows that the hybrid quarantine strategy is extremely effective in reducing the risk of propagating the worm. Through Model Checking, we gained insight into the functionality of compartmental models. The model checking results agree well with the simulation results, which fully supports the proposed framework.

  10. Temporal Precedence Checking for Switched Models and its Application to a Parallel Landing Protocol

    NASA Technical Reports Server (NTRS)

    Duggirala, Parasara Sridhar; Wang, Le; Mitra, Sayan; Viswanathan, Mahesh; Munoz, Cesar A.

    2014-01-01

    This paper presents an algorithm for checking temporal precedence properties of nonlinear switched systems. This class of properties subsumes bounded safety and captures requirements about visiting a sequence of predicates within given time intervals. The algorithm handles nonlinear predicates that arise from dynamics-based predictions used in alerting protocols for state-of-the-art transportation systems. It is sound and complete for nonlinear switched systems that robustly satisfy the given property. The algorithm is implemented in the Compare Execute Check Engine (C2E2) using validated simulations. As a case study, a simplified model of an alerting system for closely spaced parallel runways is considered. The proposed approach is applied to this model to check safety properties of the alerting logic for different operating conditions such as initial velocities, bank angles, aircraft longitudinal separation, and runway separation.
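
    On a single trajectory, the property class described here reduces to checking that one predicate is first satisfied inside its time window before another is. The sampled-trace version below is a simplification for illustration only; C2E2 itself reasons about validated tubes of trajectories, and the braking example is invented.

        def first_hit(trace, pred, lo, hi):
            """trace: list of (time, state); earliest sample time in [lo, hi]
            where pred holds, else None."""
            for t, x in trace:
                if lo <= t <= hi and pred(x):
                    return t
            return None

        def precedes(trace, p1, win1, p2, win2):
            """p1 must be hit in win1 strictly before p2 is hit in win2."""
            t1 = first_hit(trace, p1, *win1)
            t2 = first_hit(trace, p2, *win2)
            return t1 is not None and t2 is not None and t1 < t2

        # Braking must begin (speed dropping below 20) in [0, 5] before the
        # vehicle stops in [0, 10]; trace holds (time, speed) samples.
        trace = [(t / 10, max(0.0, 20 - 3 * (t / 10))) for t in range(0, 101)]
        print(precedes(trace, lambda v: v < 20, (0, 5),
                       lambda v: v == 0, (0, 10)))   # True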

  11. Using State Merging and State Pruning to Address the Path Explosion Problem Faced by Symbolic Execution

    DTIC Science & Technology

    2014-06-19

    urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo . The program could potentially call the function foo a bound number of times, resulting in n

  12. Temporal Specification and Verification of Real-Time Systems.

    DTIC Science & Technology

    1991-08-30

    of concrete real - time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real - time systems and proof methods for the deductive verification of real - time systems .

  13. Model Checking - My 27-Year Quest to Overcome the State Explosion Problem

    NASA Technical Reports Server (NTRS)

    Clarke, Ed

    2009-01-01

    Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.

  14. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.

  15. Model Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library. We provide efficient algorithms for manipulating EVMDDs and review the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools. Compared to the CUDD package, our tool is several orders of magnitude faster.

  16. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

    This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on equipping the user with the information they need in order to evaluate a sequence quickly and efficiently, while still keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content prior to investing any more effort into the build. There are dozens of various items in the modeling run that need to be checked, which is a time-consuming and error-prone task. Currently, no software exists that provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.

  17. Model-Checking with Edge-Valued Decision Diagrams

    NASA Technical Reports Server (NTRS)

    Roux, Pierre; Siminiceanu, Radu I.

    2010-01-01

    We describe an algebra of Edge-Valued Decision Diagrams (EVMDDs) to encode arithmetic functions and its implementation in a model checking library along with state-of-the-art algorithms for building the transition relation and the state space of discrete state systems. We provide efficient algorithms for manipulating EVMDDs and give upper bounds of the theoretical time complexity of these algorithms for all basic arithmetic and relational operators. We also demonstrate that the time complexity of the generic recursive algorithm for applying a binary operator on EVMDDs is no worse than that of Multi-Terminal Decision Diagrams. We have implemented a new symbolic model checker with the intention to represent in one formalism the best techniques available at the moment across a spectrum of existing tools: EVMDDs for encoding arithmetic expressions, identity-reduced MDDs for representing the transition relation, and the saturation algorithm for reachability analysis. We compare our new symbolic model checking EVMDD library with the widely used CUDD package and show that, in many cases, our tool is several orders of magnitude faster than CUDD.
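
    The edge-valued idea itself can be shown in miniature: an arithmetic function over boolean variables is stored once as a shared graph whose edges carry additive offsets, so a single root-to-terminal walk evaluates the function. Real EVMDDs add canonicity rules and multiway variables; the stripped-down class below is our own, for intuition only.

        class Node:
            def __init__(self, var, low, high, w_low=0, w_high=0):
                self.var, self.low, self.high = var, low, high
                self.w_low, self.w_high = w_low, w_high   # additive edge values

        TERM = Node(None, None, None)                     # shared terminal

        def evaluate(node, offset, env):
            """Sum the edge values along the path selected by env."""
            while node is not TERM:
                if env[node.var]:
                    offset, node = offset + node.w_high, node.high
                else:
                    offset, node = offset + node.w_low, node.low
            return offset

        # f(x0, x1) = 4*x0 + x1, encoded with edge values; n1 is shared by
        # both children of n0, which is where the compactness comes from.
        n1 = Node("x1", TERM, TERM, w_high=1)
        n0 = Node("x0", n1, n1, w_high=4)
        for a in (0, 1):
            for b in (0, 1):
                print(a, b, evaluate(n0, 0, {"x0": a, "x1": b}))  # 0 1 4 5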

  18. A Model-Driven Approach for Telecommunications Network Services Definition

    NASA Astrophysics Data System (ADS)

    Chiprianov, Vanea; Kermarrec, Yvon; Alff, Patrick D.

    The present-day telecommunications market imposes a short concept-to-market time on service providers. To reduce it, we propose a computer-aided, model-driven, service-specific tool, with support for collaborative work and for checking properties on models. We started by defining a prototype of the Meta-model (MM) of the service domain. Using this prototype, we defined a simple graphical modeling language specific to service designers. We are currently enlarging the MM of the domain using model transformations from Network Abstraction Layers (NALs). In the future, we will investigate approaches to ensure support for collaborative work and for checking properties on models.

  19. Model analysis of check dam impacts on long-term sediment and water budgets in southeast Arizona, USA

    USGS Publications Warehouse

    Norman, Laura M.; Niraula, Rewati

    2016-01-01

    The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired watershed study includes a watershed treated with over 2000 check dams and a Control watershed which has none, in the West Turkey Creek watershed, Southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results highlight the need to eliminate lateral flow from SWAT models of aridland environments, the urgency of standardizing geospatial soils data, and the care with which modelers must document altered parameters when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams at the Treated watershed. Approximately 630 tons of sediment is estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, improving water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool to consider long-term impacts of restoration and potential for future restoration.
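
    For reference, the PBIAS statistic cited above is conventionally defined as the percent deviation of simulated from observed totals; a minimal sketch, assuming the common sign convention in which positive values indicate model underestimation:

        def pbias(obs, sim):
            """Percent bias: 100 * sum(obs - sim) / sum(obs)."""
            return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

        print(pbias([10, 20, 30], [11, 19, 31]))  # -1.67: slight overestimation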

  20. Evaluation of properties over phylogenetic trees using stochastic logics.

    PubMed

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logic that imposes restrictions on a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has only considered qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging and maintenance of a software tool. A set of benchmarks justifies the feasibility of our approach.

  1. An experimental manipulation of responsibility in children: a test of the inflated responsibility model of obsessive-compulsive disorder.

    PubMed

    Reeves, J; Reynolds, S; Coker, S; Wilson, C

    2010-09-01

    The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applies to children. In an experimental design, 81 children aged 9-12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or on inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task the time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checks; as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children.

  2. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  3. Verification and Planning Based on Coinductive Logic Programming

    NASA Technical Reports Server (NTRS)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and Co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and Co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution as well as preliminary implementations of the planning and verification applications have been developed [4]. Co-LP and Model Checking: The vast majority of properties that are to be verified can be classified into safety properties and liveness properties. It is well known within model checking that safety properties can be verified by reachability analysis, i.e., if a counter-example to the property exists, it can be finitely determined by enumerating all the reachable states of the Kripke structure.
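
    The co-SLD rule quoted above (a goal succeeds if it unifies with an ancestor call) is easy to demonstrate propositionally, where unification degenerates to equality. The program encoding below is invented for the example; ordinary SLD resolution would loop forever on it.

        def co_solve(goal, program, ancestors=()):
            """program: dict head -> list of alternative bodies (goal lists)."""
            if goal in ancestors:            # coinductive hypothesis rule
                return True
            for body in program.get(goal, []):
                if all(co_solve(g, program, ancestors + (goal,)) for g in body):
                    return True
            return False

        # 'stream' is defined in terms of itself; co-SLD succeeds via the
        # ancestor check (the greatest fixed point reading of the program).
        program = {"stream": [["bit", "stream"]], "bit": [[]]}
        print(co_solve("stream", program))   # True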

  4. Symbolic discrete event system specification

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Chi, Sungdo

    1992-01-01

    Extending discrete event modeling formalisms to facilitate greater symbol manipulation capabilities is important to further their use in intelligent control and design of high autonomy systems. An extension to the DEVS formalism that facilitates symbolic expression of event times by extending the time base from the real numbers to the field of linear polynomials over the reals is defined. A simulation algorithm is developed to generate the branching trajectories resulting from the underlying nondeterminism. To efficiently manage symbolic constraints, a consistency checking algorithm for linear polynomial constraints based on feasibility checking algorithms borrowed from linear programming has been developed. The extended formalism offers a convenient means to conduct multiple, simultaneous explorations of model behaviors. Examples of application are given with concentration on fault model analysis.
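
    The consistency check described here reduces to linear-programming feasibility: do the accumulated linear constraints over the symbolic time parameters admit a solution? A zero objective turns an off-the-shelf LP solver into a pure feasibility test; the sketch below assumes scipy is available and uses an invented constraint set.

        import numpy as np
        from scipy.optimize import linprog

        # Constraints over parameters (a, b):  a + b <= 10,  a - b <= 2,  a, b >= 0
        A_ub = np.array([[1.0, 1.0], [1.0, -1.0]])
        b_ub = np.array([10.0, 2.0])
        res = linprog(c=[0.0, 0.0], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
        print(res.status == 0)   # True: the symbolic event times are consistent

        # A contradictory constraint a + b <= -1 makes the check fail, so the
        # corresponding branch of the trajectory tree can be pruned:
        res2 = linprog([0, 0], A_ub=np.vstack([A_ub, [1, 1]]),
                       b_ub=np.append(b_ub, -1.0), bounds=[(0, None)] * 2)
        print(res2.status == 0)  # False: infeasible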

  5. Discordance between 'actual' and 'scheduled' check-in times at a heart failure clinic

    PubMed Central

    Joyce, Emer; Gandesbery, Benjamin T.; Blackstone, Eugene H.; Taylor, David O.; Tang, W. H. Wilson; Starling, Randall C.; Hachamovitch, Rory

    2017-01-01

    Introduction A 2015 Institute of Medicine statement, “Transforming Health Care Scheduling and Access: Getting to Now”, has increased concerns regarding patient wait times. Although waiting times have been widely studied, little attention has been paid to the role of patient arrival times as a component of this phenomenon. To this end, we investigated patterns of patient arrival at scheduled ambulatory heart failure (HF) clinic appointments and studied its predictors. We hypothesized that patients are more likely to arrive later than scheduled, with progressively later arrivals later in the day. Methods and results Using a business intelligence database we identified 6,194 unique patients that visited the Cleveland Clinic Main Campus HF clinic between January, 2015 and January, 2017. This clinic served both as a tertiary referral center and a community HF clinic. Transplant and left ventricular assist device (LVAD) visits were excluded. Punctuality was defined as the difference between ‘actual’ and ‘scheduled’ check-in times, whereby negative values (i.e., early punctuality) were patients who checked in early. Contrary to our hypothesis, we found that patients checked in late only a minority of the time (38% of visits). Additionally, examining punctuality by appointment hour slot we found that patients scheduled after 8 AM had progressively earlier check-in times as the day progressed (P < .001 for trend). In both a Random Forest-Regression framework and linear regression models the most important risk-adjusted predictors of early punctuality were: later-in-the-day appointment hour slot, patient having previously been to the hospital, age in the early 70s, and white race. Conclusions Patients attending a mixed population ambulatory HF clinic check in earlier than scheduled times, with progressively discrepant intervals throughout the day. This finding may have significant implications for provider utilization and resource planning in order to maximize clinic efficiency. The impact of elective early arrival on patient’s perceived wait times requires further study. PMID:29136649

  6. Discordance between 'actual' and 'scheduled' check-in times at a heart failure clinic.

    PubMed

    Gorodeski, Eiran Z; Joyce, Emer; Gandesbery, Benjamin T; Blackstone, Eugene H; Taylor, David O; Tang, W H Wilson; Starling, Randall C; Hachamovitch, Rory

    2017-01-01

    A 2015 Institute of Medicine statement, "Transforming Health Care Scheduling and Access: Getting to Now", has increased concerns regarding patient wait times. Although waiting times have been widely studied, little attention has been paid to the role of patient arrival times as a component of this phenomenon. To this end, we investigated patterns of patient arrival at scheduled ambulatory heart failure (HF) clinic appointments and studied their predictors. We hypothesized that patients are more likely to arrive later than scheduled, with progressively later arrivals as the day goes on. Using a business intelligence database we identified 6,194 unique patients who visited the Cleveland Clinic Main Campus HF clinic between January 2015 and January 2017. This clinic served both as a tertiary referral center and a community HF clinic. Transplant and left ventricular assist device (LVAD) visits were excluded. Punctuality was defined as the difference between 'actual' and 'scheduled' check-in times, whereby negative values (i.e., early punctuality) indicate patients who checked in early. Contrary to our hypothesis, we found that patients checked in late only a minority of the time (38% of visits). Additionally, examining punctuality by appointment hour slot, we found that patients scheduled after 8 AM had progressively earlier check-in times as the day progressed (P < .001 for trend). In both a Random Forest regression framework and linear regression models, the most important risk-adjusted predictors of early punctuality were: later appointment hour slot, patient having previously been to the hospital, age in the early 70s, and white race. Patients attending a mixed-population ambulatory HF clinic check in earlier than scheduled, with progressively discrepant intervals throughout the day. This finding may have significant implications for provider utilization and resource planning to maximize clinic efficiency. The impact of elective early arrival on patients' perceived wait times requires further study.

  7. Modelling spoilage of fresh turbot and evaluation of a time-temperature integrator (TTI) label under fluctuating temperature.

    PubMed

    Nuin, Maider; Alfaro, Begoña; Cruz, Ziortza; Argarate, Nerea; George, Susie; Le Marc, Yvan; Olley, June; Pin, Carmen

    2008-10-31

    Kinetic models were developed to predict the microbial spoilage and the sensory quality of fresh fish and to evaluate the efficiency of a commercial time-temperature integrator (TTI) label, Fresh Check®, to monitor shelf life. Farmed turbot (Psetta maxima) samples were packaged in PVC film and stored at 0, 5, 10 and 15 °C. Microbial growth and sensory attributes were monitored at regular time intervals. The response of the Fresh Check device was measured at the same temperatures during the storage period. The sensory perception was quantified according to a global sensory indicator obtained by principal component analysis as well as to the Quality Index Method, QIM, as described by Rahman and Olley [Rahman, H.A., Olley, J., 1984. Assessment of sensory techniques for quality assessment of Australian fish. CSIRO Tasmanian Regional Laboratory. Occasional paper n. 8. Available from the Australian Maritime College library. Newnham. Tasmania]. Both methods were found equally valid to monitor the loss of sensory quality. The maximum specific growth rate of spoilage bacteria, the rate of change of the sensory indicators and the rate of change of the colour measurements of the TTI label were modelled as a function of temperature. The temperature had a similar effect on the bacteria, sensory and Fresh Check kinetics. At the time of sensory rejection, the bacterial load was ca. 10^5-10^6 cfu/g. The end of shelf life indicated by the Fresh Check label was close to the sensory rejection time. The performance of the models was validated under fluctuating temperature conditions by comparing the predicted and measured values for all microbial, sensory and TTI responses. The models have been implemented in a Visual Basic add-in for Excel called "Fish Shelf Life Prediction (FSLP)". This program predicts sensory acceptability and growth of spoilage bacteria in fish and the response of the TTI at constant and fluctuating temperature conditions. The program is freely available at http://www.azti.es/muestracontenido.asp?idcontenido=980&content=15&nodo1=30&nodo2=0.
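
    To illustrate how such temperature-dependent models are applied under fluctuating storage conditions, the sketch below accumulates a square-root-type growth rate over an assumed temperature profile; the parameter values are placeholders, not the paper's fitted ones.

        import numpy as np

        B, T_MIN = 0.03, -5.0   # assumed Ratkowsky-type parameters

        def mu_max(temp_c):
            """Specific growth rate (1/h) from a square-root secondary model."""
            return (B * (temp_c - T_MIN)) ** 2

        hours = np.arange(0, 72)                              # 72 h of storage
        temps = 4.0 + 3.0 * np.sin(2 * np.pi * hours / 24)    # daily cycle, degC
        dt = 1.0                                              # time step, hours
        log10_increase = (mu_max(temps) * dt).sum() / np.log(10)
        print(f"predicted growth: {log10_increase:.2f} log10 cfu/g over 72 h")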

  8. Visual Predictive Check in Models with Time-Varying Input Function.

    PubMed

    Largajolli, Anna; Bertoldo, Alessandra; Campioni, Marco; Cobelli, Claudio

    2015-11-01

    Nonlinear mixed effects models are commonly used modeling techniques in pharmaceutical research, as they enable the characterization of individual profiles together with the population to which the individuals belong. To ensure such models are used correctly, it is fundamental to provide powerful diagnostic tools that can evaluate their predictive performance. The visual predictive check (VPC) is a commonly used tool that helps the user check by visual inspection whether the model is able to reproduce the variability and the main trend of the observed data. However, simulation from the model is not always trivial, for example when using models with a time-varying input function (IF). In this class of models, there is a potential mismatch between each set of simulated parameters and the associated individual IF, which can cause an incorrect profile simulation. We introduce a refinement of the VPC that takes into consideration a correlation term (the Mahalanobis or normalized Euclidean distance) to help associate the correct IF with each individual set of simulated parameters. We investigate and compare its performance with the standard VPC in models of the glucose and insulin system applied to real and simulated data, and in a simulated pharmacokinetic/pharmacodynamic (PK/PD) example. The newly proposed VPC's performance appears better than that of the standard VPC, especially for models with large variability in the IF, where the probability of simulating incorrect profiles is higher.
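
    The pairing idea can be sketched with a normalized Euclidean distance (random placeholder data; the paper's models are not reproduced): each simulated parameter vector is matched to the subject whose estimated parameters are nearest, and that subject's IF is then used for the simulated profile.

        import numpy as np

        rng = np.random.default_rng(0)
        subject_params = rng.normal(size=(20, 3))     # per-subject estimates
        simulated_params = rng.normal(size=(200, 3))  # population simulations
        sd = subject_params.std(axis=0)               # scales for normalization

        d = np.linalg.norm(
            (simulated_params[:, None, :] - subject_params) / sd, axis=2
        )
        matched = d.argmin(axis=1)   # index of the IF paired with each simulation
        print(matched[:10])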

  9. MOM: A meteorological data checking expert system in CLIPS

    NASA Technical Reports Server (NTRS)

    Odonnell, Richard

    1990-01-01

    Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
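
    The two rule families can be illustrated in a few lines of Python (hypothetical thresholds and field names; MOM's actual CLIPS rules are station-specific):

        # Range checks on single fields, plus one cross-field consistency check.
        RANGES = {"temp_c": (-60.0, 55.0), "dewpoint_c": (-60.0, 40.0)}

        def check_observation(obs):
            errors = [f"{k} out of range" for k, (lo, hi) in RANGES.items()
                      if not lo <= obs[k] <= hi]
            if obs["dewpoint_c"] > obs["temp_c"]:   # physically inconsistent
                errors.append("dew point exceeds temperature")
            return errors

        print(check_observation({"temp_c": 12.0, "dewpoint_c": 14.5}))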

  10. Tobacco outlet density, retailer cigarette sales without ID checks and enforcement of underage tobacco laws: associations with youths' cigarette smoking and beliefs.

    PubMed

    Lipperman-Kreda, Sharon; Grube, Joel W; Friend, Karen B; Mair, Christina

    2016-03-01

    To estimate the relationships of tobacco outlet density, cigarette sales without ID checks and local enforcement of underage tobacco laws with youths' life-time cigarette smoking, perceived availability of tobacco and perceived enforcement of underage tobacco laws and changes over time. The study involved: (a) three annual telephone surveys, (b) two annual purchase surveys in 2000 tobacco outlets and (c) interviews with key informants from local law enforcement agencies. Analyses were multi-level models (city, individual, time). A sample of 50 mid-sized non-contiguous cities in California, USA. A total of 1478 youths (aged 13-16 at wave 1, 52.2% male); 1061 participated in all waves. Measures at the individual level included life-time cigarette smoking, perceived availability and perceived enforcement. City-level measures included tobacco outlet density, cigarette sales without ID checks and compliance checks. Outlet density was associated positively with life-time smoking (OR = 1.12, P < 0.01). An interaction between outlet density and wave (OR = 0.96, P < 0.05) suggested that higher density was associated more closely with life-time smoking at the earlier waves, when respondents were younger. Greater density was associated positively with perceived availability (β = 0.02, P < 0.05) and negatively with perceived enforcement (β = -0.02, P < 0.01). The rate of sales without checking IDs was related to greater perceived availability (β = 0.01, P < 0.01) and less perceived enforcement (β = -0.01, P < 0.01). Enforcement of underage tobacco laws was related positively to perceived enforcement (β = 0.06, P < 0.05). Higher tobacco outlet density may contribute to life-time smoking among youths. Density, sales without ID checks and enforcement levels may influence beliefs about access to cigarettes and enforcement of underage tobacco sales laws. © 2015 Society for the Study of Addiction.

  11. Rewriting Modulo SMT

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as: (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.
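
    A toy sketch of the underlying idea, using the z3 Python bindings rather than the rewriting engine discussed in the paper: a symbolic rule fires only if its guard is satisfiable together with the accumulated path constraint that represents the environment's nondeterministic input.

        from z3 import Int, Solver, sat

        x = Int("x")
        path_constraint = x >= 0   # symbolic state contributed by the environment
        guard = x + 1 > 10         # guard of a candidate rewrite rule

        s = Solver()
        s.add(path_constraint, guard)
        if s.check() == sat:       # rule applicable for some environment input
            print("rule fires, e.g. with", s.model())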

  12. Development of a Model of Soldier Effectiveness: Retranslation Materials and Results

    DTIC Science & Technology

    1987-05-01

    covering financial responsibility, particularly the family checking account. Consequently, the bad check rate for the unit dropped from 70 a month... Alcohol, and Aggressive Acts" Showing prudence in financial management and responsibility in personal/family matters; avoiding alcohol and other drugs or... threatening others, etc. versus "Acting irresponsibly in financial or personal/family affairs such that command time is required to counsel or otherwise

  13. Do alcohol compliance checks decrease underage sales at neighboring establishments?

    PubMed

    Erickson, Darin J; Smolenski, Derek J; Toomey, Traci L; Carlin, Bradley P; Wagenaar, Alexander C

    2013-11-01

    Underage alcohol compliance checks conducted by law enforcement agencies can reduce the likelihood of illegal alcohol sales at checked alcohol establishments, and theory suggests that an alcohol establishment that is checked may warn nearby establishments that compliance checks are being conducted in the area. In this study, we examined whether the effects of compliance checks diffuse to neighboring establishments. We used data from the Complying with the Minimum Drinking Age trial, which included more than 2,000 compliance checks conducted at more than 900 alcohol establishments. The primary outcome was the sale of alcohol to a pseudo-underage buyer without the need for age identification. A multilevel logistic regression was used to model the effect of a compliance check at each establishment as well as the effect of compliance checks at neighboring establishments within 500 m (stratified into four equal-radius concentric rings), after buyer, license, establishment, and community-level variables were controlled for. We observed a decrease in the likelihood of establishments selling alcohol to underage youth after they had been checked by law enforcement, but these effects quickly decayed over time. Establishments that had a close neighbor (within 125 m) checked in the past 90 days were also less likely to sell alcohol to young-appearing buyers. The spatial effect of compliance checks on other establishments decayed rapidly with increasing distance. Results confirm the hypothesis that the effects of police compliance checks do spill over to neighboring establishments. These findings have implications for the development of an optimal schedule of police compliance checks.

  14. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  15. Principles of continuous quality improvement applied to intravenous therapy.

    PubMed

    Dunavin, M K; Lane, C; Parker, P E

    1994-01-01

    Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.

  16. Model Checking Satellite Operational Procedures

    NASA Astrophysics Data System (ADS)

    Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri

    2011-08-01

    We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model for a system as complex as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach, we present experimental results on a simple, meaningful scenario. Our results show that we can save up to 90% of verification time.

  17. Addressing Dynamic Issues of Program Model Checking

    NASA Technical Reports Server (NTRS)

    Lerda, Flavio; Visser, Willem

    2001-01-01

    Model checking real programs has recently become an active research area. Programs however exhibit two characteristics that make model checking difficult: the complexity of their state and the dynamic nature of many programs. Here we address both these issues within the context of the Java PathFinder (JPF) model checker. Firstly, we will show how the state of a Java program can be encoded efficiently and how this encoding can be exploited to improve model checking. Next we show how to use symmetry reductions to alleviate some of the problems introduced by the dynamic nature of Java programs. Lastly, we show how distributed model checking of a dynamic program can be achieved, and furthermore, how dynamic partitions of the state space can improve model checking. We support all our findings with results from applying these techniques within the JPF model checker.

  18. Extensions to the visual predictive check to facilitate model performance evaluation.

    PubMed

    Post, Teun M; Freijer, Jan I; Ploeger, Bart A; Danhof, Meindert

    2008-04-01

    The Visual Predictive Check (VPC) is a valuable and supportive instrument for evaluating model performance. However, in its most commonly applied form, the method largely depends on a subjective comparison of the distribution of the simulated data with the observed data, without explicitly quantifying and relating the information in both. Recent adaptations to the VPC take this drawback into consideration by presenting the observed and predicted data as percentiles, and some of these adaptations represent the uncertainty in the predictions visually. However, it is not assessed whether the expected random distribution of the observations around the predicted median trend is realised in relation to the number of observations. Moreover, the influence of, and the information residing in, missing data at each time point is not taken into consideration. Therefore, in this investigation the VPC is extended with two methods to support a less subjective and thereby more adequate evaluation of model performance: (i) the Quantified Visual Predictive Check (QVPC) and (ii) the Bootstrap Visual Predictive Check (BVPC). The QVPC presents the distribution of the observations as a percentage above and below the predicted median at each time point, regardless of the density of the data, while also visualising the percentage of unavailable data. The BVPC weighs the predicted median against the 5th, 50th and 95th percentiles resulting from a bootstrap of the observed data median at each time point, while accounting for the number and the theoretical position of unavailable data. The proposed extensions to the VPC are illustrated by a pharmacokinetic simulation example and applied to a pharmacodynamic disease progression example.
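
    The two statistics are easy to sketch for a single time point (synthetic data; the published examples are not reproduced): the QVPC is the percentage of observations above the predicted median, and the BVPC bootstraps the observed median.

        import numpy as np

        rng = np.random.default_rng(1)
        obs = rng.normal(loc=1.0, size=100)   # observations at one time point
        pred_median = 0.9                     # model-predicted median here

        pct_above = 100.0 * np.mean(obs > pred_median)          # QVPC statistic
        boot = np.array([np.median(rng.choice(obs, obs.size))
                         for _ in range(1000)])
        lo, mid, hi = np.percentile(boot, [5, 50, 95])          # BVPC band
        print(f"{pct_above:.1f}% above; median band [{lo:.2f}, {hi:.2f}]")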

  19. Integrated Formal Analysis of Time-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization and propose an improvement. We compare the original design and the improved definition using the SAL model checker.

  20. Model Checking Abstract PLEXIL Programs with SMART

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.

    2007-01-01

    We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated, but require the modeler to introduce them by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.

  1. Verifying Multi-Agent Systems via Unbounded Model Checking

    NASA Technical Reports Server (NTRS)

    Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.

    2004-01-01

    We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems

  2. Superposition-Based Analysis of First-Order Probabilistic Timed Automata

    NASA Astrophysics Data System (ADS)

    Fietzke, Arnaud; Hermanns, Holger; Weidenbach, Christoph

    This paper discusses the analysis of first-order probabilistic timed automata (FPTA) by a combination of hierarchic first-order superposition-based theorem proving and probabilistic model checking. We develop the overall semantics of FPTAs and prove soundness and completeness of our method for reachability properties. Basically, we decompose FPTAs into their time plus first-order logic aspects on the one hand, and their probabilistic aspects on the other hand. Then we exploit the time plus first-order behavior by hierarchic superposition over linear arithmetic. The result of this analysis is the basis for the construction of a reachability equivalent (to the original FPTA) probabilistic timed automaton to which probabilistic model checking is finally applied. The hierarchic superposition calculus required for the analysis is sound and complete on the first-order formulas generated from FPTAs. It even works well in practice. We illustrate the potential behind it with a real-life DHCP protocol example, which we analyze by means of tool chain support.

  3. Scaling in the Donangelo-Sneppen model for evolution of money

    NASA Astrophysics Data System (ADS)

    Stauffer, Dietrich; Radomski, Jan P.

    2001-03-01

    The evolution of money from unsuccessful barter attempts, as modeled by Donangelo and Sneppen, is modified by a deterministic instead of a probabilistic selection of the most desired product as money. We check in particular the characteristic times of the model as a function of system size.

  4. Implementing Model-Check for Employee and Management Satisfaction

    NASA Technical Reports Server (NTRS)

    Jones, Corey; LaPha, Steven

    2013-01-01

    This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality, but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.

  5. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
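
    A minimal sketch of the tracking step (fabricated coefficient values, purely for illustration): a regression coefficient estimated from each check-standard test is charted against 3-sigma Shewhart limits computed from an in-control baseline.

        import numpy as np

        coef = np.array([1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00, 1.18, 1.01])
        baseline = coef[:7]                    # assumed in-control history
        center = baseline.mean()
        sigma = baseline.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        flags = np.where((coef > ucl) | (coef < lcl))[0]
        print(f"UCL={ucl:.3f}, LCL={lcl:.3f}, out-of-control tests: {flags.tolist()}")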

  6. Combining Static Analysis and Model Checking for Software Analysis

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2003-01-01

    We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point, at which time the partial order information is safe and the whole state space is explored.

  7. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  8. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... in different states or check processing regions)]. If you make the deposit in person to one of our...] Substitute Checks and Your Rights What Is a Substitute Check? To make check processing faster, federal law...

  9. Application of conditional moment tests to model checking for generalized linear models.

    PubMed

    Pan, Wei

    2002-06-01

    Generalized linear models (GLMs) are increasingly being used in daily data analysis. However, model checking for GLMs with correlated discrete response data remains difficult. In this paper, through a case study on marginal logistic regression using a real data set, we illustrate the flexibility and effectiveness of using conditional moment tests (CMTs), along with other graphical methods, to do model checking for generalized estimating equation (GEE) analyses. Although CMTs provide an array of powerful diagnostic tests for model checking, they were originally proposed in the econometrics literature and, to our knowledge, have never been applied to GEE analyses. CMTs cover many existing tests, including the (generalized) score test for an omitted covariate, as special cases. In summary, we believe that CMTs provide a class of useful model checking tools.

  10. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    NASA Technical Reports Server (NTRS)

    Jones, Corey

    2012-01-01

    Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  11. Intra-Urban Human Mobility and Activity Transition: Evidence from Social Media Check-In Data

    PubMed Central

    Wu, Lun; Zhi, Ye; Sui, Zhengwei; Liu, Yu

    2014-01-01

    Most existing human mobility literature focuses on exterior characteristics of movements but neglects activities, the driving force that underlies human movements. In this research, we combine activity-based analysis with a movement-based approach to model the intra-urban human mobility observed from about 15 million check-in records during a yearlong period in Shanghai, China. The proposed model is activity-based and includes two parts: the transition of travel demands during a specific time period and the movement between locations. For the first part, we find the transition probability between activities varies over time, and then we construct a temporal transition probability matrix to represent the transition probability of travel demands during a time interval. For the second part, we suggest that the travel demands can be divided into two classes, locationally mandatory activity (LMA) and locationally stochastic activity (LSA), according to whether the demand is associated with fixed location or not. By judging the combination of predecessor activity type and successor activity type we determine three trip patterns, each associated with a different decay parameter. To validate the model, we adopt the mechanism of an agent-based model and compare the simulated results with the observed pattern from the displacement distance distribution, the spatio-temporal distribution of activities, and the temporal distribution of travel demand transitions. The results show that the simulated patterns fit the observed data well, indicating that these findings open new directions for combining activity-based analysis with a movement-based approach using social media check-in data. PMID:24824892
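
    The first model component reduces to estimating a row-normalized transition matrix per time interval; a stripped-down sketch with hypothetical activity labels:

        import numpy as np

        acts = ["home", "work", "dining", "home", "work", "shopping", "home"]
        labels = sorted(set(acts))
        idx = {a: i for i, a in enumerate(labels)}

        counts = np.zeros((len(labels), len(labels)))
        for a, b in zip(acts, acts[1:]):       # successive check-in activities
            counts[idx[a], idx[b]] += 1
        probs = counts / counts.sum(axis=1, keepdims=True)   # row-normalize
        print(labels)
        print(probs.round(2))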

  12. NASTRAN data generation and management using interactive graphics

    NASA Technical Reports Server (NTRS)

    Smootkatow, M.; Cooper, B. M.

    1972-01-01

    A method of using an interactive graphics device to generate a large portion of the input bulk data with visual checks of the structure and the card images is described. The generation starts from GRID and PBAR cards. The visual checks result from a three-dimensional display of the model in any rotated position. By detailing the steps, the time saving and cost effectiveness of this method may be judged, and its potential as a useful tool for the structural analyst may be established.

  13. Method and system to perform energy-extraction based active noise control

    NASA Technical Reports Server (NTRS)

    Kelkar, Atul (Inventor); Joshi, Suresh M. (Inventor)

    2009-01-01

    A method to provide active noise control to reduce noise and vibration in reverberant acoustic enclosures such as aircraft, vehicles, appliances, instruments, industrial equipment and the like is presented. A continuous-time multi-input multi-output (MIMO) state space mathematical model of the plant is obtained via analytical modeling and system identification. Compensation is designed to render the mathematical model passive in the sense of mathematical system theory. The compensated system is checked to ensure robustness of the passive property of the plant. The check ensures that the passivity is preserved if the mathematical model parameters are perturbed from nominal values. A passivity-based controller is designed and verified using numerical simulations and then tested. The controller is designed so that the resulting closed-loop response shows the desired noise reduction.

  14. 40 CFR 60.2780 - What must I include in the deviation report?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule-Recordkeeping and... downtime associated with zero, span, and other routine calibration checks). (f) Whether each deviation...

  15. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.

  16. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
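
    The selected specification (local level plus a monthly seasonal) can be sketched with statsmodels' UnobservedComponents on synthetic monthly counts; the accident data and the stepwise AIC search from the paper are not reproduced here.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        months = np.arange(144)
        y = 400 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 10, 144)

        model = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
        fit = model.fit(disp=False)
        print(f"AIC = {fit.aic:.1f}")     # compare AICs across specifications
        print(fit.forecast(12))           # validate against the next 12 months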

  17. Model checking for linear temporal logic: An efficient implementation

    NASA Technical Reports Server (NTRS)

    Sherman, Rivi; Pnueli, Amir

    1990-01-01

    This report provides evidence to support the claim that model checking for linear temporal logic (LTL) is practically efficient. Two implementations of a linear temporal logic model checker are described. One is based on transforming the model checking problem into a satisfiability problem; the other checks an LTL formula for a finite model by computing the cross-product of the finite state transition graph of the program with a structure containing all possible models for the property. An experiment was done with a set of mutual exclusion algorithms, testing safety and liveness under fairness for these algorithms.

  18. Logic Model Checking of Time-Periodic Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Florian, Mihai; Gamble, Ed; Holzmann, Gerard

    2012-01-01

    In this paper we report on the work we performed to extend the logic model checker SPIN with built-in support for the verification of periodic, real-time embedded software systems, as commonly used in aircraft, automobiles, and spacecraft. We first extended the SPIN verification algorithms to model priority based scheduling policies. Next, we added a library to support the modeling of periodic tasks. This library was used in a recent application of the SPIN model checker to verify the engine control software of an automobile, to study the feasibility of software triggers for unintended acceleration events.

  19. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with... up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique... We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and

  20. 78 FR 9783 - Airworthiness Directives; Bombardier, Inc. Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-12

    ...) has been made to the Time Limits/ Maintenance Checks (TLMC) manual to introduce a new Airworthiness... have already been done. (g) Time Limits/Maintenance Checks (TLMC) Manual Revision Within 60 days after... Challenger Time Limits/Maintenance Checks Manual, PSP 605. (ii) Canadair Challenger Temporary Revision 5-250...

  1. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy Disclosure and Notices C Appendix C to Part 229 Banks and... OF FUNDS AND COLLECTION OF CHECKS (REGULATION CC) Pt. 229, App. C Appendix C to Part 229—Model...

  2. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.

  3. Model Checking a Byzantine-Fault-Tolerant Self-Stabilizing Protocol for Distributed Clock Synchronization Systems

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2007-01-01

    This report presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV) [SMV]. The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space. Also, additional innovative state space reduction techniques are introduced that can be used in future verification efforts applied to this and other protocols.

  4. The influence of social anxiety on the body checking behaviors of female college students.

    PubMed

    White, Emily K; Warren, Cortney S

    2014-09-01

    Social anxiety and eating pathology frequently co-occur. However, there is limited research examining the relationship between anxiety and body checking, aside from one study in which social physique anxiety partially mediated the relationship between body checking cognitions and body checking behavior (Haase, Mountford, & Waller, 2007). In an independent sample of 567 college women, we tested the fit of Haase and colleagues' foundational model but did not find evidence of mediation. Thus we tested the fit of an expanded path model that included eating pathology and clinical impairment. In the best-fitting path model (CFI=.991; RMSEA=.083) eating pathology and social physique anxiety positively predicted body checking, and body checking positively predicted clinical impairment. Therefore, women who endorse social physique anxiety may be more likely to engage in body checking behaviors and experience impaired psychosocial functioning. Published by Elsevier Ltd.

  5. Examining Passenger Flow Choke Points at Airports Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Brown, Jeremy R.; Madhavan, Poomima

    2011-01-01

    The movement of passengers through an airport quickly, safely, and efficiently is the main function of the various checkpoints (check-in, security, etc.) found in airports. Human error combined with other breakdowns in the complex system of the airport can disrupt passenger flow through the airport, leading to lengthy waiting times, missing luggage, and missed flights. In this paper we present a model of passenger flow through an airport using discrete event simulation that provides a closer look into the possible reasons for breakdowns and their implications for passenger flow. The simulation is based on data collected at Norfolk International Airport (ORF). The primary goal of this simulation is to present ways to optimize the work force to keep passenger flow smooth even during peak travel times and for emergency preparedness at ORF in case of adverse events. In this simulation we ran three different scenarios: real world, increased check-in stations, and multiple waiting lines. Increased check-in stations increased waiting time and instantaneous utilization, while the multiple waiting lines decreased both the waiting time and instantaneous utilization. This simulation was able to show how different changes affected the passenger flow through the airport.
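
    A minimal discrete-event sketch of a single choke point, using simpy with made-up arrival and service rates (the ORF data and the three study scenarios are not reproduced):

        import random
        import simpy

        def passenger(env, stations, waits):
            arrive = env.now
            with stations.request() as req:
                yield req                                 # queue for a station
                waits.append(env.now - arrive)            # record waiting time
                yield env.timeout(random.expovariate(1 / 2.0))  # ~2 min service

        def source(env, stations, waits):
            for _ in range(200):
                env.process(passenger(env, stations, waits))
                yield env.timeout(random.expovariate(1 / 1.5))  # arrivals

        env = simpy.Environment()
        stations = simpy.Resource(env, capacity=3)        # check-in stations
        waits = []
        env.process(source(env, stations, waits))
        env.run()
        print(f"mean wait: {sum(waits) / len(waits):.2f} min")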

  6. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  7. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  8. 12 CFR Appendix C to Part 229 - Model Availability Policy Disclosures, Clauses, and Notices; Model Substitute Check Policy...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... in different states or check processing regions)]. If you make the deposit in person to one of our... processing regions)]. If you make the deposit in person to one of our employees, funds from the following... Your Rights What Is a Substitute Check? To make check processing faster, federal law permits banks to...

  9. Nonparametric model validations for hidden Markov models with applications in financial econometrics.

    PubMed

    Zhao, Zhibiao

    2011-06-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise.

  10. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

    Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
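
    For readers unfamiliar with the gamma criterion used above, a simplified one-dimensional global gamma sketch with the same 3%/2 mm criteria (synthetic profiles; clinical tools evaluate 2-D or 3-D dose grids):

        import numpy as np

        x = np.arange(0.0, 50.0, 1.0)                 # positions, mm
        d_calc = np.exp(-((x - 25.0) / 12.0) ** 2)    # calculated dose profile
        d_meas = 1.02 * d_calc                        # measured: +2% dose error

        dd = (d_meas[:, None] - d_calc[None, :]) / 0.03   # dose diff / 3% (global)
        dx = (x[:, None] - x[None, :]) / 2.0              # distance / 2 mm
        gamma = np.sqrt(dd ** 2 + dx ** 2).min(axis=1)    # best agreement point
        print(f"gamma pass rate: {100 * np.mean(gamma <= 1):.1f}%")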

  11. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  12. Modeling criterion shifts and target checking in prospective memory monitoring.

    PubMed

    Horn, Sebastian S; Bayen, Ute J

    2015-01-01

    Event-based prospective memory (PM) involves remembering to perform intended actions after a delay. An important theoretical issue is whether and how people monitor the environment to execute an intended action when a target event occurs. Performing a PM task often increases the latencies in ongoing tasks. However, little is known about the reasons for this cost effect. This study uses diffusion model analysis to decompose monitoring processes in the PM paradigm. Across 4 experiments, performing a PM task increased latencies in an ongoing lexical decision task. A large portion of this effect was explained by consistent increases in boundary separation; additional increases in nondecision time emerged in a nonfocal PM task and explained variance in PM performance (Experiment 1), likely reflecting a target-checking strategy before and after the ongoing decision (Experiment 2). However, we found that possible target-checking strategies may depend on task characteristics. That is, instructional emphasis on the importance of ongoing decisions (Experiment 3) or the use of focal targets (Experiment 4) eliminated the contribution of nondecision time to the cost of PM, but left participants in a mode of increased cautiousness. The modeling thus sheds new light on the cost effect seen in many PM studies and suggests that people approach ongoing activities more cautiously when they need to remember an intended action. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  13. Current Use of Underage Alcohol Compliance Checks by Enforcement Agencies in the U.S.

    PubMed Central

    Erickson, Darin J.; Lenk, Kathleen M.; Sanem, Julia R.; Nelson, Toben F.; Jones-Webb, Rhonda; Toomey, Traci L.

    2014-01-01

    Background Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. Materials and Methods We conducted a national survey of local and state enforcement agencies in 2010–2011 to assess: (1) how many agencies are currently conducting underage alcohol compliance checks, (2) how many agencies that conduct compliance checks use optimal methods—including checking all establishments in the jurisdiction, conducting checks at least 3–4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee (not only the server/clerk) for failing a compliance check, and (3) characteristics of the agencies that conduct compliance checks. Results Just over one third of local law enforcement agencies and over two thirds of state agencies reported conducting compliance checks. However, only a small percentage of the agencies (4–6%) reported using all of the optimal methods to maximize effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least one full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local law agency) for enforcing alcohol retail laws were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Conclusions Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol. PMID:24716443

  14. Current use of underage alcohol compliance checks by enforcement agencies in the United States.

    PubMed

    Erickson, Darin J; Lenk, Kathleen M; Sanem, Julia R; Nelson, Toben F; Jones-Webb, Rhonda; Toomey, Traci L

    2014-06-01

    Compliance checks conducted by law enforcement agents can significantly reduce the likelihood of illegal alcohol sales to underage individuals, but these checks need to be conducted using optimal methods to maintain effectiveness. We conducted a national survey of local and state enforcement agencies from 2010 to 2011 to assess: (i) how many agencies are currently conducting underage alcohol compliance checks, (ii) how many agencies that conduct compliance checks use optimal methods-including checking all establishments in the jurisdiction, conducting checks at least 3 to 4 times per year, conducting follow-up checks within 3 months, and penalizing the licensee (not only the server/clerk) for failing a compliance check, and (iii) characteristics of the agencies that conduct compliance checks. Just over one-third of local law enforcement agencies and over two-thirds of state agencies reported conducting compliance checks. However, only a small percentage of the agencies (4 to 6%) reported using all of the optimal methods to maximize effectiveness of these compliance checks. Local law enforcement agencies with an alcohol-related division, those with at least 1 full-time officer assigned to work on alcohol, and those in larger communities were significantly more likely to conduct compliance checks. State agencies with more full-time agents and those located in states where the state agency or both state and local enforcement agencies have primary responsibility (vs. only the local law agency) for enforcing alcohol retail laws were also more likely to conduct compliance checks; however, these agency characteristics did not remain statistically significant in the multivariate analyses. Continued effort is needed to increase the number of local and state agencies conducting compliance checks using optimal methods to reduce youth access to alcohol.

  15. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though the API's strict interprocess communication policy offers implicit benefits for partial order reduction.

  16. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  17. Socioeconomic differences in health check-ups and medically certified sickness absence: a 10-year follow-up among middle-aged municipal employees in Finland.

    PubMed

    Piha, Kustaa; Sumanen, Hilla; Lahelma, Eero; Rahkonen, Ossi

    2017-04-01

    There is contradictory evidence on the association between health check-ups and future morbidity. Among the general population, those with a high socioeconomic position participate more often in health check-ups. The main aims of this study were to analyse whether attendance at health check-ups is socioeconomically patterned and affects sickness absence over a 10-year follow-up. This register-based follow-up study included municipal employees of the City of Helsinki. 13 037 employees were invited to an age-based health check-up during 2000-2002, with a 62% attendance rate. Education, occupational class and individual income were used to measure socioeconomic position. Medically certified sickness absence of 4 days or more was measured and controlled for at the baseline and used as an outcome over follow-up. The mean follow-up time was 7.5 years. Poisson regression was used. Men and employees with lower socioeconomic position participated more actively in health check-ups. Among women, non-attendance at health check-ups predicted higher sickness absence during follow-up (relative risk = 1.26, 95% CI 1.17 to 1.37) in the fully adjusted model. Health check-ups were not effective in reducing socioeconomic differences in sickness absence. Age-based health check-ups reduced subsequent sickness absence and should be promoted. Attendance at health check-ups should be as high as possible. Contextual factors need to be taken into account when applying the results in interventions in other settings.

  18. UTP and Temporal Logic Model Checking

    NASA Astrophysics Data System (ADS)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

    In this paper we give an additional perspective to the formal verification of programs through temporal logic model checking, which uses Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/post condition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective to temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state-explosion problem through the use of efficient data structures.

  19. 40 CFR 60.2770 - What information must I include in my annual report?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and Compliance Times for Commercial and Industrial Solid Waste Incineration Units Model Rule... inoperative, except for zero (low-level) and high-level checks. (3) The date, time, and duration that each... of control if any of the following occur. (1) The zero (low-level), mid-level (if applicable), or...

  20. 75 FR 28480 - Airworthiness Directives; Airbus Model A300 Series Airplanes; Model A300 B4-600, B4-600R, F4-600R...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... pressurise the hydraulic reservoirs, due to leakage of the Crissair reservoir air pressurisation check valves. * * * The leakage of the check valves was caused by an incorrect spring material. The affected Crissair check valves * * * were then replaced with improved check valves P/N [part number] 2S2794-1 * * *. More...

  1. Assessment of check-dam groundwater recharge with water-balance calculations

    NASA Astrophysics Data System (ADS)

    Djuma, Hakan; Bruggeman, Adriana; Camera, Corrado; Eliades, Marinos

    2017-04-01

    Studies on the enhancement of groundwater recharge by check-dams in arid and semi-arid environments mainly focus on deriving water infiltration rates from the check-dam ponding areas. This is usually achieved by applying simple water balance models, more advanced models (e.g., two-dimensional groundwater models) and field tests (e.g., infiltrometer tests or soil pit tests). Recharge behind the check-dam can be affected by the build-up of sediment resulting from erosion in the upstream watershed area. This natural process can increase the uncertainty in the estimates of the recharged water volume, especially for water balance calculations. Few water balance field studies of individual check-dams have been presented in the literature, and none of them reported the uncertainties associated with their estimates. The objectives of this study are i) to assess the effect of a check-dam on groundwater recharge from an ephemeral river; and ii) to assess annual sedimentation at the check-dam during a 4-year period. The study was conducted on a check-dam on the semi-arid island of Cyprus. Field campaigns were carried out to measure water flow, water depth and check-dam topography in order to establish check-dam water height, volume, evaporation, outflow and recharge relations. Topographic surveys were repeated at the end of consecutive hydrological years to estimate the sediment build-up in the reservoir area of the check-dam. Also, sediment samples were collected from the check-dam reservoir area for bulk-density analyses. To quantify the groundwater recharge, a water balance model was applied at two locations: at the check-dam and corresponding reservoir area, and at a 4-km stretch of the river bed without a check-dam. Results showed that a check-dam with a storage capacity of 25,000 m3 was able to recharge the aquifer, over four years, with a total of 12 million m3 out of the 42 million m3 of measured (or modelled) streamflow. Recharge from the analyzed 4-km river section without a check-dam was estimated to be 1 million m3. Upper and lower limits of prediction intervals were computed to assess the uncertainties of the results. The model was rerun with these values and resulted in recharge values of 0.4 million m3 as the lower and 38 million m3 as the upper limit. The sediment survey in the check-dam reservoir area showed that the reservoir area was filled with 2,000 to 3,000 tons of sediment after one rainfall season. This amount of sediment corresponds to a sediment yield of 0.2 to 2 t ha-1 y-1 at the watershed level and reduces the check-dam storage capacity by approximately 10%. Results indicate that check-dams are valuable structures for increasing groundwater resources, but special attention should be given to soil erosion occurring in the upstream area and the resulting sediment build-up in the check-dam reservoir area. This study has received funding from the EU FP7 RECARE Project (GA 603498).
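    A water-balance recharge estimate of the kind described above reduces to a residual calculation; here is a minimal sketch, with illustrative variable names, assuming storage is read off a surveyed stage-volume relation (a real application would also propagate the measurement uncertainties the study emphasizes):

      def daily_recharge(inflow, evaporation, outflow, storage_today, storage_yesterday):
          # All terms in m3/day; storage comes from the stage-volume
          # curve established by the check-dam topographic survey.
          d_storage = storage_today - storage_yesterday
          return inflow - evaporation - outflow - d_storage

      # Summing daily residuals over a hydrological year gives the annual
      # recharge volume attributable to the check-dam ponding area.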

  2. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formalized description method for protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. Through several simplifying modeling strategies, we can model protocols efficiently and reduce the state space of the model. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and is applicable to other authentication protocols.
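    At its core, the SPIN-style verification invoked here is an exhaustive search of the protocol's state space for property violations. A minimal explicit-state reachability sketch follows; the state encoding, successor function and property predicate are assumptions for illustration, not SPIN's actual engine:

      from collections import deque

      def check_safety(initial, successors, is_bad):
          # Breadth-first exploration of the reachable state space;
          # returns False as soon as a property-violating state is found.
          seen, frontier = {initial}, deque([initial])
          while frontier:
              state = frontier.popleft()
              if is_bad(state):
                  return False  # a counterexample trace is reachable
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return True  # the property holds in every reachable state

    The simplification strategies the abstract describes shrink the `seen` set this loop must store, which is what makes verification of larger protocols feasible.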

  3. A voice-actuated wind tunnel model leak checking system

    NASA Technical Reports Server (NTRS)

    Larson, William E.

    1989-01-01

    A computer program has been developed that improves the efficiency of wind tunnel model leak checking. The program uses a voice recognition unit to relay a technician's commands to the computer. The computer, after receiving a command, can respond to the technician via a voice response unit. Information about the model pressure orifice being checked is displayed on a gas-plasma terminal. On command, the program records up to 30 seconds of pressure data. After the recording is complete, the raw data and a straight line fit of the data are plotted on the terminal. This allows the technician to make a decision on the integrity of the orifice being checked. All results of the leak check program are stored in a database file that can be listed on the line printer for record keeping purposes or displayed on the terminal to help the technician find unchecked orifices. This program allows one technician to check a model for leaks instead of the two or three previously required.

  4. Use of posterior predictive checks as an inferential tool for investigating individual heterogeneity in animal population vital rates

    PubMed Central

    Chambert, Thierry; Rotella, Jay J; Higgs, Megan D

    2014-01-01

    The investigation of individual heterogeneity in vital rates has recently received growing attention among population ecologists. Individual heterogeneity in wild animal populations has been accounted for and quantified by including individually varying effects in models for mark–recapture data, but the real need for underlying individual effects to account for observed levels of individual variation has recently been questioned by the work of Tuljapurkar et al. (Ecology Letters, 12, 93, 2009) on dynamic heterogeneity. Model-selection approaches based on information criteria or Bayes factors have been used to address this question. Here, we suggest that, in addition to model-selection, model-checking methods can provide additional important insights to tackle this issue, as they allow one to evaluate a model's misfit in terms of ecologically meaningful measures. Specifically, we propose the use of posterior predictive checks to explicitly assess discrepancies between a model and the data, and we explain how to incorporate model checking into the inferential process used to assess the practical implications of ignoring individual heterogeneity. Posterior predictive checking is a straightforward and flexible approach for performing model checks in a Bayesian framework that is based on comparisons of observed data to model-generated replications of the data, where parameter uncertainty is incorporated through use of the posterior distribution. If discrepancy measures are chosen carefully and are relevant to the scientific context, posterior predictive checks can provide important information allowing for more efficient model refinement. We illustrate this approach using analyses of vital rates with long-term mark–recapture data for Weddell seals and emphasize its utility for identifying shortfalls or successes of a model at representing a biological process or pattern of interest. We show how posterior predictive checks can be used to strengthen inferences in ecological studies. We demonstrate the application of this method on analyses dealing with the question of individual reproductive heterogeneity in a population of Antarctic pinnipeds. PMID:24834335
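    The posterior predictive check advocated here compares an observed discrepancy measure against its distribution over model-generated replicates. A generic sketch follows; `simulate` and `discrepancy` are placeholders for, e.g., a mark-recapture model and an ecologically meaningful heterogeneity measure, not the paper's specific choices:

      import numpy as np

      def posterior_predictive_pvalue(posterior_draws, data, simulate, discrepancy):
          # posterior_draws: parameter vectors sampled from the posterior
          # simulate(theta): one replicated data set generated under theta
          # discrepancy(y): a scalar, scientifically relevant misfit measure
          observed = discrepancy(data)
          replicated = [discrepancy(simulate(theta)) for theta in posterior_draws]
          return float(np.mean([d >= observed for d in replicated]))

      # Values near 0 or 1 flag a systematic discrepancy between model and
      # data on the chosen measure; values near 0.5 do not.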

  5. Development of flank wear model of cutting tool by using adaptive feedback linear control system on machining AISI D2 steel and AISI 4340 steel

    NASA Astrophysics Data System (ADS)

    Orra, Kashfull; Choudhury, Sounak K.

    2016-12-01

    The purpose of this paper is to build an adaptive feedback linear control system that checks the variation of the cutting force signal in order to improve tool life. The paper discusses the use of the transfer-function approach in improving the mathematical modelling and adaptively controlling the process dynamics of the turning operation. The experimental results agree with the simulation model, with an error of less than 3%. The state-space model used in this paper successfully checks the adequacy of the control system through controllability and observability test matrices, and the system can be transferred from one state to another by appropriate input control in finite time. The proposed system can be implemented for other machining processes under a varying range of cutting conditions to improve the efficiency and observability of the system.

  6. Model Checking Temporal Logic Formulas Using Sticker Automata

    PubMed Central

    Feng, Changwei; Wu, Huanmei

    2017-01-01

    As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstances of DNA computing, especially for Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas of the above three temporal logics with DNA molecules. First, one type of single-stranded DNA molecule is employed to encode the Finite State Automaton (FSA) model of the given basic formula, so that a sticker automaton is obtained. Second, other single-stranded DNA molecules are employed to encode the given system model, so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions is conducted between the above two types of single-stranded DNA molecules, after which it can be decided whether the system satisfies the formula or not. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114

  7. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.

  8. Full implementation of a distributed hydrological model based on check dam trapped sediment volumes

    NASA Astrophysics Data System (ADS)

    Bussi, Gianbattista; Francés, Félix

    2014-05-01

    Lack of hydrometeorological data is one of the most compelling limitations to the implementation of distributed environmental models. Mediterranean catchments, in particular, are characterised by high spatial variability of meteorological phenomena and soil characteristics, which may prevent transferring model calibrations from a fully gauged catchment to a totally or partially ungauged one. For this reason, new sources of data are required in order to extend the use of distributed models to non-monitored or low-monitored areas. An important source of information on the hydrological and sediment cycle is represented by sediment deposits accumulated at the bottom of reservoirs. Since the 1960s, reservoir sedimentation volumes have been used as proxy data for the estimation of inter-annual total sediment yield rates or, in more recent years, as a reference measure of sediment transport for sediment model calibration and validation. Nevertheless, the possibility of using such data to constrain the calibration of a hydrological model has not been exhaustively investigated so far. In this study, the use of nine check-dam reservoir sedimentation volumes for hydrological and sedimentological model calibration and spatio-temporal validation was examined. Check dams are common structures in Mediterranean areas and are a potential source of spatially distributed information on both the hydrological and the sediment cycle. In this case study, the TETIS hydrological and sediment model was implemented in a medium-size Mediterranean catchment (Rambla del Poyo, Spain) by taking advantage of sediment deposits accumulated behind the check dams located in the catchment headwaters. Reservoir trap efficiency was taken into account by coupling the TETIS model with a pond trap-efficiency model. The model was calibrated by adjusting some of its parameters in order to reproduce the total sediment volume accumulated behind one check dam. Then, the model was spatially validated by obtaining the simulated sedimentation volume at the other eight check dams and comparing it to the observed sedimentation volumes. Lastly, the simulated water discharge at the catchment outlet was compared with observed water discharge records in order to check the behaviour of the hydrological sub-model. Model results provided highly valuable information concerning the spatial distribution of soil erosion and sediment transport. Spatial validation of the sediment sub-model provided very good results at seven check dams out of nine. This study shows that check dams can also be a useful tool for constraining hydrological model calibration, as model results agree with water discharge observations. In fact, the hydrological model validation at a downstream water flow gauge obtained a Nash-Sutcliffe efficiency of 0.8. This technique is applicable to all catchments with check dams, and only requires rainfall and temperature data and soil characteristics maps.

  9. Nonparametric model validations for hidden Markov models with applications in financial econometrics

    PubMed Central

    Zhao, Zhibiao

    2011-01-01

    We address the nonparametric model validation problem for hidden Markov models with partially observable variables and hidden states. We achieve this goal by constructing a nonparametric simultaneous confidence envelope for transition density function of the observable variables and checking whether the parametric density estimate is contained within such an envelope. Our specification test procedure is motivated by a functional connection between the transition density of the observable variables and the Markov transition kernel of the hidden states. Our approach is applicable for continuous time diffusion models, stochastic volatility models, nonlinear time series models, and models with market microstructure noise. PMID:21750601

  10. Time series behaviour of the number of Air Asia passengers: A distributional approach

    NASA Astrophysics Data System (ADS)

    Asrah, Norhaidah Mohd; Djauhari, Maman Abdurachman

    2013-09-01

    The common practice in time series analysis is to fit a model and then conduct further analysis on the residuals. However, if we know the distributional behavior of a time series, the analyses involved in model identification, parameter estimation, and model checking are more straightforward. In this paper, we show that the number of Air Asia passengers can be represented as a geometric Brownian motion process. Therefore, instead of using the standard approach to model fitting, we use an appropriate transformation to arrive at a stationary, normally distributed and even independent time series. An example of forecasting the number of Air Asia passengers is given to illustrate the advantages of the method.
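    The transformation alluded to is the classical one for geometric Brownian motion: under the GBM assumption, the differenced log-series is i.i.d. normal. A minimal sketch (the one-step forecast shown at the end is an illustrative assumption, not the paper's procedure):

      import numpy as np

      def gbm_transform(series):
          # Under GBM, log-returns are stationary, normal and independent,
          # so standard checks (normality, autocorrelation) apply directly.
          log_returns = np.diff(np.log(np.asarray(series, dtype=float)))
          mu = log_returns.mean()          # per-period drift estimate
          sigma = log_returns.std(ddof=1)  # per-period volatility estimate
          return log_returns, mu, sigma

      # One-step-ahead point forecast under the fitted GBM:
      # next_value = series[-1] * np.exp(mu)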

  11. Improving the Enterprise Requirements and Acquisition Model’s Developmental Test and Evaluation Process Fidelity

    DTIC Science & Technology

    2014-03-27

    and excluded from the model. The “Check SVR Loop” prevents programs from failing the SVR a second time. If a program has not previously failed the SVR...and Acquisition Management Plan Initiative. Briefing, Peterson AFB, CO: HQ AFSPC/A5X, 2011. Gilmore, Michael J., Key Issues Causing Program Delays

  12. TEAMS Model Analyzer

    NASA Technical Reports Server (NTRS)

    Tijidjian, Raffi P.

    2010-01-01

    The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time each TEAMS modeler must spend manually preparing reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software allows the user to selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and to generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.

  13. Algebraic model checking for Boolean gene regulatory networks.

    PubMed

    Tran, Quoc-Nam

    2011-01-01

    We present a computational method in which modular and Groebner basis (GB) computations in Boolean rings are used for solving problems in Boolean gene regulatory networks (BNs). In contrast to other known algebraic approaches, the degree of intermediate polynomials during the calculation of Groebner bases using our method never grows, resulting in a significant improvement in running time and memory consumption. We also show how calculation in temporal logic for model checking can be done by means of our direct and efficient Groebner basis computation in Boolean rings. We present experimental results on finding attractors and control strategies of Boolean networks to illustrate our theoretical arguments. The results are promising. Our algebraic approach is more efficient than the state-of-the-art model checker NuSMV on BNs. More importantly, our approach finds all solutions for the BN problems.
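    To make the object being computed concrete, here is a brute-force attractor enumeration for a small synchronous Boolean network. This naive stand-in is for illustration only; it is not the paper's Groebner-basis method, whose whole point is to avoid this exponential enumeration:

      from itertools import product

      def attractors(update, n):
          # Follow each of the 2^n states to the cycle it eventually
          # enters; feasible only for small n, unlike algebraic methods.
          found = set()
          for state in product((0, 1), repeat=n):
              seen = {}
              while state not in seen:
                  seen[state] = len(seen)
                  state = update(state)
              start = seen[state]
              cycle = frozenset(s for s, i in seen.items() if i >= start)
              found.add(cycle)
          return found

      # Example: x1' = x2, x2' = x1 has two fixed points and one 2-cycle.
      print(attractors(lambda s: (s[1], s[0]), 2))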

  14. Towards Symbolic Model Checking for Multi-Agent Systems via OBDDs

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Lomuscio, Alessio

    2004-01-01

    We present an algorithm for model checking temporal-epistemic properties of multi-agent systems, expressed in the formalism of interpreted systems. We first introduce a technique for the translation of interpreted systems into Boolean formulae, and then present a model-checking algorithm based on this translation. The algorithm is based on OBDDs, as they offer a compact and efficient representation for Boolean formulae.

  15. Quality monitored distributed voting system

    DOEpatents

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.

  16. Preventing youth access to alcohol: outcomes from a multi-community time-series trial*.

    PubMed

    Wagenaar, Alexander C; Toomey, Traci L; Erickson, Darin J

    2005-03-01

    AIMS/INTERVENTION: The Complying with the Minimum Drinking Age project (CMDA) is a community trial designed to test the effects of two interventions intended to reduce alcohol sales to minors: (1) training for management of retail alcohol establishments and (2) enforcement checks of alcohol establishments. CMDA is a multi-community time-series quasi-experimental trial with a nested cohort design, implemented in 20 cities in four geographic areas of the US Midwest. The core outcome, propensity for alcohol sales to minors, was directly tested by research staff who attempted to purchase alcohol without showing age identification, following a standardized protocol, in 602 on-premise and 340 off-premise alcohol establishments. Data were collected every other week in all communities over 4 years. Mixed-model regression and Box-Jenkins time-series analyses were used to assess short- and long-term establishment-specific and general community-level effects of the two interventions. Effects of the training intervention were mixed. Specific deterrent effects were observed for enforcement checks, with an immediate 17% reduction in the likelihood of sales to minors. These effects decayed entirely within 3 months in off-premise establishments, and to an 8.2% reduction in on-premise establishments. Enforcement checks prevent alcohol sales to minors. At the intensity levels tested, enforcement primarily affected the specific establishments checked, with limited diffusion to the whole community. Finally, most of the enforcement effect decayed within 3 months, suggesting that a regular schedule of enforcement is necessary to maintain deterrence.

  17. 76 FR 64785 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ...--Internal, of Chapter 5, Time Limits Maintenance Checks, of EMBRAER EMB145 Aircraft Maintenance Manual, Part...--Tail Cone Fairing--Internal, of Chapter 5, Time Limits Maintenance Checks, of EMBRAER EMB145 Aircraft... Maintenance Checks, of EMBRAER EMB145 Aircraft Maintenance Manual, Part II, AMM-145/1124, Revision 54, dated...

  18. RealSurf - A Tool for the Interactive Visualization of Mathematical Models

    NASA Astrophysics Data System (ADS)

    Stussak, Christian; Schenzel, Peter

    For applications in fine art, architecture and engineering it is often important to visualize and explore complex mathematical models. In former times, static models of them were collected in museums and mathematical institutes. To check their properties, for example for aesthetic reasons, it is helpful to explore them interactively in 3D in real time. For the class of implicitly given algebraic surfaces we developed the tool RealSurf. Here we give an introduction to the program and some hints for the design of interesting surfaces.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, J; Pompos, A; Jiang, S

    Purpose: To put forth an innovative clinical paradigm for weekly chart checking so that treatment status is checked periodically, accurately, and efficiently. This study also aims to help optimize the chart-checking clinical workflow in a busy radiation therapy clinic. Methods: The Texas Administrative Code mandates that radiation therapy patient charts be checked once a week or every five fractions; however, when and how this is done varies drastically among institutions. Some check charts every day, but much effort is wasted on opening ineligible charts; some check on a fixed day, but the distribution of intervals between subsequent checks is not optimal. To establish an optimal chart-checking procedure, a new paradigm was developed to achieve the following: 1) charts are checked more accurately and more efficiently; 2) charts are checked on optimal days without any misses; 3) the workload is evened out throughout the week when multiple physicists are involved. All active charts are accessed by querying the R&V system. A priority is assigned to each chart based on the number of days before the next due date, followed by sorting and workload-distribution steps. New charts are also taken into account when distributing the workload, so it is reasonably even throughout the week. Results: Our clinical workflow became more streamlined and smooth. In addition, charts are checked in a more timely fashion, so that errors would be caught earlier should they occur. Conclusion: We developed a new weekly chart-checking paradigm. It helps physicists check charts in a timely manner, saves their time in busy clinics, and consequently reduces possible errors.

  20. ANSYS duplicate finite-element checker routine

    NASA Technical Reports Server (NTRS)

    Ortega, R.

    1995-01-01

    An ANSYS finite-element code routine to check for duplicated elements within the volume of a three-dimensional (3D) finite-element mesh was developed. The routine developed is used for checking floating elements within a mesh, identically duplicated elements, and intersecting elements with a common face. A space shuttle main engine alternate turbopump development high pressure oxidizer turbopump finite-element model check using the developed subroutine is discussed. Finally, recommendations are provided for duplicate element checking of 3D finite-element models.

  1. Quality monitored distributed voting system

    DOEpatents

    Skogmo, D.

    1997-03-18

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.

  2. Pharmacist and Technician Perceptions of Tech-Check-Tech in Community Pharmacy Practice Settings.

    PubMed

    Frost, Timothy P; Adams, Alex J

    2018-04-01

    Tech-check-tech (TCT) is a practice model in which pharmacy technicians with advanced training can perform final verification of prescriptions that have been previously reviewed for appropriateness by a pharmacist. Few states have adopted TCT in part because of the common view that this model is controversial among members of the profession. This article aims to summarize the existing research on pharmacist and technician perceptions of community pharmacy-based TCT. A literature review was conducted using MEDLINE (January 1990 to August 2016) and Google Scholar (January 1990 to August 2016) using the terms "tech* and check," "tech-check-tech," "checking technician," and "accuracy checking tech*." Of the 7 studies identified we found general agreement among both pharmacists and technicians that TCT in community pharmacy settings can be safely performed. This agreement persisted in studies of theoretical TCT models and in studies assessing participants in actual community-based TCT models. Pharmacists who had previously worked with a checking technician were generally more favorable toward TCT. Both pharmacists and technicians in community pharmacy settings generally perceived TCT to be safe, in both theoretical surveys and in surveys following actual TCT demonstration projects. These perceptions of safety align well with the actual outcomes achieved from community pharmacy TCT studies.

  3. Fall 2014 SEI Research Review Probabilistic Analysis of Time Sensitive Systems

    DTIC Science & Technology

    2014-10-28

    Osmosis SMC Tool: Osmosis is a tool for Statistical Model Checking (SMC) with Semantic Importance Sampling. The input model is written in a subset of C...ASSERT() statements in the model indicate conditions that must hold. Input probability distributions are defined by the user. Osmosis returns the...based on: a target relative error, or a set number of simulations. (http://dreal.cs.cmu.edu/)

  4. Bayesian Hierarchical Air-Sea Interaction Modeling: Application to the Labrador Sea

    NASA Technical Reports Server (NTRS)

    Niiler, Pearn P.

    2002-01-01

    The objectives are to: 1) organize data from 26 MINIMET drifters in the Labrador Sea, including sensor calibration and error checking of ARGOS transmissions; 2) produce wind direction, barometer, and sea surface temperature time series, and in addition provide data from the historical file of 150 SHARP drifters in the Labrador Sea; and 3) work with data interpretation and data-modeling assimilation issues.

  5. Action-based verification of RTCP-nets with CADP

    NASA Astrophysics Data System (ADS)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach provides the possibility of automatic RTCP-net verification using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled, and several of its crucial properties have been verified, to demonstrate the usability of the approach.
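    The target of such a translation is a plain labelled transition system; a minimal writer for the Aldebaran (.aut) text format consumed by CADP might look as follows (a sketch from the format's public description, not the paper's translator):

      def write_aut(path, initial, transitions, n_states):
          # .aut format: a "des" header, then one '(src, "label", dst)'
          # line per labelled transition of the coverability graph.
          with open(path, "w") as f:
              f.write(f"des ({initial}, {len(transitions)}, {n_states})\n")
              for src, label, dst in transitions:
                  f.write(f'({src}, "{label}", {dst})\n')

      # Example: a two-state system with one "alarm" transition.
      write_aut("fire_panel.aut", 0, [(0, "alarm", 1)], 2)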

  6. Formal Verification of Quasi-Synchronous Systems

    DTIC Science & Technology

    2015-07-01

    pg. 215-226, Springer-Verlag: London, UK, 2001. [4] Nicolas Halbwachs and Louis Mandel, Simulation and Verification of Asynchronous Systems by...Huang, S. A. Smolka, W. Tan, and S. Tripakis, Deep Random Search for Efficient Model Checking of Timed Automata, in Proceedings of the 13th Monterey

  7. Usage Automata

    NASA Astrophysics Data System (ADS)

    Bartoletti, Massimo

    Usage automata are an extension of finite state automata, with some additional features (e.g. parameters and guards) that improve their expressivity. Usage automata are expressive enough to model security requirements of real-world applications; at the same time, they are simple enough to be statically amenable, e.g. they can be model-checked against abstractions of program usages. We study here some foundational aspects of usage automata. In particular, we discuss their expressive power, and their effective use in run-time mechanisms for enforcing usage policies.

  8. Development of a Novel Floating In-situ Gelling System for Stomach Specific Drug Delivery of the Narrow Absorption Window Drug Baclofen.

    PubMed

    R Jivani, Rishad; N Patel, Chhagan; M Patel, Dashrath; P Jivani, Nurudin

    2010-01-01

    The present study deals with the development of a floating in-situ gel of the narrow-absorption-window drug baclofen. Sodium alginate-based in-situ gelling systems were prepared by dissolving various concentrations of sodium alginate in deionized water, to which varying concentrations of drug and calcium bicarbonate were added. Fourier transform infrared spectroscopy (FTIR) and differential scanning calorimetry (DSC) were used to check for any interaction between the drug and the excipients. A 3² full factorial design was used for optimization. The concentrations of sodium alginate (X1) and calcium bicarbonate (X2) were selected as the independent variables. The amount of drug released after 1 h (Q1) and 10 h (Q10) and the viscosity of the solution were selected as the dependent variables. The gels were studied for their viscosity, in-vitro buoyancy and drug release. Contour plots were drawn for each dependent variable, and check-point batches were prepared in order to obtain desirable release profiles. The drug release profiles were fitted to different kinetic models. The floating lag time and floating time were found to be 2 min and 12 h, respectively. A decreasing trend in drug release was observed with increasing concentrations of CaCO3. The computed values of Q1 and Q10 for the check-point batch were 25% and 86%, respectively, compared to the experimental values of 27.1% and 88.34%. The similarity factor (f2) for the check-point batch was 80.25, showing that the two dissolution profiles were similar. The drug release from the in-situ gel follows the Higuchi model, which indicates diffusion-controlled release. A stomach-specific in-situ gel of baclofen could thus be prepared, using the floating mechanism to increase the residence time of the drug in the stomach and thereby increase absorption.
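    The similarity factor cited above is the standard f2 statistic for comparing two dissolution profiles; a short sketch of its computation (the profile values used would be percent dissolved at matched time points):

      import math

      def similarity_f2(reference, test):
          # reference, test: percent drug dissolved at matched time points.
          # f2 >= 50 is conventionally taken to indicate similar profiles.
          n = len(reference)
          msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
          return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

      # Identical profiles give f2 = 50 * log10(100) = 100; growing
      # point-wise differences drive f2 down toward and below 50.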

  9. Spacecraft command verification: The AI solution

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Stephan, Amy; Smith, Brian K.

    1990-01-01

    Recently, a knowledge-based approach was used to develop a system called the Command Constraint Checker (CCC) for TRW. CCC was created to automate the process of verifying spacecraft command sequences. To check command files by hand for timing and sequencing errors is a time-consuming and error-prone task. Conventional software solutions were rejected when it was estimated that it would require 36 man-months to build an automated tool to check constraints by conventional methods. Using rule-based representation to model the various timing and sequencing constraints of the spacecraft, CCC was developed and tested in only three months. By applying artificial intelligence techniques, CCC designers were able to demonstrate the viability of AI as a tool to transform difficult problems into easily managed tasks. The design considerations used in developing CCC are discussed and the potential impact of this system on future satellite programs is examined.

  10. Check Yourself: a social marketing campaign to increase syphilis screening in Los Angeles County.

    PubMed

    Plant, Aaron; Javanbakht, Marjan; Montoya, Jorge A; Rotblatt, Harlan; O'Leary, Christopher; Kerndt, Peter R

    2014-01-01

    In 2007, the Los Angeles County Department of Public Health launched Check Yourself, a new social marketing campaign, as part of ongoing efforts to address the persistent syphilis epidemic among men who have sex with men (MSM) in the county. The goals of the campaign were to increase syphilis testing and knowledge among MSM. Check Yourself was planned with careful attention to the principles of social marketing, including formative research, market segmentation, and an emphasis on building a strong brand. A cross-sectional survey using a time-location sample was conducted in 2009 for the evaluation. The survey assessed demographics, syphilis knowledge, and recent syphilis testing as well as unaided awareness, aided awareness, and confirmed awareness, meaning that a person had both awareness of the campaign and could correctly identify that the campaign was about syphilis. The total sample size was 306. Unaided awareness for Check Yourself was 20.7%, and aided awareness was 67.5%, bringing total campaign awareness to 88.2%; confirmed awareness was 30.4%. Unaided campaign awareness was associated with syphilis knowledge and important risk behaviors for syphilis, indicating that the campaign reached an appropriate audience. Total awareness was not associated with recent syphilis testing in a multivariate model. However, MSM with confirmed awareness were more than 6 times more likely to have been recently tested. The evaluation of Check Yourself found that the campaign had a very strong brand among MSM. Although total awareness was not associated with syphilis testing, confirmed awareness, a more robust measure, was strongly associated.

  11. An efficient algorithm for computing fixed length attractors based on bounded model checking in synchronous Boolean networks with biochemical applications.

    PubMed

    Li, X Y; Yang, G W; Zheng, D S; Guo, W S; Hung, W N N

    2015-04-28

    Genetic regulatory networks are key to understanding biochemical systems. A genetic regulatory network under a given living environment can be modeled as a synchronous Boolean network. The attractors of these Boolean networks help biologists identify determinant and stable factors. Existing methods identify attractors starting from a random initial state or considering the entire state space simultaneously; they cannot identify fixed-length attractors directly, and their complexity increases exponentially with the number and length of the attractors. This study uses bounded model checking to quickly locate fixed-length attractors. Building on a SAT solver, we propose a new algorithm for efficiently computing fixed-length attractors, which is better suited to large Boolean networks and networks with numerous attractors. In comparisons with the tool BooleNet, empirical experiments on biochemical systems demonstrated the feasibility and efficiency of our approach.
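    The property handed to the SAT solver can be stated compactly: a state lies on an attractor of length L exactly when L applications of the update function return to it and no fewer do. A direct (non-SAT) sketch of that unrolled condition, for illustration only:

      def on_length_L_attractor(update, state, L):
          # Unroll the transition relation L steps, as a bounded model
          # checker would, and require the first return to occur at step L.
          s = state
          for k in range(1, L + 1):
              s = update(s)
              if s == state:
                  return k == L
          return False

      # Example: for the swap network s -> (s[1], s[0]),
      # (0, 1) lies on a 2-cycle while (0, 0) is a fixed point.
      assert on_length_L_attractor(lambda s: (s[1], s[0]), (0, 1), 2)
      assert on_length_L_attractor(lambda s: (s[1], s[0]), (0, 0), 1)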

  12. Random Testing and Model Checking: Building a Common Framework for Nondeterministic Exploration

    NASA Technical Reports Server (NTRS)

    Groce, Alex; Joshi, Rajeev

    2008-01-01

    Two popular forms of dynamic analysis, random testing and explicit-state software model checking, are perhaps best viewed as search strategies for exploring the state spaces introduced by nondeterminism in program inputs. We present an approach that enables this nondeterminism to be expressed in the SPIN model checker's PROMELA language, and then lets users generate either model checkers or random testers from a single harness for a tested C program. Our approach makes it easy to compare model checking and random testing for models with precisely the same input ranges and probabilities and allows us to mix random testing with model checking's exhaustive exploration of non-determinism. The PROMELA language, as intended in its design, serves as a convenient notation for expressing nondeterminism and mixing random choices with nondeterministic choices. We present and discuss a comparison of random testing and model checking. The results derive from using our framework to test a C program with an effectively infinite state space, a module in JPL's next Mars rover mission. More generally, we show how the ability of the SPIN model checker to call C code can be used to extend SPIN's features, and hope to inspire others to use the same methods to implement dynamic analyses that can make use of efficient state storage, matching, and backtracking.
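    The random-testing half of such a framework is easy to sketch: resolve each nondeterministic choice point with a random draw, where the model checker would instead branch on every alternative. All names below are illustrative assumptions, not the authors' harness:

      import random

      def random_walk_test(initial, choices, is_bad, depth=100, trials=10000, seed=0):
          # Each trial drives the same harness a model checker would use,
          # but picks one successor at random at every choice point.
          rng = random.Random(seed)
          for _ in range(trials):
              state = initial
              for _ in range(depth):
                  if is_bad(state):
                      return state        # a failing execution was found
                  options = choices(state)
                  if not options:
                      break               # deadlock / terminal state
                  state = rng.choice(options)
          return None                     # no failure within the budget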

  13. Approximate Model Checking of PCTL Involving Unbounded Path Properties

    NASA Astrophysics Data System (ADS)

    Basu, Samik; Ghosh, Arka P.; He, Ru

    We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as PCTL formulas. Such approximate methods have been proposed primarily to deal with the state-space explosion that makes exact model checking by numerical methods practically infeasible for large systems. However, the existing statistical methods either consider a restricted subset of PCTL, specifically the subset that can only express bounded until properties, or rely on a user-specified finite bound on the sample path length. We propose a new method that does not have such restrictions and can be effectively used to reason about unbounded until properties. We approximate the probabilistic characteristics of an unbounded until property by those of a bounded until property for a suitably chosen value of the bound. In essence, our method is a two-phase process: (a) the first phase is concerned with identifying the bound k0; (b) the second phase computes the probability of satisfying the k0-bounded until property as an estimate of the probability of satisfying the corresponding unbounded until property. In both phases, it is sufficient to verify bounded until properties, which can be done effectively using existing statistical techniques. We prove the correctness of our technique and present prototype implementations. We empirically show the practical applicability of our method by considering different case studies, including a simple infinite-state model and large finite-state models such as the IPv4 zeroconf protocol and the dining philosophers protocol modeled as discrete-time Markov chains.
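    Phase two of the method reduces to estimating a bounded until probability by sampling paths of the Markov chain. A minimal sketch follows; `step`, `avoid` and `target` are placeholder functions, and k0 is assumed to come from phase one:

      import random

      def estimate_bounded_until(init, step, avoid, target, k0, n=100000, seed=7):
          # Estimate P(avoid U<=k0 target) over n simulated DTMC paths;
          # step(state, rng) samples one successor from the transition matrix.
          rng = random.Random(seed)
          hits = 0
          for _ in range(n):
              s = init
              for _ in range(k0 + 1):   # checks target at steps 0..k0
                  if target(s):
                      hits += 1
                      break
                  if not avoid(s):
                      break
                  s = step(s, rng)
          return hits / n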

  14. CheckMATE 2: From the model to the limit

    NASA Astrophysics Data System (ADS)

    Dercks, Daniel; Desai, Nishita; Kim, Jong Soo; Rolbiecki, Krzysztof; Tattersall, Jamie; Weber, Torsten

    2017-12-01

    We present the latest developments to the CheckMATE program that allows models of new physics to be easily tested against the recent LHC data. To achieve this goal, the core of CheckMATE now contains over 60 LHC analyses of which 12 are from the 13 TeV run. The main new feature is that CheckMATE 2 now integrates the Monte Carlo event generation via MadGraph5_aMC@NLO and Pythia 8. This allows users to go directly from a SLHA file or UFO model to the result of whether a model is allowed or not. In addition, the integration of the event generation leads to a significant increase in the speed of the program. Many other improvements have also been made, including the possibility to now combine signal regions to give a total likelihood for a model.

  15. Coverage Metrics for Model Checking

    NASA Technical Reports Server (NTRS)

    Penix, John; Visser, Willem; Norvig, Peter (Technical Monitor)

    2001-01-01

    When using model checking to verify programs in practice, it is not usually possible to achieve complete coverage of the system. In this position paper we describe ongoing research within the Automated Software Engineering group at NASA Ames on the use of test coverage metrics to measure partial coverage and provide heuristic guidance for program model checking. We are specifically interested in applying and developing coverage metrics for concurrent programs that might be used to support certification of next generation avionics software.

  16. Review of the energy check of an electron-only linear accelerator over a 6 year period: sensitivity of the technique to energy shift.

    PubMed

    Biggs, Peter J

    2003-04-01

    The calibration and monthly QA of an electron-only linear accelerator dedicated to intra-operative radiation therapy have been reviewed. Since this machine is calibrated prior to every procedure, there is no necessity to adjust the output calibration at any time except, perhaps, when the magnetron is changed, provided the machine output is reasonably stable. This gives a unique opportunity to study the dose output of the machine per monitor unit, the variation in the timer error, the flatness and symmetry of the beam, and the energy check as a function of time. The results show that, although the dose per monitor unit varied within +/- 2%, the timer error within +/- 0.005 MU, and the asymmetry within 1-2%, none of these parameters showed any systematic change with time. On the other hand, the energy check showed a linear drift with time for 6, 9, and 12 MeV (2.1, 3.5, and 2.5%, respectively, over 5 years), while at 15 and 18 MeV the energy check was relatively constant. It is further shown, based on annual calibrations and RPC TLD checks, that the energy of each beam is constant and that the energy check is therefore an exquisitely sensitive one. The consistency of the independent checks is demonstrated.
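    Detecting the kind of slow linear drift reported here is a least-squares exercise on the QA log; a small sketch (the array contents would be the recorded readings, which are not reproduced here):

      import numpy as np

      def drift_percent_per_year(days, energy_checks):
          # Fit a straight line to periodic energy-check readings
          # (percent of baseline) recorded at `days` since baseline.
          slope, _ = np.polyfit(days, energy_checks, 1)
          return 365.0 * slope

      # A 2.1% shift accumulated over 5 years, as reported for the 6 MeV
      # beam, corresponds to a drift of about 0.42% per year.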

  17. Posterior Predictive Model Checking in Bayesian Networks

    ERIC Educational Resources Information Center

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  18. Analyzing the cost of screening selectee and non-selectee baggage.

    PubMed

    Virta, Julie L; Jacobson, Sheldon H; Kobza, John E

    2003-10-01

    Determining how to effectively operate security devices is as important to overall system performance as developing more sensitive security devices. In light of recent federal mandates for 100% screening of all checked baggage, this research studies the trade-offs between screening only selectee checked baggage and screening both selectee and non-selectee checked baggage for a single baggage screening security device deployed at an airport. This trade-off is represented using a cost model that incorporates the cost of the baggage screening security device, the volume of checked baggage processed through the device, and the outcomes that occur when the device is used. The cost model captures the cost of deploying, maintaining, and operating a single baggage screening security device over a one-year period. The study concludes that as excess baggage screening capacity is used to screen non-selectee checked bags, the expected annual cost increases, the expected annual cost per checked bag screened decreases, and the expected annual cost per expected number of threats detected in the checked bags screened increases. These results indicate that the marginal increase in security per dollar spent is significantly lower when non-selectee checked bags are screened than when only selectee checked bags are screened.
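    The trade-off in the cost model can be illustrated with two simple ratios; all names and rates below are illustrative assumptions, not figures or formulas from the study:

      def annual_cost(fixed_cost, cost_per_bag, bags_screened):
          # Deployment/maintenance cost plus operating cost scaled by volume.
          return fixed_cost + cost_per_bag * bags_screened

      def cost_per_expected_threat(total_cost, bags_screened, threat_rate, sensitivity):
          # Screening low-risk (non-selectee) bags lowers the average
          # threat_rate of the screened stream, so this ratio rises even
          # as the cost per bag screened falls.
          return total_cost / (bags_screened * threat_rate * sensitivity)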

  19. Re-engineering pre-employment check-up systems: a model for improving health services.

    PubMed

    Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin

    2011-01-01

    The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering--18.3 +/- 5.5 minutes as compared to 48.8 +/- 14.5 minutes before. Appointment delay was also significantly decreased from an average 18 to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages. Therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.

  20. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...

  1. 40 CFR 86.327-79 - Quench checks; NOX analyzer.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quench checks; NOX analyzer. (a) Perform the reaction chamber quench check for each model of high vacuum reaction chamber analyzer prior to initial use. (b) Perform the reaction chamber quench check for each new analyzer that has an ambient pressure or “soft vacuum” reaction chamber prior to initial use. Additionally...

  2. Accurate LC peak boundary detection for ¹⁶O/¹⁸O labeled LC-MS data.

    PubMed

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang S J; Zhang, Jianqiu Michelle

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements.
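
    The "model fitness check" can be illustrated with a small sketch: compare an observed isotopic intensity pattern with a theoretical one and reject peptides whose fit is poor. The patterns and threshold below are assumptions, not the authors' values:

```python
# A minimal sketch of the "model fitness check" idea: compare an observed
# isotopic intensity pattern against a theoretical one and flag peptides
# whose fit is poor (patterns and threshold here are illustrative).
import numpy as np

def fitness(observed, theoretical):
    o = np.asarray(observed, float); t = np.asarray(theoretical, float)
    o /= o.sum(); t /= t.sum()                  # compare shapes, not scale
    return np.dot(o, t) / (np.linalg.norm(o) * np.linalg.norm(t))

theoretical = [0.52, 0.30, 0.13, 0.05]          # e.g. from an isotope model
good = [510, 295, 140, 55]                      # matches the envelope
bad  = [510, 60, 400, 30]                       # erroneous mass assignment
for name, obs in [("good", good), ("bad", bad)]:
    score = fitness(obs, theoretical)
    print(name, round(score, 3), "keep" if score > 0.95 else "reject")
```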

  3. Accurate LC Peak Boundary Detection for 16 O/ 18 O Labeled LC-MS Data

    PubMed Central

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang (SJ); Zhang, Jianqiu (Michelle)

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements. PMID:24115998

  4. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    NASA Astrophysics Data System (ADS)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking and using integrated autoregressive moving average (ARIMA) time series models for forecasting. The Box-Jenkins method is appropriate for medium to long time series (at least 50 observations). When modeling such series, the difficulty lies in choosing the correct order at the model identification stage and in obtaining good parameter estimates. This paper presents the development of a Genetic Algorithm heuristic to solve the identification and estimation problems in the Box-Jenkins framework. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecasts generated by the proposed model outperformed the single traditional Box-Jenkins model.
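
    A minimal sketch of the idea, assuming AIC as the fitness function, truncation selection and single-coordinate mutation (the paper's exact GA operators are not specified in the abstract):

```python
# GA search over ARIMA orders (p, d, q); fitness = AIC, lower is better.
# Population size, mutation scheme and bounds are our assumptions.
import random
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = random.Random(0)
y = np.cumsum(np.random.RandomState(0).normal(size=200))  # stand-in series

cache = {}
def fitness(order):
    if order not in cache:
        try:
            cache[order] = ARIMA(y, order=order).fit().aic
        except Exception:
            cache[order] = float("inf")
    return cache[order]

def mutate(order):
    order = list(order)
    i = rng.randrange(3)
    order[i] = max(0, min(3, order[i] + rng.choice([-1, 1])))
    return tuple(order)

pop = [(rng.randint(0, 3), rng.randint(0, 2), rng.randint(0, 3)) for _ in range(8)]
for generation in range(10):
    pop.sort(key=fitness)
    survivors = pop[:4]                          # truncation selection
    pop = survivors + [mutate(rng.choice(survivors)) for _ in range(4)]

print("best (p, d, q):", min(pop, key=fitness))
```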

  5. Predictive modeling of addiction lapses in a mobile health application.

    PubMed

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M; Isham, Andrew J; Judkins-Fisher, Chris L; Atwood, Amy K; Gustafson, David H

    2014-01-01

    The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. Addiction-comprehensive health enhancement support system (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses on 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients' recovery progress. The model showed good predictability, with the area under receiver operating characteristic curve of 0.829 in the 10-fold cross-validation and 0.912 in the external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. © 2013.
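
    The paper's Bayesian network structure is not reproduced in the abstract; as a stand-in, the sketch below evaluates a naive Bayes classifier (the simplest Bayesian network) on synthetic weekly check-in data with 10-fold cross-validated AUC:

```python
# Synthetic stand-in for the Weekly Check-in data: binary survey answers
# predicting a lapse in the coming week. Data-generating choices are invented.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import cross_val_score

rng = np.random.RandomState(0)
n = 2934                                   # one row per weekly check-in
X = rng.binomial(1, 0.3, size=(n, 8))      # binary survey answers (synthetic)
# lapse risk loosely driven by two of the answers, for illustration only
p = 1 / (1 + np.exp(-(X[:, 0] + X[:, 1] - 2.0)))
y = rng.binomial(1, p)

auc = cross_val_score(BernoulliNB(), X, y, cv=10, scoring="roc_auc")
print(f"10-fold AUC: {auc.mean():.3f}")
```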

  6. Predictive Modeling of Addiction Lapses in a Mobile Health Application

    PubMed Central

    Chih, Ming-Yuan; Patton, Timothy; McTavish, Fiona M.; Isham, Andrew; Judkins-Fisher, Chris L.; Atwood, Amy K.; Gustafson, David H.

    2013-01-01

    The chronically relapsing nature of alcoholism leads to substantial personal, family, and societal costs. Addiction-Comprehensive Health Enhancement Support System (A-CHESS) is a smartphone application that aims to reduce relapse. To offer targeted support to patients who are at risk of lapses within the coming week, a Bayesian network model to predict such events was constructed using responses on 2,934 weekly surveys (called the Weekly Check-in) from 152 alcohol-dependent individuals who recently completed residential treatment. The Weekly Check-in is a self-monitoring service, provided in A-CHESS, to track patients’ recovery progress. The model showed good predictability, with the area under receiver operating characteristic curve of 0.829 in the 10-fold cross-validation and 0.912 in the external validation. The sensitivity/specificity table assists the tradeoff decisions necessary to apply the model in practice. This study moves us closer to the goal of providing lapse prediction so that patients might receive more targeted and timely support. PMID:24035143

  7. Additive mixed effect model for recurrent gap time data.

    PubMed

    Ding, Jieli; Sun, Liuquan

    2017-04-01

    Gap times between recurrent events are often of primary interest in medical and observational studies. The additive hazards model, focusing on risk differences rather than risk ratios, has been widely used in practice. However, the marginal additive hazards model does not take the dependence among gap times into account. In this paper, we propose an additive mixed effect model to analyze gap time data, and the proposed model includes a subject-specific random effect to account for the dependence among the gap times. Estimating equation approaches are developed for parameter estimation, and the asymptotic properties of the resulting estimators are established. In addition, some graphical and numerical procedures are presented for model checking. The finite sample behavior of the proposed methods is evaluated through simulation studies, and an application to a data set from a clinic study on chronic granulomatous disease is provided.
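
    The dependence structure the model addresses is easy to simulate: a subject-specific random effect added to a baseline hazard makes gap times from the same subject correlated. A small illustration (all values invented):

```python
# Simulate recurrent gap times whose hazard is an additive baseline plus a
# subject-specific random effect, inducing within-subject dependence.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, gaps_per_subject = 500, 4
baseline = 0.5                                         # additive baseline hazard
b = rng.gamma(shape=2.0, scale=0.25, size=n_subjects)  # random effects
hazard = baseline + b[:, None]                         # subject-level hazard
gaps = rng.exponential(1.0 / hazard, size=(n_subjects, gaps_per_subject))

# Dependence among gap times induced by the shared random effect:
print("corr(gap1, gap2):", round(np.corrcoef(gaps[:, 0], gaps[:, 1])[0, 1], 3))
```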

  8. Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking

    NASA Technical Reports Server (NTRS)

    Turgeon, Gregory; Price, Petra

    2010-01-01

    A feasibility study was performed on a representative aerospace system to determine the following: (1) the benefits and limitations of using SCADE, a commercially available tool for model checking, in comparison to a proprietary tool that was studied previously [1], and (2) metrics for performing the model checking and for assessing the findings. This study was performed independently of the development task by a group unfamiliar with the system, providing a fresh, external perspective free from development bias.

  9. 78 FR 17865 - Airworthiness Directives; PILATUS AIRCRAFT LTD. Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-25

    ... TBO were moved from Chapter 5: Time Limits/Maintenance Checks, to Chapter 4: Structural, Component and... Directives; PILATUS AIRCRAFT LTD. Airplanes AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final... all PILATUS AIRCRAFT LTD. Models PC-12, PC-12/45, and PC-12/47 airplanes. This AD results from...

  10. Sub-pixel analysis to support graphic security after scanning at low resolution

    NASA Astrophysics Data System (ADS)

    Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve

    2006-02-01

    Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds by long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check-21 act (Check Clearing for the 21st Century act) became effective in 2004 and allows images of checks to be handled by banks as usual paper checks. We use here this situation of check scanning as our primary benchmark for graphic security features after scanning. We will first present a quick review of the most common graphic security features currently found on checks, with their specific purpose, qualities and disadvantages, and we demonstrate their poor survivability after scanning in the average scanning conditions expected from the Check-21 Act. We will then present a novel method of measurement of distances between and rotations of line elements in a scanned image: Based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we will apply our method to fraud detection of documents after gray-scale scanning at 300dpi resolution. We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced by the illegitimate process.
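
    A generic sub-pixel localization sketch, using quadratic interpolation of a sampled intensity profile rather than the authors' print-model refinement:

```python
# Sub-pixel peak localization by fitting a parabola through a sample and its
# two neighbours; illustrates measuring line positions below pixel resolution.
import numpy as np

def subpixel_peak(profile, i):
    """Refine integer peak index i using its two neighbours."""
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)   # vertex of fitted parabola
    return i + delta

x = np.arange(32)
true_center = 13.37
profile = np.exp(-0.5 * ((x - true_center) / 2.0) ** 2)  # a blurred line
i = int(np.argmax(profile))
print(f"pixel estimate: {i}, sub-pixel: {subpixel_peak(profile, i):.2f}")
```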

  11. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the framework proposed, where a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of major techniques currently used in Symbolic Model Checking, and how these techniques can be tuned in order to obtain good performances when using Model Checking tools. Diagnosability is performed on large and structured models of real plants. Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results. Three test cases are briefly presented, and several parameters and techniques have been applied on those test cases in order to produce comparison tables. Furthermore, comparison between several Model Checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application. Several properties have been tested, and results have been highlighted. Finally, section 6 draws some conclusions, and outlines future lines of research.

  12. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

    Background: Recently there has been growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods for system analysis, such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results: We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility and understand its behavior, using trend labels for the transition rates of the pump reactions. Conclusions: Probabilistic model checking can be used along with other well-established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as the potassium outside the cell being exhausted in all model traces. We can also gain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
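
    The kind of question PRISM answers here (e.g. the probability that extracellular potassium is eventually depleted) reduces, for a finite Markov chain, to solving a linear system. A toy birth-death sketch with invented rates, not the paper's pump model:

```python
# Toy probabilistic-model-checking computation: for a birth-death chain on the
# count of extracellular K+ ions, compute P[reach 0 (depleted) before N] from
# each state by solving the standard linear system (rates are illustrative).
import numpy as np

N = 10                       # states 0..N; 0 and N treated as absorbing
inward, outward = 1.2, 1.0   # pump-in vs leak-out rates (invented)

# p[s] satisfies p[0] = 1, p[N] = 0 and, for 0 < s < N:
# (inward + outward) * p[s] = inward * p[s-1] + outward * p[s+1]
A = np.zeros((N + 1, N + 1)); b = np.zeros(N + 1)
A[0, 0] = 1; b[0] = 1
A[N, N] = 1; b[N] = 0
for s in range(1, N):
    A[s, s] = inward + outward
    A[s, s - 1] = -inward
    A[s, s + 1] = -outward
p = np.linalg.solve(A, b)
print("P[deplete from s=5]:", round(p[5], 4))
```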

  13. Non-equilibrium dog-flea model

    NASA Astrophysics Data System (ADS)

    Ackerson, Bruce J.

    2017-11-01

    We develop the open dog-flea model to serve as a check of proposed non-equilibrium theories of statistical mechanics. The model is developed in detail. Then it is applied to four recent models for non-equilibrium statistical mechanics. Comparison of the dog-flea solution with these different models allows checking claims and giving a concrete example of the theoretical models.
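
    For reference, the classic closed dog-flea (Ehrenfest) model takes only a few lines; the paper's open variant additionally exchanges fleas with the environment. A minimal simulation of the closed version:

```python
# Closed Ehrenfest dog-flea model: N fleas jump between two dogs, one
# uniformly chosen flea per step; relaxes to a binomial(N, 1/2) equilibrium.
import numpy as np

rng = np.random.default_rng(0)
N, steps = 100, 20000
n_on_dog_A = N                      # start far from equilibrium: all on dog A
history = []
for _ in range(steps):
    # the chosen flea sits on dog A with probability n/N
    if rng.random() < n_on_dog_A / N:
        n_on_dog_A -= 1
    else:
        n_on_dog_A += 1
    history.append(n_on_dog_A)

tail = np.array(history[5000:])     # discard the transient
print("mean:", tail.mean(), "var:", tail.var())   # ~ N/2 = 50 and ~ N/4 = 25
```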

  14. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT out-performs several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.

  15. Incremental checking of Master Data Management model based on contextual graphs

    NASA Astrophysics Data System (ADS)

    Lamolle, Myriam; Menet, Ludovic; Le Duc, Chan

    2015-10-01

    The validation of models is a crucial step in distributed heterogeneous systems. In this paper, an incremental validation method is proposed in the scope of a Model Driven Engineering (MDE) approach, which is used to develop a Master Data Management (MDM) field represented by XML Schema models. The MDE approach presented in this paper is based on the definition of an abstraction layer using UML class diagrams. The validation method aims to minimise the model errors and to optimise the process of model checking. Therefore, the notion of validation contexts is introduced, allowing the verification of data model views. Description logics specify constraints that the models have to check. An experiment with the approach is presented through an application developed in the ArgoUML IDE.

  16. Temporal effects of post-fire check dam construction on soil functionality in SE Spain.

    PubMed

    González-Romero, J; Lucas-Borja, M E; Plaza-Álvarez, P A; Sagra, J; Moya, D; De Las Heras, J

    2018-06-09

    Wildfire has historically been an alteration factor in Mediterranean basins. Despite the high resilience of Mediterranean ecosystems, wildfire accelerates erosion and degradation processes, and also affects soil functionality by altering nutrient cycles and soil structure. In semi-arid Mediterranean basins, check dams are usually built in gullies and channels after fire as a measure against soil erosion. Although check dams have proven effective in reducing erosion rates, studies on how they affect soil functionality are lacking. Our approach focuses on how soil functionality, defined as a combination of physico-chemical and biological indicators, is locally affected by check dam construction, and on the evolution of this effect over time. Soils were sampled at eight check dams in two semi-arid areas of SE Spain, which were affected by wildfire in 2012 and 2016. The study findings reveal that, by altering sediment cycling and transport, check dams influence the soil's main physico-chemical and biochemical characteristics. Significant differences were found between check dam-affected zones and control zones for many indicators, such as organic matter content, electrical conductivity and enzymatic activity. According to the ANOVA results, the interaction between check dam influence and time after fire was a crucial factor. PCA results clearly showed the check dams' influence on soil functionality. Copyright © 2018. Published by Elsevier B.V.

  17. On quality control procedures for solar radiation and meteorological measures, from subhourly to monthly average time periods

    NASA Astrophysics Data System (ADS)

    Espinar, B.; Blanc, P.; Wald, L.; Hoyer-Klick, C.; Schroedter-Homscheidt, M.; Wanderer, T.

    2012-04-01

    Meteorological data measured by ground stations are often a key element in the development and validation of methods exploiting satellite images. These data are considered as a reference against which satellite-derived estimates are compared. Long-term radiation and meteorological measurements are available from a large number of measuring stations. However, close examination of the data often reveals a lack of quality, often for extended periods of time. This lack of quality has in many cases been the reason for rejecting large amounts of available data. Data quality must be checked before use in order to guarantee the inputs for the methods used in modelling, monitoring, forecasting, etc. To control their quality, data should be submitted to several conditions or tests. After this checking, data not flagged by any of the tests are released as plausible. In this work, a bibliographical survey of quality control tests has been performed for the common meteorological variables (ambient temperature, relative humidity and wind speed) and for the usual solar radiometric variables (the horizontal global and diffuse components of solar radiation and the beam normal component). The different tests have been grouped according to the variable and the averaging period (sub-hourly, hourly, daily and monthly averages). The quality tests may be classified as follows: • Range checks: tests that verify values are within a specific range. There are two types of range checks, those based on extrema and those based on rare observations. • Step checks: tests aimed at detecting unrealistic jumps or stagnation in the time series. • Consistency checks: tests that verify the relationship between two or more time series. The gathered quality tests are applicable at all latitudes, as they have not been optimized regionally or seasonally, with the aim of being generic. They have been applied to ground measurements in several geographic locations, which resulted in the detection of some control tests that are no longer adequate, for different reasons. After the modification of some tests, based on our experience, a set of quality control tests is now presented, updated according to technology advances, and classified. The presented set of quality tests allows radiation and meteorological data to be screened in order to assess their plausibility for use as inputs in theoretical or empirical methods for scientific research. The research leading to these results has partly received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 262892 (ENDORSE project).
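
    A minimal sketch of the three families of checks, with placeholder limits (real tests use physical and solar-geometry bounds):

```python
# Range, step and consistency checks applied to an hourly series; limits and
# sample values are illustrative placeholders, not the surveyed tests.
import numpy as np

def range_check(x, lo, hi):
    return (x < lo) | (x > hi)                      # extrema-based range test

def step_check(x, max_jump, max_flat=6):
    jump = np.zeros(len(x), bool)
    jump[1:] = np.abs(np.diff(x)) > max_jump        # unrealistic jumps
    flat = np.zeros(len(x), bool)
    run = 0
    for i in range(1, len(x)):                      # stagnation detection
        run = run + 1 if x[i] == x[i - 1] else 0
        if run >= max_flat:
            flat[i] = True
    return jump | flat

def consistency_check(global_h, diffuse_h):
    return diffuse_h > global_h                     # diffuse cannot exceed global

ghi = np.array([0, 50, 120, 900, 260, 260, 260, 260, 260, 260, 260, 180])
dhi = np.array([0, 30,  80, 100, 270,  90,  90,  90,  90,  90,  90,  60])
flagged = range_check(ghi, 0, 1400) | step_check(ghi, 500) | consistency_check(ghi, dhi)
print("flagged indices:", np.where(flagged)[0])     # unflagged data is plausible
```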

  18. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model could be transformed into a formal integrated model, and the different parts of the model could also be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generation algorithm for temporal logic formulas, which could automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of the UAV flight control system.

  19. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model could be transformed into a formal integrated model, and the different parts of the model could also be verified using existing formal tools. For the real-time specifications of the software system, we also proposed a generation algorithm for temporal logic formulas, which could automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594

  20. 45 CFR 2540.204 - When must I conduct a National Service Criminal History Check on an individual in a covered...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... History Check on an individual in a covered position? 2540.204 Section 2540.204 Public Welfare Regulations... When must I conduct a National Service Criminal History Check on an individual in a covered position? (a) Timing of the National Service Criminal History Check Components. (1) You must conduct and review...

  1. A Multidimensional Item Response Model: Constrained Latent Class Analysis Using the Gibbs Sampler and Posterior Predictive Checks.

    ERIC Educational Resources Information Center

    Hoijtink, Herbert; Molenaar, Ivo W.

    1997-01-01

    This paper shows that a certain class of constrained latent class models may be interpreted as a special case of nonparametric multidimensional item response models. Parameters of this latent class model are estimated using an application of the Gibbs sampler, and model fit is investigated using posterior predictive checks. (SLD)
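
    A generic posterior predictive check sketch, using a conjugate Beta-Bernoulli model rather than the paper's constrained latent class model:

```python
# Posterior predictive check: draw parameters from the posterior (here a
# single conjugate Gibbs-style draw), simulate replicated data, and compare
# a test statistic. Model and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)
data = rng.binomial(1, 0.35, size=120)                  # observed binary responses
a_post, b_post = 1 + data.sum(), 1 + (1 - data).sum()   # Beta(1,1) prior

observed_stat = data.mean()
replicated = []
for _ in range(4000):
    theta = rng.beta(a_post, b_post)           # posterior draw
    y_rep = rng.binomial(1, theta, size=len(data))
    replicated.append(y_rep.mean())

# posterior predictive p-value: extreme values signal misfit
ppp = np.mean(np.array(replicated) >= observed_stat)
print(f"posterior predictive p-value: {ppp:.2f}")   # ~ 0.5 when the model fits
```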

  2. Road sign recognition with fuzzy adaptive pre-processing models.

    PubMed

    Lin, Chien-Chuan; Wang, Ming-Shi

    2012-01-01

    A road sign recognition system based on adaptive image pre-processing models using two fuzzy inference schemes has been proposed. The first fuzzy inference scheme checks the changes in light illumination and rich red color of a frame image within the checking areas. The other checks the variance of the vehicle's speed and the angle of the steering wheel to select an adaptive size and position for the detection area. The Adaboost classifier was employed to detect road sign candidates in an image, and the support vector machine technique was employed to recognize the content of the road sign candidates. Prohibitory and warning road traffic signs are the processing targets in this research. The detection rate in the detection phase is 97.42%. In the recognition phase, the recognition rate is 93.04%. The total accuracy rate of the system is 92.47%. For video sequences, the best accuracy rate is 90.54%, and the average accuracy rate is 80.17%. The average computing time is 51.86 milliseconds per frame. The proposed system can not only overcome problems of low illumination and rich red color around road signs but also offers high detection rates and high computing performance.

  3. Road Sign Recognition with Fuzzy Adaptive Pre-Processing Models

    PubMed Central

    Lin, Chien-Chuan; Wang, Ming-Shi

    2012-01-01

    A road sign recognition system based on adaptive image pre-processing models using two fuzzy inference schemes has been proposed. The first fuzzy inference scheme checks the changes in light illumination and rich red color of a frame image within the checking areas. The other checks the variance of the vehicle's speed and the angle of the steering wheel to select an adaptive size and position for the detection area. The Adaboost classifier was employed to detect road sign candidates in an image, and the support vector machine technique was employed to recognize the content of the road sign candidates. Prohibitory and warning road traffic signs are the processing targets in this research. The detection rate in the detection phase is 97.42%. In the recognition phase, the recognition rate is 93.04%. The total accuracy rate of the system is 92.47%. For video sequences, the best accuracy rate is 90.54%, and the average accuracy rate is 80.17%. The average computing time is 51.86 milliseconds per frame. The proposed system can not only overcome problems of low illumination and rich red color around road signs but also offers high detection rates and high computing performance. PMID:22778650

  4. Construct validity and reliability of the Single Checking Administration of Medications Scale.

    PubMed

    O'Connell, Beverly; Hawkins, Mary; Ockerby, Cherene

    2013-06-01

    Research indicates that single checking of medications is as safe as double checking; however, many nurses are averse to independently checking medications. To assist with the introduction and use of single checking, a measure of nurses' attitudes, the thirteen-item Single Checking Administration of Medications Scale (SCAMS) was developed. We examined the psychometric properties of the SCAMS. Secondary analyses were conducted on data collected from 503 nurses across a large Australian health-care service. Analyses using exploratory and confirmatory factor analyses supported by structural equation modelling resulted in a valid twelve-item SCAMS containing two reliable subscales, the nine-item Attitudes towards single checking and three-item Advantages of single checking subscales. The SCAMS is recommended as a valid and reliable measure for monitoring nurses' attitudes to single checking prior to introducing single checking medications and after its implementation. © 2013 Wiley Publishing Asia Pty Ltd.

  5. Introducing a checking technician allows pharmacists to spend more time on patient-focused activities.

    PubMed

    Napier, Patti; Norris, Pauline; Braund, Rhiannon

    2018-04-01

    Internationally there is an increasing focus on the clinical and cognitive services that pharmacists can provide. Lack of time has been identified as a barrier to pharmacists increasing their clinical activities. Within the pharmacy workplace there are many tasks that can only be performed by a pharmacist. The final accuracy check of a dispensed prescription is currently the sole responsibility of pharmacists in New Zealand. This takes up a significant amount of time during a pharmacist's work day. The introduction of a checking technician role has been suggested to allow pharmacists more time for patient-focused work. To investigate the amount of time pharmacy staff spend on specific activities and to establish whether the introduction of a checking technician into twelve pilot sites increased the amount of time that pharmacists could spend on patient-focused activities. This study utilised a self-reported work sampling technique in twelve pilot sites, selected from both the hospital and community settings. Work sampling using an electronic device was conducted at two time-points (before the implementation of a Pharmacy Accuracy Checking Technician (PACT) role and when the PACT was in place). Data were collected at 10 min intervals for a period of five days, a working week. Tasks were grouped into patient-focused, dispensing and personal activities. The introduction of the PACT into the pilot sites saw a mean increase of 19% in pharmacists' patient-focused activities and a mean 20% decrease in dispensing activities. The introduction of a checking technician role into New Zealand pharmacies demonstrated the potential to provide pharmacists with more time to spend on patient-focused activities. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be an indispensable part of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of a model checking approach. A novel method of modeling and simulating biological systems with a model checking approach is proposed, based on hybrid functional Petri nets with extension (HFPNe) as the framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to interpret on a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). More insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative one.

  7. Time series regression studies in environmental epidemiology.

    PubMed

    Bhaskaran, Krishnan; Gasparrini, Antonio; Hajat, Shakoor; Smeeth, Liam; Armstrong, Ben

    2013-08-01

    Time series regression studies have been widely used in environmental epidemiology, notably in investigating the short-term associations between exposures such as air pollution, weather variables or pollen, and health outcomes such as mortality, myocardial infarction or disease-specific hospital admissions. Typically, for both exposure and outcome, data are available at regular time intervals (e.g. daily pollution levels and daily mortality counts) and the aim is to explore short-term associations between them. In this article, we describe the general features of time series data, and we outline the analysis process, beginning with descriptive analysis, then focusing on issues in time series regression that differ from other regression methods: modelling short-term fluctuations in the presence of seasonal and long-term patterns, dealing with time varying confounding factors and modelling delayed ('lagged') associations between exposure and outcome. We finish with advice on model checking and sensitivity analysis, and some common extensions to the basic model.
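
    A skeletal version of such an analysis (synthetic data; the lag choice and Fourier seasonal terms are illustrative):

```python
# Daily mortality counts regressed on a lagged exposure, with sine/cosine
# terms adjusting for seasonality; Poisson GLM via statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
days = np.arange(3 * 365)
season = 2 * np.pi * days / 365.25
pm10 = 30 + 10 * np.sin(season) + rng.normal(0, 5, len(days))  # exposure
mu = np.exp(3 + 0.004 * pm10 + 0.2 * np.sin(season))
deaths = rng.poisson(mu)                                       # outcome counts

lag = 1                                           # exposure lagged by one day
X = np.column_stack([pm10[:-lag], np.sin(season[lag:]), np.cos(season[lag:])])
X = sm.add_constant(X)
fit = sm.GLM(deaths[lag:], X, family=sm.families.Poisson()).fit()
# percent change in daily mortality per 10-unit increase in lagged PM10
print(f"{100 * (np.exp(10 * fit.params[1]) - 1):.1f}% per 10 ug/m3")
```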

  8. Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea

    NASA Astrophysics Data System (ADS)

    Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.

    2016-12-01

    As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow, so it is important to understand the behavior of debris flow in mountainous terrain; various methods and models based on mathematical concepts are being presented and developed. The purpose of this study is to investigate regions that experienced debris flow due to typhoon Ewiniar and to perform numerical modeling for the design and layout of a check dam to reduce debris flow damage. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, a survey of the bridges downstream, and precision LiDAR 3D scanning to compose the basic data for the modeling. The numerical simulation was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. The model was applied to conditions with the check dam installed in the upstream, midstream, and downstream sections. Considering the reduction of debris flow, its expansion, and the influence on the bridges downstream, the proper location of the check dam was designated. The numerical results showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction of the debris flow was greater than when the check dam was installed in other sections. Key words: debris flow, LiDAR, check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.

  9. Model building strategy for logistic regression: purposeful selection.

    PubMed

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. The article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable has a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment for the remaining covariates. Interactions should be checked to disentangle complex relationships between covariates and their synergistic effects on the response variable. The model should be checked for goodness-of-fit (GOF), that is, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
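
    The article works in R; an analogous likelihood ratio test in Python with statsmodels (synthetic data, hypothetical covariates) looks like this:

```python
# Likelihood ratio test for dropping one covariate from a logistic model:
# LR = 2 * (loglik_full - loglik_reduced), compared to chi-square with df = 1.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 500
age = rng.normal(50, 10, n)
sex = rng.binomial(1, 0.5, n)
biomarker = rng.normal(0, 1, n)                # truly irrelevant covariate
logit_p = -4 + 0.06 * age + 0.5 * sex
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

full = sm.Logit(y, sm.add_constant(np.column_stack([age, sex, biomarker]))).fit(disp=0)
reduced = sm.Logit(y, sm.add_constant(np.column_stack([age, sex]))).fit(disp=0)

lr = 2 * (full.llf - reduced.llf)              # likelihood ratio statistic
p = chi2.sf(lr, df=1)                          # one parameter removed
print(f"LR = {lr:.2f}, p = {p:.3f}")           # large p: drop the biomarker
```

    In the purposeful selection workflow one would also compare the remaining coefficients before and after deletion to check whether the dropped variable was an important adjuster.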

  10. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating state-chart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format in the tool set.

  11. Immediate Effects of Body Checking Behaviour on Negative and Positive Emotions in Women with Eating Disorders: An Ecological Momentary Assessment Approach.

    PubMed

    Kraus, Nicole; Lindenberg, Julia; Zeeck, Almut; Kosfelder, Joachim; Vocks, Silja

    2015-09-01

    Cognitive-behavioural models of eating disorders state that body checking arises in response to negative emotions in order to reduce the aversive emotional state and is therefore negatively reinforced. This study empirically tests this assumption. For a seven-day period, women with eating disorders (n = 26) and healthy controls (n = 29) were provided with a handheld computer for assessing occurring body checking strategies as well as negative and positive emotions. Serving as control condition, randomized computer-emitted acoustic signals prompted reports on body checking and emotions. There was no difference in the intensity of negative emotions before body checking and in control situations across groups. However, from pre- to post-body checking, an increase in negative emotions was found. This effect was more pronounced in women with eating disorders compared with healthy controls. Results are contradictory to the assumptions of the cognitive-behavioural model, as body checking does not seem to reduce negative emotions. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.

  12. The timing of antenatal care initiation and the content of care in Sindh, Pakistan.

    PubMed

    Agha, Sohail; Tappis, Hannah

    2016-07-27

    Policymakers and program planners consider antenatal care (ANC) coverage to be a primary measure of improvements in maternal health. Yet, evidence from multiple countries indicates that ANC coverage has no necessary relationship with the content of services provided. This study examines the relationship between the timing of the first ANC check-up, a potential predictor of the content of services, and the provision of WHO recommended services to women during their pregnancy. The study uses data from a representative household survey of Sindh with a sample comprising of 4,000 women aged 15-49 who had had a live birth in the two years before the survey. The survey obtained information about the elements of care provided during pregnancy, the timing of the first ANC check-up, the number of ANC visits made during the last pregnancy and women's socio-economic and demographic characteristics. Bivariate analysis was conducted to examine the relationship between the proportion of women who receive six WHO recommended services and the timing of their first ANC check-up. Multivariate analysis was conducted to identify predictors of the number of elements of care provided. While most women in Sindh (87 %) receive an ANC check-up, its timing varies by parity, education and household wealth. The median time to the first ANC check-up was 3 months for women in the richest and 7 months for women in the poorest wealth quintiles. In multivariate analysis, wealth, education, parity and age at marriage were significant predictors of the number of elements of care provided. Women who received an early ANC check-up were much more likely to receive WHO recommended services than other women, independent of a range of socio-economic and demographic variables and independent of the number of ANC visits made during pregnancy. In Sindh, the timing of the first ANC check-up has an independent effect on the content of care provided to pregnant women. While it is extremely important that providers are adequately trained and motivated to provide the WHO recommended standards of care, these findings suggest that motivating women to make an early first ANC check-up may be another mechanism through which the quality of care provided may be improved. Such a focus is most likely to benefit the poorest, least educated and highest parity women. Based on these findings, we recommend that routine data collected at health facilities in Pakistan should include the month of pregnancy at the time of the first ANC check-up.

  13. Controlling state explosion during automatic verification of delay-insensitive and delay-constrained VLSI systems using the POM verifier

    NASA Technical Reports Server (NTRS)

    Probst, D.; Jensen, L.

    1991-01-01

    Delay-insensitive VLSI systems have a certain appeal on the ground due to difficulties with clocks; they are even more attractive in space. We answer the question, is it possible to control state explosion arising from various sources during automatic verification (model checking) of delay-insensitive systems? State explosion due to concurrency is handled by introducing a partial-order representation for systems, and defining system correctness as a simple relation between two partial orders on the same set of system events (a graph problem). State explosion due to nondeterminism (chiefly arbitration) is handled when the system to be verified has a clean, finite recurrence structure. Backwards branching is a further optimization. The heart of this approach is the ability, during model checking, to discover a compact finite presentation of the verified system without prior composition of system components. The fully-implemented POM verification system has polynomial space and time performance on traditional asynchronous-circuit benchmarks that are exponential in space and time for other verification systems. We also sketch the generalization of this approach to handle delay-constrained VLSI systems.

  14. Model Diagnostics for Bayesian Networks

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments primarily for learning about students' knowledge and skills. There is a lack of work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…

  15. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: A novel possible model of OCD?

    PubMed Central

    Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.

    2014-01-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. PMID:24406720

  16. Towards the Formal Verification of a Distributed Real-Time Automotive System

    NASA Technical Reports Server (NTRS)

    Endres, Erik; Mueller, Christian; Shadrin, Andrey; Tverdyshev, Sergey

    2010-01-01

    We present the status of a project which aims at building, and formally and pervasively verifying, a distributed automotive system. The target system is a gate-level model which consists of several interconnected electronic control units with independent clocks. This model is verified against the specification as seen by a system programmer. The automotive system is implemented on several FPGA boards. The pervasive verification is carried out using a combination of interactive theorem proving (Isabelle/HOL) and model checking (LTL).

  17. Antimicrobial activity of root canal irrigants against biofilm forming pathogens- An in vitro study

    PubMed Central

    Ghivari, Sheetal Basavraj; Bhattacharya, Haimanti; Bhat, Kishore G.; Pujar, Madhu A.

    2017-01-01

    Aims: The aim of the study was to check the antimicrobial activity of 5% sodium hypochlorite, 2% chlorhexidine, 0.10% octenidine (OCT), and 2% silver zeolite (SZ) at different time intervals against single-species biofilm models of Enterococcus faecalis, Staphylococcus aureus, and Candida albicans prepared on a nitrocellulose membrane. Settings and Design: An in vitro nitrocellulose biofilm model was used to check the antibacterial efficacy of root canal irrigants. Materials and Methods: The in vitro nitrocellulose biofilm model was used to check the antibacterial activity of the root canal irrigants. Single-species biofilms were suspended in a 96-well microtiter plate and treated with root canal irrigants for 1, 5, 10, 15, 30, and 60 s, respectively. The remaining microbial load, in the form of colony-forming units/ml after antimicrobial treatment, was tabulated and the data were statistically analyzed. Statistical Analysis: SPSS version 17, Kruskal–Wallis ANOVA, Mann–Whitney U-test, and Wilcoxon matched pair test (P < 0.05) were used. Results: All tested microorganisms were eliminated within 30 s by all the antimicrobial substances tested except normal saline. 2% chlorhexidine and 0.10% OCT were equally effective against C. albicans at 30 s. Conclusion: The newly tested irrigants have shown considerable antibacterial activity against the selected single-species biofilms. OCT (0.10%) can be used as an alternative endodontic irrigant. PMID:29279615

  18. The United States Army War College: Time for a Change

    DTIC Science & Technology

    2012-03-23

    to learn ” choosing operational assignments over educational ones. This and organizational malaise in the SSCs have made them an “intellectual...the block checked for their next assignment and promotion. They can skate through, meeting minimal requirements….There is very little in place to...Army at the time. The first USAWC educational model was described by General Tasker H. Bliss as “ learning by doing”, as students/staff officers work

  19. Collaborative Recurrent Neural Networks forDynamic Recommender Systems

    DTIC Science & Technology

    2016-11-22

    formulation leads to an efficient and practical method. Furthermore, we demonstrate the versatility of our model by applying it to two different tasks: music ...form (user id, location id, check-in time). The LastFM9 dataset consists of sequences of songs played by a user’s music player collected by using a...Jeffrey L Elman. Finding structure in time. Cognitive science, 14(2), 1990. Alex Graves, Abdel-rahman Mohamed, and Geoffrey Hinton. Speech recognition

  20. Power quality analysis of DC arc furnace operation using the Bowman model for electric arc

    NASA Astrophysics Data System (ADS)

    Gherman, P. L.

    2018-01-01

    This work concerns a relatively new domain. The DC electric arc is superior to the AC electric arc but is not used in Romania. This is why we analyzed the operation of these furnaces by simulation and by model checking of the simulation results. The conclusions are favorable; the work still to be carried out is to develop a real-time control system for the steel elaboration process.

  1. 75 FR 61784 - Proposed Collection; Comment Request for Review of a Revised Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-06

    ... response time of ten minutes per form reporting a missing check is estimated; the same amount of time is needed to report the missing checks or electronic funds transfer (EFT) payments using the telephone. The...

  2. An Interactive Program for the Calculation and Analysis of the Parameter Sensitivities in a Linear, Time-Invariant System.

    DTIC Science & Technology

    1981-03-01

    tifiability is imposed; and the system designer now has a tool to evaluate how well the model describes the system . The algorithm is verified by checking its...xi I. Introduction In analyzing a system , the design engineer uses a mathematical model. The model, by its very definition, represents the system . It...number of G (See Eq (23).) can 18 give the designer a good indication of just how well the model defined by Eqs (1) through (3) describes the system

  3. Radar-driven high-resolution hydro-meteorological forecasts of the 26 September 2007 Venice flash flood

    NASA Astrophysics Data System (ADS)

    Rossa, Andrea M.; Laudanna Del Guerra, Franco; Borga, Marco; Zanon, Francesco; Settin, Tommaso; Leuenberger, Daniel

    2010-11-01

    This study aims to assess the feasibility of assimilating carefully checked radar rainfall estimates into a numerical weather prediction (NWP) model to extend the forecasting lead time for an extreme flash flood. The hydro-meteorological modeling chain includes the convection-permitting NWP model COSMO-2 and a coupled hydrological-hydraulic model. Radar rainfall estimates are assimilated into the NWP model via the latent heat nudging method. The study is focused on the 26 September 2007 extreme flash flood which impacted the coastal area of North-eastern Italy around Venice. The hydro-meteorological modeling system is implemented over the 90 km2 Dese river basin draining to the Venice Lagoon. The radar rainfall observations are carefully checked for artifacts, including rain-induced signal attenuation, by means of physics-based correction procedures and comparison with a dense network of raingauges. The impact of the radar rainfall estimates in the assimilation cycle of the NWP model is very significant. The main individual organized convective systems are successfully introduced into the model state, both in terms of timing and localization. Also, high-intensity incorrectly localized precipitation is correctly reduced to about the observed levels. On the other hand, the highest rainfall intensities computed after assimilation underestimate the observed values by 20% and 50% at a scale of 20 km and 5 km, respectively. The positive impact of assimilating radar rainfall estimates is carried over into the free forecast for about 2-5 h, depending on when the forecast was started. The positive impact is larger when the main mesoscale convective system is present in the initial conditions. The improvements in the precipitation forecasts are propagated to the river flow simulations, with an extension of the forecasting lead time up to 3 h.

  4. Real-Time System Verification by Kappa-Induction

    NASA Technical Reports Server (NTRS)

    Pike, Lee S.

    2005-01-01

    We report the first formal verification of a reintegration protocol for a safety-critical, fault-tolerant, real-time distributed embedded system. A reintegration protocol increases system survivability by allowing a node that has suffered a fault to regain state consistent with the operational nodes. The protocol is verified in the Symbolic Analysis Laboratory (SAL), where bounded model checking and decision procedures are used to verify infinite-state systems by k-induction. The protocol and its environment are modeled as synchronizing timeout automata. Because k-induction is exponential with respect to k, we optimize the formal model to reduce the size of k. Also, the reintegrator's event-triggered behavior is conservatively modeled as time-triggered behavior to further reduce the size of k and to make it invariant to the number of nodes modeled. A corollary is that a clique avoidance property is satisfied.
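
    A minimal sketch of k-induction over a toy transition system, using the Z3 SMT solver rather than SAL; the system and invariant are invented for illustration:

```python
# k-induction for a toy counter system using Z3 (pip install z3-solver).
# Base case: no bad state within the first k steps from the initial state.
# Step case: k consecutive good states along a path force a good successor.
from z3 import Int, Solver, And, Not, unsat

def x(i): return Int(f"x_{i}")

def init(s): return s == 0
def trans(a, b): return b == a + 2           # one transition step
def prop(s): return s % 2 == 0               # invariant to prove: x stays even

def k_induction(k):
    base = Solver()
    base.add(init(x(0)))
    base.add(And([trans(x(i), x(i + 1)) for i in range(k)]))
    base.add(Not(And([prop(x(i)) for i in range(k + 1)])))
    if base.check() != unsat:
        return False                          # counterexample within k steps
    step = Solver()
    step.add(And([prop(x(i)) for i in range(k)]))
    step.add(And([trans(x(i), x(i + 1)) for i in range(k)]))
    step.add(Not(prop(x(k))))
    return step.check() == unsat              # property is k-inductive

print(k_induction(1))  # True: evenness is 1-inductive for this system
```

    As the abstract notes, the cost grows quickly with k, which is why the paper's model is optimized to keep k small.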

  5. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  6. State-based verification of RTCP-nets with nuXmv

    NASA Astrophysics Data System (ADS)

    Biernacka, Agnieszka; Biernacki, Jerzy; Szpyrka, Marcin

    2015-12-01

    The paper deals with an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into nuXmv state machines. The approach enables users to verify RTCP-nets with the model checking techniques provided by the nuXmv tool. Full details of the algorithm are presented and an illustrative example of the approach's usefulness is provided.
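
    A toy version of such a translation: a coverability graph given as an adjacency dict is emitted in nuXmv/NuSMV input syntax (the graph, encoding and property are invented, not the paper's algorithm):

```python
# Emit a graph as a nuXmv state machine: one enumerated variable whose
# next-state relation follows the graph edges nondeterministically.
graph = {"s0": ["s1"], "s1": ["s2", "s0"], "s2": ["s0"]}

def to_nuxmv(graph):
    states = sorted(graph)
    lines = ["MODULE main",
             f"VAR s : {{{', '.join(states)}}};",
             f"ASSIGN init(s) := {states[0]};",
             "  next(s) := case"]
    for src, succs in sorted(graph.items()):
        lines.append(f"    s = {src} : {{{', '.join(succs)}}};")
    lines += ["    TRUE : s;", "  esac;",
              "CTLSPEC AG EF s = s0"]   # e.g. the initial state is always reachable again
    return "\n".join(lines)

print(to_nuxmv(graph))   # feed the output to nuXmv for model checking
```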

  7. Computer Program Re-layers Engineering Drawings

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  8. Variable Step Integration Coupled with the Method of Characteristics Solution for Water-Hammer Analysis, A Case Study

    NASA Technical Reports Server (NTRS)

    Turpin, Jason B.

    2004-01-01

    One-dimensional water-hammer modeling involves the solution of two coupled non-linear hyperbolic partial differential equations (PDEs). These equations result from applying the principles of conservation of mass and momentum to flow through a pipe, and usually the assumption that the speed at which pressure waves propagate through the pipe is constant. In order to solve these equations for the interested quantities (i.e. pressures and flow rates), they must first be converted to a system of ordinary differential equations (ODEs) by either approximating the spatial derivative terms with numerical techniques or using the Method of Characteristics (MOC). The MOC approach is ideal in that no numerical approximation errors are introduced in converting the original system of PDEs into an equivalent system of ODEs. Unfortunately this resulting system of ODEs is bound by a time step constraint so that when integrating the equations the solution can only be obtained at fixed time intervals. If the fluid system to be modeled also contains dynamic components (i.e. components that are best modeled by a system of ODEs), it may be necessary to take extremely small time steps during certain points of the model simulation in order to achieve stability and/or accuracy in the solution. Coupled together, the fixed time step constraint invoked by the MOC, and the occasional need for extremely small time steps in order to obtain stability and/or accuracy, can greatly increase simulation run times. As one solution to this problem, a method for combining variable step integration (VSI) algorithms with the MOC was developed for modeling water-hammer in systems with highly dynamic components. A case study is presented in which reverse flow through a dual-flapper check valve introduces a water-hammer event. The predicted pressure responses upstream of the check-valve are compared with test data.
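
    A fixed-time-step MOC sketch of the water-hammer problem described (the baseline that variable step integration improves upon); pipe geometry, friction and the instantaneous-closure boundary are illustrative assumptions:

```python
# Method of Characteristics for water hammer in a single pipe: intersect the
# C+ and C- characteristic lines at each interior node, with a reservoir
# upstream and an instantly closed valve downstream. Values are illustrative.
import numpy as np

L_pipe, a, g = 600.0, 1200.0, 9.81        # pipe length (m), wave speed (m/s)
D, f, H0, Q0 = 0.5, 0.02, 150.0, 0.4      # diameter, Darcy friction, head, flow
N = 20                                    # number of reaches
A_pipe = np.pi * D ** 2 / 4
dx = L_pipe / N
dt = dx / a                               # MOC fixed-step constraint (Courant = 1)
B = a / (g * A_pipe)
R = f * dx / (2 * g * D * A_pipe ** 2)

H, Q = np.full(N + 1, H0), np.full(N + 1, Q0)
max_head = H0
for _ in range(600):                      # valve at x = L closes at t = 0
    Hn, Qn = H.copy(), Q.copy()
    for i in range(1, N):                 # interior nodes
        Cp = H[i-1] + B*Q[i-1] - R*Q[i-1]*abs(Q[i-1])   # C+ characteristic
        Cm = H[i+1] - B*Q[i+1] + R*Q[i+1]*abs(Q[i+1])   # C- characteristic
        Hn[i], Qn[i] = 0.5*(Cp + Cm), (Cp - Cm)/(2*B)
    Hn[0] = H0                                           # reservoir upstream
    Qn[0] = (H0 - H[1] + B*Q[1] - R*Q[1]*abs(Q[1])) / B
    Qn[N] = 0.0                                          # closed valve
    Hn[N] = H[N-1] + B*Q[N-1] - R*Q[N-1]*abs(Q[N-1])
    H, Q = Hn, Qn
    max_head = max(max_head, Hn[N])

print(f"peak head at valve: {max_head:.0f} m "
      f"(Joukowsky estimate: {H0 + a*Q0/(g*A_pipe):.0f} m)")
```

    Note the dt = dx/a constraint in the sketch: this is exactly the fixed time step that forces small steps everywhere, which the paper's variable-step coupling is designed to relieve when stiff components such as the dual-flapper check valve are present.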

  9. Strategy optimization for mask rule check in wafer fab

    NASA Astrophysics Data System (ADS)

    Yang, Chuen Huei; Lin, Shaina; Lin, Roger; Wang, Alice; Lee, Rachel; Deng, Erwin

    2015-07-01

    The photolithography process is getting more and more sophisticated for wafer production following Moore's law. For a wafer fab, therefore, consolidated and close cooperation with the mask house is a key to silicon wafer success. Generally speaking, however, it is not easy to preserve such a partnership, because many engineering efforts and frequent communication are indispensable. The loose connection is obvious in mask rule check (MRC). Mask houses do their own MRC at the job-deck stage, but that checking only identifies mask process limitations, including writing, etching, inspection, metrology, etc. No further checking for wafer-process-related mask data errors is performed after the data files of the whole mask are composed in the mask house. Many potential data errors remain even after post-OPC verification of the main circuits, namely the kinds of errors that only occur once the main circuits are combined with the frame and dummy patterns to form the whole reticle. Strategy optimization is therefore ongoing at UMC to evaluate MRC, especially for wafer-fab-related errors. The prerequisite is no impact on mask delivery cycle time even with this extra checking. A full-mask check based on the job deck in gds or oasis format is necessary in order to secure acceptable run time. The form of the summarized error report generated by this checking is also crucial, because a user-friendly interface shortens engineers' judgment time to release the mask for writing. This paper surveys the key factors of MRC in a wafer fab.

  10. 76 FR 12999 - Submission for OMB Review; Comment Request for Review of a Revised Information Collection: (OMB...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ...,600 are reported by telephone. A response time of ten minutes per form reporting a missing check is estimated; the same amount of time is needed to report the missing checks or electronic funds transfer (EFT...

  11. Query Language for Location-Based Services: A Model Checking Approach

    NASA Astrophysics Data System (ADS)

    Hoareau, Christian; Satoh, Ichiro

    We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, location query processing can be regarded as a model checking problem, and location queries can thus be defined as hybrid logic-based formulas. Our approach differs from existing research in that it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and discussed.
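
    The following Python sketch conveys the flavor of treating a location query as model checking over a symbolic location tree: a "somewhere below" modality is evaluated recursively over the tree. The tree, the entity tags, and the modality are illustrative assumptions rather than the paper's actual query language.

      # Evaluate a "somewhere below" location query over a symbolic location tree.
      tree = {                        # parent -> children
          "building": ["floor1", "floor2"],
          "floor1": ["room101", "room102"],
          "floor2": ["room201"],
          "room101": [], "room102": [], "room201": [],
      }
      entities = {"room102": {"printer"}, "room201": {"projector"}}

      def somewhere_below(node, tag):
          """True if tag holds at node or at some descendant (an EF-like modality)."""
          if tag in entities.get(node, set()):
              return True
          return any(somewhere_below(child, tag) for child in tree.get(node, []))

      print(somewhere_below("floor1", "printer"))   # True: a printer is in room102
      print(somewhere_below("floor2", "printer"))   # False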

  12. Abstraction Techniques for Parameterized Verification

    DTIC Science & Technology

    2006-11-01

    approach for applying model checking to unbounded systems is to extract finite state models from them using conservative abstraction techniques. ... model checking to complex pieces of code like device drivers depends on the use of abstraction methods. An abstraction method extracts a small finite ...

  13. Understanding the modeling skill shift in engineering: the impact of self-efficacy, epistemology, and metacognition

    NASA Astrophysics Data System (ADS)

    Yildirim, Tuba Pinar

    A focus of engineering education is to prepare future engineers with problem solving, design and modeling skills. In engineering education, the former two skill areas have received copious attention making their way into the ABET criteria. Modeling, a representation containing the essential structure of an event in the real world, is a fundamental function of engineering, and an important academic skill that students develop during their undergraduate education. Yet the modeling process remains under-investigated, particularly in engineering, even though there is an increasing emphasis on modeling in engineering schools (Frey 2003). Research on modeling requires a deep understanding of multiple perspectives, that of cognition, affect, and knowledge expansion. In this dissertation, the relationship between engineering modeling skills and students' cognitive backgrounds including self-efficacy, epistemic beliefs and metacognition is investigated using model-eliciting activities (MEAs). Data were collected from sophomore students at two time periods, as well as senior engineering students. The impact of each cognitive construct on change in modeling skills was measured using a growth curve model at the sophomore level, and ordinary least squares regression at the senior level. Findings of this dissertation suggest that self-efficacy, through its direct and indirect (moderation or interaction term with time) impact, influences the growth of modeling abilities of an engineering student. When sophomore and senior modeling abilities are compared, the difference can be explained by varying self-efficacy levels. Epistemology influences modeling skill development such that the more sophisticated the student beliefs are, the higher the level of modeling ability students can attain, after controlling for the effects of conceptual learning, gender and GPA. This suggests that development of modeling ability may be constrained by the naivete of one's personal epistemology. Finally, metacognition, or 'thinking about thinking', has an impact on the development of modeling strategies of students, when the impacts of four metacognitive dimensions are considered: awareness, planning, cognitive strategy and self-checking. Students who are better at self-checking show higher growth in their modeling abilities over the course of a year, compared to students who are less proficient at self-checking. The growth in modeling abilities is also moderated by the cognitive strategy and planning skills of the student. After some experience with modeling is attained, students who have enhanced skills in these two metacognitive dimensions are observed to do better in modeling. Therefore, inherent metacognitive abilities of students can positively affect the growth of modeling ability.

  14. Analysis about modeling MEC7000 excitation system of nuclear power unit

    NASA Astrophysics Data System (ADS)

    Liu, Guangshi; Sun, Zhiyuan; Dou, Qian; Liu, Mosi; Zhang, Yihui; Wang, Xiaoming

    2018-02-01

    Given the importance of accurately modeling excitation systems in the stability calculations of inland nuclear power plants, and the lack of research on modeling the MEC7000 excitation system, this paper summarizes a general method for modeling and simulating the MEC7000 excitation system. The method also solves the key issues of computing the I/O interface parameters and of converting the measured excitation-system model into a BPA simulation model. Finally, simulation modeling of the MEC7000 excitation system was completed for the first time domestically. A no-load small-disturbance check demonstrates that the proposed model and algorithm are correct and efficient.

  15. Simulation-Based Model Checking for Nondeterministic Systems and Rare Events

    DTIC Science & Technology

    2016-03-24

    year, we have investigated AO* search and Monte Carlo Tree Search algorithms to complement and enhance CMU’s SMCMDP. ... tree, so we can use it to find the probability of reachability for a property in PRISM’s Probabilistic LTL. By finding the maximum probability of ... savings, particularly when handling very large models. ... The Monte Carlo sampling process in SMCMDP can take a long time to ...

  16. Assessing neurocognitive function in psychiatric disorders: A roadmap for enhancing consensus

    PubMed Central

    Ahmari, Susanne E.; Eich, Teal; Cebenoyan, Deniz; Smith, Edward E.; Simpson, H. Blair

    2014-01-01

    It has been challenging to identify core neurocognitive deficits that are consistent across multiple studies in patients with Obsessive Compulsive Disorder (OCD). In turn, this leads to difficulty in translating findings from human studies into animal models to dissect pathophysiology. In this article, we use primary data from a working memory task in OCD patients to illustrate this issue. Working memory deficiencies have been proposed as an explanatory model for the evolution of checking compulsions in a subset of OCD patients. However, findings have been mixed due to variability in task design, examination of spatial vs. verbal working memory, and heterogeneity in patient populations. Two major questions therefore remain: first, do OCD patients have disturbances in working memory? Second, if there are working memory deficits in OCD, do they cause checking compulsions? In order to investigate these questions, we tested 19 unmedicated OCD patients and 23 matched healthy controls using a verbal working memory task that has increased difficulty/task-load compared to classic digit-span tasks. OCD patients did not significantly differ in their performance on this task compared to healthy controls, regardless of the outcome measure used (i.e., reaction time or accuracy). Exploratory analyses suggest that a subset of patients with predominant doubt/checking symptoms may have decreased memory confidence despite normal performance on trials with the highest working memory load. These results suggest that other etiologic factors for checking compulsions should be considered. In addition, they serve as a touchstone for discussion, and therefore help us to generate a roadmap for increasing consensus in the assessment of neurocognitive function in psychiatric disorders. PMID:24994503

  17. The method of a joint intraday security check system based on cloud computing

    NASA Astrophysics Data System (ADS)

    Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng

    2017-01-01

    The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center's local model and data as the functional margin. This paper introduces the design and implementation of an all-grid intraday joint security check system based on cloud computing. To reduce the effect of bad subarea data on the all-grid security check, a new power flow algorithm based on comparison and adjustment against the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.

  18. [Factors Associated with Stress Check Attendance: Possible Effect of Timing of Annual Health Examination].

    PubMed

    Ishimaru, Tomohiro; Hattori, Michihiro; Nagata, Masako; Kuwahara, Keisuke; Watanabe, Seiji; Mori, Koji

    2018-01-01

    The stress check program has been part of annual employees' health screening since 2015. Employees are recommended, but not obliged, to undergo the stress check offered. This study was designed to examine the factors associated with stress check attendance. A total of 31,156 Japanese employees who underwent an annual health examination and a stress check service at an Occupational Health Service Center in 2016 participated in this study. Data from the annual health examination and stress check service included stress check attendance, date of attendance (if applicable), gender, age, workplace industry, number of employees at the workplace, and tobacco and alcohol consumption. Data were analyzed using multiple logistic regression. The mean rate of stress check attendance was 90.8%. A higher rate of stress check attendance was associated with a shorter interval since the annual health examination, age ≥30 years, the construction and transport industries, and 50-999 employees at the workplace. A lower rate of stress check attendance was associated with the medical and welfare industry and ≥1,000 employees at the workplace. These findings provide insights into developing strategies for improving the rate of stress check attendance. In particular, stress check attendance may improve if the stress check service and annual health examination are conducted simultaneously.

  19. 76 FR 35344 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-17

    ... retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing... specified products. The MCAI states: During Landing Gear retraction/extension ground checks performed on the... airworthiness information (MCAI) states: During Landing Gear retraction/extension ground checks performed on the...

  20. The role of healthcare system in dental check-ups in 27 European countries: multilevel analysis.

    PubMed

    Kino, Shiho; Bernabé, Eduardo; Sabbah, Wael

    2017-06-01

    To examine whether public expenditure on health and the Euro Health Consumer Index (EHCI) are associated with dental check-ups in European countries. Individual data were from Eurobarometer 72.3 (2009), a cross-national survey of 27 European countries. Eligible participants were those aged 18 years and older in the 27 European countries. Dental check-ups reflected dental visits for oral examination and getting advice on oral health in the last 12 months. Individual factors included age, gender, marital status, urbanisation, education, subjective social status, and difficulty in paying bills. Public expenditure on health as a percentage of gross domestic product (GDP) and EHCI were used as contextual factors. A set of multilevel logistic regression models was used to examine the relationship between dental check-ups and each of healthcare expenditure and EHCI, adjusting for demographic factors, GDP per capita, and socioeconomic indicators. The total number included in the analysis was 23,842. Participants in countries with greater healthcare expenditure and higher EHCI scores were 1.17 times (95% CI: 1.03, 1.32) and 1.30 times (95% CI: 1.04, 1.64), respectively, significantly more likely to report dental check-ups within the past 12 months after accounting for demographic characteristics, GDP per capita, and all socioeconomic indicators. The findings suggest that greater governmental support for healthcare and better characteristics of the healthcare system are positively associated with routine dental attendance. © 2017 American Association of Public Health Dentistry.
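
    A simplified analog of the multilevel logistic analysis can be sketched in Python: regress the check-up indicator on an individual covariate plus a country-level contextual variable, with country-clustered standard errors standing in for the random country effect of a true multilevel model. The data, variable names, and effect sizes are illustrative assumptions.

      # Country-clustered logistic regression (simplified multilevel analog).
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(6)
      n_country, n_person = 27, 800
      country = np.repeat(np.arange(n_country), n_person)
      ehci = rng.normal(600, 80, n_country)            # contextual covariate
      age = rng.integers(18, 80, n_country * n_person)
      logit_p = -2.0 + 0.004 * ehci[country] + 0.01 * (age - 45)
      checkup = rng.uniform(size=country.size) < 1 / (1 + np.exp(-logit_p))
      df = pd.DataFrame({"checkup": checkup.astype(int), "age": age,
                         "ehci": ehci[country], "country": country})

      fit = smf.logit("checkup ~ age + ehci", df).fit(
          cov_type="cluster", cov_kwds={"groups": df["country"]})
      print(np.exp(fit.params))    # odds ratios, e.g. per EHCI point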

  1. Verification of a Byzantine-Fault-Tolerant Self-stabilizing Protocol for Clock Synchronization

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2008-01-01

    This paper presents the mechanical verification of a simplified model of a rapid Byzantine-fault-tolerant self-stabilizing protocol for distributed clock synchronization systems. This protocol does not rely on any assumptions about the initial state of the system except for the presence of sufficient good nodes, thus making the weakest possible assumptions and producing the strongest results. This protocol tolerates bursts of transient failures, and deterministically converges within a time bound that is a linear function of the self-stabilization period. A simplified model of the protocol is verified using the Symbolic Model Verifier (SMV). The system under study consists of 4 nodes, where at most one of the nodes is assumed to be Byzantine faulty. The model checking effort is focused on verifying correctness of the simplified model of the protocol in the presence of a permanent Byzantine fault as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period. Although model checking results of the simplified model of the protocol confirm the theoretical predictions, these results do not necessarily confirm that the protocol solves the general case of this problem. Modeling challenges of the protocol and the system are addressed. A number of abstractions are utilized in order to reduce the state space.

  2. Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.

    PubMed

    Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald

    2017-01-01

    Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.
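
    The non-visual part of this workflow, running automated plausibility checks over many time series and summarizing the flagged results per check, can be sketched in Python as below. The particular checks, thresholds, and synthetic data are illustrative assumptions; the paper's contribution is the flexible, meta-information-driven visual summarization on top of such results.

      # Run simple plausibility checks over many series; summarize failures.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(0)
      series = {f"sensor_{i}": pd.Series(rng.normal(20, 5, 1000)) for i in range(50)}
      series["sensor_3"].iloc[100:120] = np.nan   # inject a data gap
      series["sensor_7"].iloc[500:] = 9999.0      # inject a stuck, out-of-range value

      checks = {
          "missing_values": lambda s: s.isna().sum() > 0,
          "out_of_range":   lambda s: ((s < -50) | (s > 150)).any(),
          "constant_run":   lambda s: (s.diff().abs() < 1e-12).astype(float)
                                      .rolling(20).sum().max() >= 20,
      }

      summary = pd.DataFrame(
          {name: {cid: bool(chk(s)) for cid, chk in checks.items()}
           for name, s in series.items()}
      ).T
      print(summary[summary.any(axis=1)])   # only series with a flagged check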

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covington, E; Younge, K; Chen, X

    Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check, where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated, with data extracted with the PCT. These include checks for prescription, reference point, and machine scheduling errors, which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
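
    As a hedged sketch of what one such automated check can look like, the following Python compares planned values extracted from the TPS against the prescription in the TMS and returns a pass/fail entry for a checklist report. The data classes, field names, and tolerance are assumptions; the actual PCT is built on the Eclipse Scripting API.

      # One automated plan check: planned dose vs. prescribed dose.
      from dataclasses import dataclass

      @dataclass
      class PlanData:          # as extracted from the treatment planning system
          plan_id: str
          total_dose_cgy: float
          fractions: int

      @dataclass
      class PrescriptionData:  # as extracted from the treatment management system
          total_dose_cgy: float
          fractions: int

      def check_prescribed_dose(plan, rx, tol_cgy=0.5):
          """Return (passed, message) for inclusion in the checklist report."""
          issues = []
          if abs(plan.total_dose_cgy - rx.total_dose_cgy) > tol_cgy:
              issues.append(f"dose mismatch: plan {plan.total_dose_cgy} "
                            f"vs Rx {rx.total_dose_cgy} cGy")
          if plan.fractions != rx.fractions:
              issues.append(f"fraction mismatch: plan {plan.fractions} "
                            f"vs Rx {rx.fractions}")
          return (not issues, "; ".join(issues) or "OK")

      print(check_prescribed_dose(PlanData("P1", 6000, 30), PrescriptionData(6000, 30)))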

  4. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: a novel possible model of OCD.

    PubMed

    Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W

    2014-05-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
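
    The workflow the abstract describes, fitting a low-order time-series model and diagnostically checking the residuals, can be sketched in Python as follows. The synthetic AR(2) data stand in for tracking records; the study itself fit input-output operator models, which this sketch does not reproduce.

      # Fit a 2nd-order time-series model and check residual whiteness.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from statsmodels.stats.diagnostic import acorr_ljungbox

      rng = np.random.default_rng(1)
      n = 500
      y = np.zeros(n)
      for t in range(2, n):            # simulate a second-order response
          y[t] = 1.2 * y[t-1] - 0.5 * y[t-2] + rng.normal(scale=0.1)

      fit = ARIMA(y, order=(2, 0, 0)).fit()
      print(fit.params)                            # estimated coefficients
      print(acorr_ljungbox(fit.resid, lags=[10]))  # whiteness (diagnostic) check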

  6. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
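
    The cumulative-residual idea can be conveyed in a simplified, uncensored setting: sum the residuals of a fitted (here deliberately misspecified) linear model over the ordered covariate, and compare the observed sup-statistic with realizations obtained by perturbing the residuals with standard normal multipliers. The censored-data version in the paper integrates Kaplan-Meier estimators and accounts for parameter estimation, both of which this Python sketch omits.

      # Simplified cumulative-residual check for functional form (uncensored).
      import numpy as np

      rng = np.random.default_rng(2)
      n = 200
      x = rng.uniform(0, 2, n)
      y = 1.0 + 0.5 * x**2 + rng.normal(scale=0.3, size=n)  # true effect: quadratic

      X = np.column_stack([np.ones(n), x])        # fit a linear (misspecified) model
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      resid = y - X @ beta

      order = np.argsort(x)
      observed = np.abs(np.cumsum(resid[order])).max() / np.sqrt(n)

      null = [np.abs(np.cumsum(resid[order] * rng.standard_normal(n))).max() / np.sqrt(n)
              for _ in range(1000)]
      pval = np.mean([s >= observed for s in null])
      print(f"sup-statistic = {observed:.3f}, p ~ {pval:.3f}")  # small p: misspecified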

  7. Occurrence analysis of daily rainfalls through non-homogeneous Poissonian processes

    NASA Astrophysics Data System (ADS)

    Sirangelo, B.; Ferrari, E.; de Luca, D. L.

    2011-06-01

    A stochastic model based on a non-homogeneous Poisson process, characterised by a time-dependent intensity of rainfall occurrence, is employed to explain seasonal effects in daily rainfalls exceeding prefixed threshold values. The data modelling was performed by partitioning the observed daily rainfall data into a calibration period for parameter estimation and a validation period for checking for changes in the occurrence process. The model was applied to a set of rain gauges located in different geographical areas of Southern Italy. The results show a good fit of the time-varying intensity of the rainfall occurrence process by a 2-harmonic Fourier law, and no statistically significant evidence of changes in the validation period for different threshold values.
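
    Such a process can be simulated by Lewis-Shedler thinning, as in the following Python sketch; the 2-harmonic Fourier intensity and its parameter values are illustrative assumptions, not the paper's estimates.

      # Simulate rainfall occurrences from a non-homogeneous Poisson process.
      import numpy as np

      rng = np.random.default_rng(3)
      T = 365.0

      def intensity(t):
          """Occurrences per day, with annual and semi-annual harmonics."""
          w = 2 * np.pi / 365.0
          return 0.15 + 0.08 * np.cos(w * t) + 0.03 * np.cos(2 * w * t - 1.0)

      lam_max = 0.15 + 0.08 + 0.03       # upper bound on the intensity
      events, t = [], 0.0
      while True:
          t += rng.exponential(1.0 / lam_max)          # candidate from homogeneous process
          if t > T:
              break
          if rng.uniform() < intensity(t) / lam_max:   # accept w.p. lambda(t)/lam_max
              events.append(t)
      print(len(events), "occurrences in one year")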

  8. Ultrasound use during cardiopulmonary resuscitation is associated with delays in chest compressions.

    PubMed

    Huis In 't Veld, Maite A; Allison, Michael G; Bostick, David S; Fisher, Kiondra R; Goloubeva, Olga G; Witting, Michael D; Winters, Michael E

    2017-10-01

    High-quality chest compressions are a critical component of the resuscitation of patients in cardiopulmonary arrest. Point-of-care ultrasound (POCUS) is used frequently during emergency department (ED) resuscitations, but there has been limited research assessing its benefits and harms during the delivery of cardiopulmonary resuscitation (CPR). We hypothesized that use of POCUS during cardiac arrest resuscitation adversely affects high-quality CPR by lengthening the duration of pulse checks beyond the current cardiopulmonary resuscitation guidelines recommendation of 10s. We conducted a prospective cohort study of adults in cardiac arrest treated in an urban ED between August 2015 and September 2016. Resuscitations were recorded using video equipment in designated resuscitation rooms, and the use of POCUS was documented and timed. A linear mixed-effects model was used to estimate the effect of POCUS on pulse check duration. Twenty-three patients were enrolled in our study. The mean duration of pulse checks with POCUS was 21.0s (95% CI, 18-24) compared with 13.0s (95% CI, 12-15) for those without POCUS. POCUS increased the duration of pulse checks and CPR interruption by 8.4s (95% CI, 6.7-10.0 [p<0.0001]). Age, body mass index (BMI), and procedures did not significantly affect the duration of pulse checks. The use of POCUS during cardiac arrest resuscitation was associated with significantly increased duration of pulse checks, nearly doubling the 10-s maximum duration recommended in current guidelines. It is important for acute care providers to pay close attention to the duration of interruptions in the delivery of chest compressions when using POCUS during cardiac arrest resuscitation. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheu, R; Ghafar, R; Powers, A

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar "real" events with our in-house software creates a safety net, as its propagation does not rely on individual user input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notification of treatment record inconsistencies and machine overrides has decreased the time between occurrence and corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and nearly error-free working environment.
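
    A minimal Python sketch of the dashboard's data path is given below: query the treatment database directly over ODBC and flag patients who have started treatment but have no initial physics check recorded. The DSN, table, and column names are hypothetical placeholders, not the actual Mosaiq schema, so this shows only the shape of the solution.

      # Flag patients needing an initial chart check, straight from the database.
      import pyodbc

      conn = pyodbc.connect("DSN=tms_db;UID=reader;PWD=...")  # read-only credentials
      cur = conn.cursor()
      cur.execute("""
          SELECT p.patient_id, MIN(t.tx_datetime) AS first_tx
          FROM treatments t
          JOIN patients p ON p.patient_id = t.patient_id
          LEFT JOIN physics_checks c
                 ON c.patient_id = p.patient_id AND c.check_type = 'INITIAL'
          WHERE c.patient_id IS NULL
          GROUP BY p.patient_id
      """)
      for patient_id, first_tx in cur.fetchall():
          print(f"needs initial chart check: {patient_id} (first treatment {first_tx})")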

  10. Characterization of Neutropenia in Advanced Cancer Patients Following Palbociclib Treatment Using a Population Pharmacokinetic-Pharmacodynamic Modeling and Simulation Approach.

    PubMed

    Sun, Wan; O'Dwyer, Peter J; Finn, Richard S; Ruiz-Garcia, Ana; Shapiro, Geoffrey I; Schwartz, Gary K; DeMichele, Angela; Wang, Diane

    2017-09-01

    Neutropenia is the most commonly reported hematologic toxicity following treatment with palbociclib, a cyclin-dependent kinase 4/6 inhibitor approved for metastatic breast cancer. Using data from 185 advanced cancer patients receiving palbociclib in 3 clinical trials, a pharmacokinetic-pharmacodynamic model was developed to describe the time course of absolute neutrophil count (ANC) and quantify the exposure-response relationship for neutropenia. These analyses help in understanding neutropenia associated with palbociclib and its comparison with chemotherapy-induced neutropenia. In the model, palbociclib plasma concentration was related to its antiproliferative effect on precursor cells through drug-related parameters (ie, maximum estimated drug effect and concentration corresponding to 50% of the maximum effect), and neutrophil physiology was mimicked through system-related parameters (ie, mean transit time, baseline ANC, and feedback parameter). Sex and baseline albumin level were significant covariates for baseline ANC. It was demonstrated by different model evaluation approaches (eg, prediction-corrected visual predictive check and standardized visual predictive check) that the final model adequately described longitudinal ANC with good predictive capability. The established model suggested that higher palbociclib exposure was associated with lower longitudinal neutrophil counts. The ANC nadir was reached approximately 21 days after palbociclib treatment initiation. Consistent with their mechanisms of action, neutropenia associated with palbociclib (cytostatic) was rapidly reversible and noncumulative, with a notably weaker antiproliferative effect on precursor cells relative to chemotherapies (cytotoxic). This pharmacokinetic-pharmacodynamic model aids in predicting neutropenia and optimizing dosing for future palbociclib trials with different dosing regimen combinations. © 2017, The American College of Clinical Pharmacology.
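
    Semi-mechanistic myelosuppression models of the kind described are commonly built from a proliferating pool, a chain of transit compartments governed by the mean transit time, a circulating compartment, and a feedback term (Circ0/Circ)^gamma. The Python sketch below integrates such a system; all parameter values and the exposure profile are illustrative assumptions, not the published palbociclib estimates.

      # Transit-compartment ("Friberg-type") neutropenia model, illustrative values.
      import numpy as np
      from scipy.integrate import solve_ivp

      MTT, circ0, gamma = 120.0, 5.0, 0.2   # transit time [h], baseline ANC, feedback
      ktr = 4.0 / MTT                       # 3 transit compartments -> 4 rate constants
      emax, ec50 = 0.6, 50.0                # drug-effect parameters (assumed)

      def conc(t):
          """Crude 3-weeks-on / 1-week-off exposure profile."""
          return 100.0 if t < 21 * 24 else 0.0

      def rhs(t, y):
          prol, t1, t2, t3, circ = y
          edrug = emax * conc(t) / (ec50 + conc(t))
          feedback = (circ0 / max(circ, 1e-9)) ** gamma
          return [ktr * prol * ((1 - edrug) * feedback - 1),
                  ktr * (prol - t1), ktr * (t1 - t2), ktr * (t2 - t3),
                  ktr * (t3 - circ)]

      sol = solve_ivp(rhs, (0, 28 * 24), [circ0] * 5, max_step=1.0)
      i = sol.y[4].argmin()
      print(f"ANC nadir {sol.y[4][i]:.2f} at day {sol.t[i] / 24:.1f}")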

  11. Military Suicide Research Consortium

    DTIC Science & Technology

    2014-10-01

    increasing and decreasing (or even ceasing entirely) across different periods of time but still building on itself with each progressive episode... community from suicide. One study found that social norms, high levels of support, identification with role models, and high self-esteem help protect... in follow-up. Conducted quality control checks of clinical data. Monitored safety, adverse events for DSMB reporting. Initiated Database ...

  12. Factors affecting medication-order processing time.

    PubMed

    Beaman, M A; Kotzan, J A

    1982-11-01

    The factors affecting medication-order processing time at one hospital were studied. The order processing time was determined by directly observing the time to process randomly selected new drug orders on all three work shifts during two one-week periods. An order could list more than one drug for an individual patient. The observer recorded the nature, location, and cost of the drugs ordered, as well as the time to process the order. The time and type of interruptions also were noted. The time to process a drug order was classified as six dependent variables: (1) total time, (2) work time, (3) check time, (4) waiting time I--time from arrival on the dumbwaiter until work was initiated, (5) waiting time II--time between completion of the work and initiation of checking, and (6) waiting time III--time after the check was completed until the order left on the dumbwaiter. The significant predictors of each of the six dependent variables were determined using stepwise multiple regression. The total time to process a prescription order was 58.33 +/- 48.72 minutes; the urgency status of the order was the only significant determinant of total time. Urgency status also significantly predicted the three waiting-time variables. Interruptions and the number of drugs on the order were significant determinants of work time and check time. Each telephone interruption increased the work time by 1.72 minutes. While the results of this study cannot be generalized to other institutions, pharmacy managers can use the method of determining factors that affect medication-order processing time to identify problem areas in their institutions.

  13. Uptake and impact of regulated pharmacy technicians in Ontario community pharmacies.

    PubMed

    Grootendorst, Paul; Shim, Minsup; Tieu, Jimmy

    2018-01-01

    Since 2010, most provincial Colleges of Pharmacists have licensed pharmacy technicians. The colleges hoped this would give pharmacists time to provide "expanded scope" activities such as medication reviews. Little is known, however, about the uptake and impact of pharmacy technicians on pharmacists' provision of such services. We address these questions using data for Ontario community pharmacies. Data on pharmacists and pharmacy technicians were obtained from the Ontario College of Pharmacists website in September 2016. Their place of employment was used to calculate the number of full-time equivalent (FTE) pharmacists and technicians employed at each community pharmacy. Pharmacy claims data for the 12-month period ending March 31, 2016, were obtained from the Ontario Public Drug Programs (OPDP). These data included number of MedsChecks performed, type of MedsCheck, and number of prescriptions dispensed to OPDP beneficiaries. Pharmacy technicians were employed in 24% of the pharmacies in our sample. Technician employment rates were highest in Central Fill pharmacies and pharmacies serving long-term care facilities. In general, pharmacies employing 1 or fewer technician full-time equivalents (FTEs) had a slightly higher probability of providing MedsChecks and, among those that did provide MedsCheck Annuals, provided more of them. Pharmacies that hired 3 or more technician FTEs were markedly less likely to provide MedsChecks. Pharmacies differ in their employment of technicians and in the apparent impact of technicians on the provision of MedsChecks. However, these represent associations. Additional research is needed to assess the causal effect of technician employment on the provision of MedsChecks.

  14. Uptake and impact of regulated pharmacy technicians in Ontario community pharmacies

    PubMed Central

    Grootendorst, Paul; Shim, Minsup

    2018-01-01

    Background: Since 2010, most provincial Colleges of Pharmacists have licensed pharmacy technicians. The colleges hoped this would give pharmacists time to provide “expanded scope” activities such as medication reviews. Little is known, however, about the uptake and impact of pharmacy technicians on pharmacists’ provision of such services. We address these questions using data for Ontario community pharmacies. Methods: Data on pharmacists and pharmacy technicians were obtained from the Ontario College of Pharmacists website in September 2016. Their place of employment was used to calculate the number of full-time equivalent (FTE) pharmacists and technicians employed at each community pharmacy. Pharmacy claims data for the 12-month period ending March 31, 2016, were obtained from the Ontario Public Drug Programs (OPDP). These data included number of MedsChecks performed, type of MedsCheck, and number of prescriptions dispensed to OPDP beneficiaries. Results: Pharmacy technicians were employed in 24% of the pharmacies in our sample. Technician employment rates were highest in Central Fill pharmacies and pharmacies serving long-term care facilities. In general, pharmacies employing 1 or fewer technician full-time equivalents (FTEs) had a slightly higher probability of providing MedsChecks and, among those that did provide MedsCheck Annuals, provided more of them. Pharmacies that hired 3 or more technician FTEs were markedly less likely to provide MedsChecks. Conclusions: Pharmacies differ in their employment of technicians and in the apparent impact of technicians on the provision of MedsChecks. However, these represent associations. Additional research is needed to assess the causal effect of technician employment on the provision of MedsChecks. PMID:29796133

  15. An experimental method to verify soil conservation by check dams on the Loess Plateau, China.

    PubMed

    Xu, X Z; Zhang, H W; Wang, G Q; Chen, S C; Dang, W Q

    2009-12-01

    A successful experiment with a physical model requires the necessary conditions of similarity. This study presents an experimental method with a semi-scale physical model, used to monitor and verify soil conservation by check dams in a small watershed on the Loess Plateau of China. During the experiments, the model-prototype ratio of the geomorphic variables was kept constant under each rainfall event. Consequently, the experimental data are available for verifying soil erosion processes in the field and for predicting soil loss in a model watershed with check dams, so the amount of soil loss in a catchment can be predicted. The study also sets out four criteria: similarity of watershed geometry, of grain size and bare land, of the Froude number (Fr) for the rainfall event, and of soil erosion in the downscaled models. The efficacy of the proposed method was confirmed using these criteria in two different downscaled model experiments. The B-Model, a large-scale model, simulates the watershed prototype. The two small-scale models, D(a) and D(b), have different erosion rates but are the same size; they simulate the hydraulic processes in the B-Model. The experimental results show that when the soil loss in the small-scale models was converted by multiplying by the soil-loss scale number, it was very close to that of the B-Model. Evidently, with a semi-scale physical model, experiments can verify and predict soil loss in a small watershed with a check dam system on the Loess Plateau, China.

  16. 75 FR 66655 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-29

    ... December 3, 2010 (the effective date of this AD), check the airplane maintenance records to determine if... of the airplane. Do this check following paragraph 3.A. of Pilatus Aircraft Ltd. PC-7 Service... maintenance records check required in paragraph (f)(1) of this AD or it is unclear whether or not the left and...

  17. Detecting and modelling delayed density-dependence in abundance time series of a small mammal (Didelphis aurita)

    NASA Astrophysics Data System (ADS)

    Brigatti, E.; Vieira, M. V.; Kajin, M.; Almeida, P. J. A. L.; de Menezes, M. A.; Cerqueira, R.

    2016-02-01

    We study the population size time series of a Neotropical small mammal with the intent of detecting and modelling population regulation processes generated by density-dependent factors and their possible delayed effects. The application of analysis tools based on principles of statistical generality is nowadays common practice for describing these phenomena, but in general such tools are better at producing a clear diagnosis than at providing valuable models. For this reason, in our approach we detect the principal temporal structures on the basis of different correlation measures, and from these results we build an ad hoc minimalist autoregressive model that incorporates the main drivers of the dynamics. Surprisingly, our model reproduces the time patterns of the empirical series very well and, for the first time, clearly outlines the importance of the time to attain sexual maturity as a central temporal scale for the dynamics of this species. An important advantage of this analysis scheme is that all the model parameters are directly biologically interpretable and potentially measurable, allowing a consistency check between model outputs and independent measurements.

  18. Multilevel modeling and panel data analysis in educational research (Case study: National examination data senior high school in West Java)

    NASA Astrophysics Data System (ADS)

    Zulvia, Pepi; Kurnia, Anang; Soleh, Agus M.

    2017-03-01

    Individuals and their environment form a hierarchical structure consisting of units grouped at different levels. Hierarchical data structures are analyzed across these levels, with the lowest level nested in the highest level; such modeling is commonly called multilevel modeling. Multilevel modeling is widely used in education research, for example for the average score of the National Examination (UN). In Indonesia, the UN for senior high school students is divided into natural science and social science tracks. The purpose of this research is to develop multilevel and panel data modeling using linear mixed models on educational data. The first step is data exploration and identification of relationships between the independent and dependent variables by checking correlation coefficients and variance inflation factors (VIF). We then use a simple model in which the highest level of the hierarchy (level 2) is the regency/city, while the school is the lowest level (level 1). The best model was determined by comparing goodness-of-fit and checking assumptions from residual plots and predictions for each model. Our finding is that, for both natural science and social science, the regression with random effects of regency/city and fixed effects of time, i.e., the multilevel model, performs better than the linear mixed model in explaining the variability of the dependent variable, which is the average score of the UN.
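
    A two-level structure like the one described, schools (level 1) nested in regencies/cities (level 2), can be fit as a linear mixed model with a random intercept per regency after a VIF check on the predictors, as in the following Python sketch. The synthetic data, variable names, and effect sizes are illustrative assumptions.

      # Random-intercept multilevel model: schools nested in regencies/cities.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.outliers_influence import variance_inflation_factor

      rng = np.random.default_rng(4)
      n_regency, n_school = 20, 30
      regency = np.repeat(np.arange(n_regency), n_school)
      regency_effect = rng.normal(0, 3, n_regency)[regency]
      accreditation = rng.normal(0, 1, n_regency * n_school)
      teacher_ratio = rng.normal(0, 1, n_regency * n_school)
      un_score = (70 + 2.0 * accreditation + 1.0 * teacher_ratio
                  + regency_effect + rng.normal(0, 2, n_regency * n_school))
      df = pd.DataFrame({"un_score": un_score, "accreditation": accreditation,
                         "teacher_ratio": teacher_ratio, "regency": regency})

      X = df[["accreditation", "teacher_ratio"]].assign(const=1.0).values
      print([variance_inflation_factor(X, i) for i in range(2)])  # collinearity check

      fit = smf.mixedlm("un_score ~ accreditation + teacher_ratio",
                        df, groups=df["regency"]).fit()
      print(fit.summary())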

  19. Estimation of trends

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The application of statistical methods to recorded ozone measurements is discussed. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis serves as a check and balance. Time series analysis separates variations into systematic and random parts, so that errors are uncorrelated and significant phase-lag dependencies are identified. The use of time series modeling to enhance the capability of detecting trends is discussed.

  20. "I share, therefore I am": personality traits, life satisfaction, and Facebook check-ins.

    PubMed

    Wang, Shaojung Sharon

    2013-12-01

    This study explored whether agreeableness, extraversion, and openness function to influence self-disclosure behavior, which in turn impacts the intensity of checking in on Facebook. A complete path from extraversion to Facebook check-in through self-disclosure and sharing was found. The indirect effect from sharing to check-in intensity through life satisfaction was particularly salient. The central component of check-in is for users to disclose a specific location selectively that has implications on demonstrating their social lives, lifestyles, and tastes, enabling a selective and optimized self-image. Implications on the hyperpersonal model and warranting principle are discussed.

  1. Radar-driven High-resolution Hydrometeorological Forecasts of the 26 September 2007 Venice flash flood

    NASA Astrophysics Data System (ADS)

    Massimo Rossa, Andrea; Laudanna Del Guerra, Franco; Borga, Marco; Zanon, Francesco; Settin, Tommaso; Leuenberger, Daniel

    2010-05-01

    Space and time scales of flash floods are such that flash flood forecasting and warning systems depend upon the accurate real-time provision of rainfall information, high-resolution numerical weather prediction (NWP) forecasts, and the use of hydrological models. Currently available high-resolution NWP models can potentially provide warning forecasters with information on the future evolution of storms and their internal structure, thereby increasing convective-scale warning lead times. However, it is essential that the model be started with a very accurate representation of on-going convection, which calls for assimilation of high-resolution rainfall data. This study aims to assess the feasibility of using carefully checked radar-derived quantitative precipitation estimates (QPE) for assimilation into NWP and hydrological models. The hydrometeorological modeling chain includes the convection-permitting NWP model COSMO-2 and a hydrologic-hydraulic model built upon the concept of geomorphological transport. Radar rainfall observations are assimilated into the NWP model via the latent heat nudging method. The study is focused on the 26 September 2007 extreme flash flood event which impacted the coastal area of north-eastern Italy around Venice. The hydro-meteorological modeling system is implemented over the Dese river, a 90 km2 catchment flowing into the Venice lagoon. The radar rainfall observations are carefully checked for artifacts, including beam attenuation, by means of physics-based correction procedures and comparison with a dense network of raingauges. The impact of the radar QPE in the assimilation cycle of the NWP model is very significant, in that the main individual organized convective systems were successfully introduced into the model state, both in terms of timing and localization. Also, incorrectly localized precipitation in the model reference run without rainfall assimilation was correctly reduced to about the observed levels. On the other hand, the highest rainfall intensities were underestimated by 20% at a scale of 1000 km2, and the local peaks by 50%. The positive impact of the assimilated radar rainfall was carried over into the free forecast for about 2-5 hours, depending on when this forecast was started, and was larger when the main mesoscale convective system was present in the initial conditions. The improvements in the meteorological model simulations propagated directly to the river flow simulations, extending the warning lead time by up to three hours.

  2. Concrete Model Checking with Abstract Matching and Refinement

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Pelanek, Radek; Visser, Willem

    2005-01-01

    We propose an abstraction-based model checking method which relies on refinement of an under-approximation of the feasible behaviors of the system under analysis. The method preserves errors to safety properties, since all analyzed behaviors are feasible by definition. The method does not require an abstract transition relation to be generated, but instead executes the concrete transitions while storing abstract versions of the concrete states, as specified by a set of abstraction predicates. For each explored transition, the method checks, with the help of a theorem prover, whether there is any loss of precision introduced by abstraction. The results of these checks are used to decide termination or to refine the abstraction, by generating new abstraction predicates. If the (possibly infinite) concrete system under analysis has a finite bisimulation quotient, then the method is guaranteed to eventually explore an equivalent finite bisimilar structure. We illustrate the application of the approach for checking concurrent programs. We also show how a lightweight variant can be used for efficient software testing.
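
    The core loop, executing concrete transitions while matching visited states on their abstract versions under a set of predicates, can be sketched in Python as follows. The toy transition relation, the predicates, and the error condition are illustrative assumptions; the real method additionally consults a theorem prover to detect precision loss and to refine the predicates.

      # Concrete search with abstract matching (an under-approximation).
      from collections import deque

      def successors(state):               # a toy concrete transition relation
          x, y = state
          return [((x + 1) % 8, y), (x, (y + 2) % 8)]

      def abstraction(state, predicates):
          return tuple(p(state) for p in predicates)  # predicate-valuation vector

      def search(init, predicates, error):
          seen = {abstraction(init, predicates)}
          frontier = deque([init])
          while frontier:
              s = frontier.popleft()
              if error(s):
                  return s               # any error found is feasible, hence real
              for t in successors(s):
                  a = abstraction(t, predicates)
                  if a not in seen:      # match (and store) abstract states only
                      seen.add(a)
                      frontier.append(t)
          return None                    # no error in the explored under-approximation

      preds = [lambda s: s[0] > 3, lambda s: s[1] > 3]
      print(search((0, 0), preds, error=lambda s: s == (7, 6)))

    With only these two coarse predicates the search visits at most four abstract classes, so it can return None even though the error state is concretely reachable; in the full method that outcome triggers refinement with new abstraction predicates.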

  3. Discussion on the installation checking method of precast composite floor slab with lattice girders

    NASA Astrophysics Data System (ADS)

    Chen, Li; Jin, Xing; Wang, Yahui; Zhou, Hele; Gu, Jianing

    2018-03-01

    Based on the installation checking requirements of China’s current standards and of international norms for prefabricated structural precast components, an installation checking method for precast composite floor slabs with lattice girders is proposed. Taking as the checking object an equivalent composite beam consisting of a single lattice girder and the precast concrete slab, the method checks the compression instability stress of the upper chords and the yield stress of the slab distribution reinforcement at the maximum positive moment, and the tensile yield stress of the upper chords, the normal compression stress of the slab section, and the shear instability stress of the diagonal bars at the maximum negative moment. At the same time, the bending stress and deflection of the support beams, the strength and compression stability bearing capacity of the vertical supports, the shear bearing capacity of the bolts, and the compression bearing capacity of the steel tube wall at the bolts are checked. Each checking object is given a specific load value and load combination. An application of the installation checking method is given and verified by example.

  4. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. JPL's Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  5. Logic Model Checking of Unintended Acceleration Claims in the 2005 Toyota Camry Electronic Throttle Control System

    NASA Technical Reports Server (NTRS)

    Gamble, Ed; Holzmann, Gerard

    2011-01-01

    Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  6. Factor Structure and Scale Reliabilities of the Adjective Check List Across Time

    ERIC Educational Resources Information Center

    Miller, Stephen H.; And Others

    1978-01-01

    Investigated factor structure and scale reliabilities of Gough's Adjective Check List (ACL) and their stability over time. Employees in a community mental health center completed the ACL twice, separated by a one-year interval. After each administration, separate factor analyses were computed. All scales had highly significant test-retest…

  7. [Examination of safety improvement by failure record analysis that uses reliability engineering].

    PubMed

    Kato, Kyoichi; Sato, Hisaya; Abe, Yoshihisa; Ishimori, Yoshiyuki; Hirano, Hiroshi; Higashimura, Kyoji; Amauchi, Hiroshi; Yanakita, Takashi; Kikuchi, Kei; Nakazawa, Yasuo

    2010-08-20

    This study verified how maintenance checks of medical systems, including start-of-work and end-of-work checks, contribute to preventive maintenance and safety improvement. Data on device failures in multiple facilities were collected, and the trouble repair records were analyzed with techniques from reliability engineering. Data on the systems used in eight hospitals (8 general systems, 6 Angio systems, 11 CT systems, 8 MRI systems, 8 RI systems, and 9 radiation therapy systems) were analyzed. The data collection period was the nine months from April to December 2008. The analyzed items included: (1) mean time between failures (MTBF); (2) mean time to repair (MTTR); (3) mean down time (MDT); (4) number of failures found by the morning check; and (5) failure occurrence time by modality. By introducing reliability engineering, the classification of breakdowns per device, their incidence, and their tendencies could be understood. Analysis, evaluation, and feedback on the failure history are useful to keep downtime to a minimum and to ensure safety.
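
    The reliability quantities named in the abstract can be computed directly from a repair log, as the following Python sketch shows; the record layout and timestamps are illustrative assumptions.

      # MTTR and MTBF from a per-device repair log.
      from datetime import datetime, timedelta

      # (failure detected, repair completed) per event, in time order
      log = [(datetime(2008, 4, 3, 9, 0), datetime(2008, 4, 3, 11, 30)),
             (datetime(2008, 6, 17, 14, 0), datetime(2008, 6, 18, 8, 0)),
             (datetime(2008, 9, 2, 7, 45), datetime(2008, 9, 2, 9, 0))]

      # MTTR: mean time from failure detection to completed repair
      mttr = sum(((up - down) for down, up in log), timedelta()) / len(log)

      # MTBF: mean operating time between the end of one repair and the next failure
      between = [log[i + 1][0] - log[i][1] for i in range(len(log) - 1)]
      mtbf = sum(between, timedelta()) / len(between)

      print(f"MTTR = {mttr}, MTBF = {mtbf}")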

  8. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility has been developed at NASA Lewis to allow integrated propulsion-control and flight-control algorithm development and evaluation in real time. As a preliminary check of the simulator facility and the correct integration of its components, the control design and physics models for an STOVL fighter aircraft model have been demonstrated, with their associated system integration and architecture, pilot vehicle interfaces, and display symbology. The results show that this fixed-based flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated systems and testing of control design methodologies and cockpit mechanizations.

  9. Model selection and assessment for multi­-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  10. 78 FR 40063 - Airworthiness Directives; Erickson Air-Crane Incorporated Helicopters (Type Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-03

    ... Sikorsky Model S-64E helicopters. The AD requires repetitive checks of the Blade Inspection Method (BIM... and check procedures for BIM blades installed on the Model S-64F helicopters. Several blade spars with a crack emanating from corrosion pits and other damage have been found because of BIM pressure...

  11. Slicing AADL Specifications for Model Checking

    NASA Technical Reports Server (NTRS)

    Odenbrett, Maximilian; Nguyen, Viet Yen; Noll, Thomas

    2010-01-01

    To combat the state-space explosion problem in model checking larger systems, abstraction techniques can be employed. Here, methods that operate on the system specification before constructing its state space are preferable to those that try to minimize the resulting transition system as they generally reduce peak memory requirements. We sketch a slicing algorithm for system specifications written in (a variant of) the Architecture Analysis and Design Language (AADL). Given a specification and a property to be verified, it automatically removes those parts of the specification that are irrelevant for model checking the property, thus reducing the size of the corresponding transition system. The applicability and effectiveness of our approach is demonstrated by analyzing the state-space reduction for an example, employing a translator from AADL to Promela, the input language of the SPIN model checker.
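
    The core idea can be pictured as a backward-reachability pass over a dependency graph: starting from the components the property mentions, keep everything reachable backwards and drop the rest. The sketch below is a generic illustration with invented component names, not the paper's AADL algorithm.

    ```python
    # Generic backward slice: keep only components the property targets depend on.
    def backward_slice(deps, targets):
        """deps maps each component to the components it depends on."""
        keep, stack = set(), list(targets)
        while stack:
            node = stack.pop()
            if node not in keep:
                keep.add(node)
                stack.extend(deps.get(node, ()))
        return keep

    deps = {
        "alarm": ["sensor", "bus"],
        "logger": ["bus"],
        "bus": ["clock"],
        "display": ["logger"],
    }
    # A property mentioning only "alarm" lets the slicer drop "logger" and "display".
    print(sorted(backward_slice(deps, ["alarm"])))  # ['alarm', 'bus', 'clock', 'sensor']
    ```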

  12. 75 FR 42585 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Model ERJ 170 and ERJ...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance Review Board Report...-11-02-002 (Low Stage Bleed Check Valve), specified in Section 1 of the EMBRAER 170 Maintenance Review... Task 36-11-02-002 (Low Stage Bleed Check Valve) specified in Section 1 of the EMBRAER 170 Maintenance...

  13. 75 FR 9816 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Model ERJ 170 and ERJ...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... maintenance plan to include repetitive functional tests of the low-stage check valve. For certain other... program to include maintenance Task Number 36-11-02- 002 (Low Stage Bleed Check Valve), specified in... Check Valve) in Section 1 of the EMBRAER 170 Maintenance Review Board Report MRB-1621. Issued in Renton...

  14. 75 FR 39811 - Airworthiness Directives; The Boeing Company Model 777 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-13

    ... Service Bulletin 777-57A0064, dated March 26, 2009, it is not necessary to perform the torque check on the... instructions in Boeing Alert Service Bulletin 777-57A0064, dated March 26, 2009, a torque check is redundant... are less than those for the torque check. Boeing notes that it plans to issue a new revision to this...

  15. Model Checking A Self-Stabilizing Synchronization Protocol for Arbitrary Digraphs

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    2012-01-01

    This report presents the mechanical verification of a self-stabilizing distributed clock synchronization protocol for arbitrary digraphs in the absence of faults. This protocol does not rely on assumptions about the initial state of the system, other than the presence of at least one node, and no central clock or a centrally generated signal, pulse, or message is used. The system under study is an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. This protocol deterministically converges within a time bound that is a linear function of the self-stabilization period. A bounded model of the protocol is verified using the Symbolic Model Verifier (SMV) for a subset of digraphs. Modeling challenges of the protocol and the system are addressed. The model checking effort is focused on verifying correctness of the bounded model of the protocol as well as confirmation of claims of determinism and linear convergence with respect to the self-stabilization period.
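
    For intuition only, the toy below is a generic max-based synchronization scheme, not the report's protocol: on a strongly connected digraph, repeatedly adopting the largest incoming value equalizes all nodes within a number of steps bounded by the graph diameter, echoing the linear-convergence flavor of the claim above.

    ```python
    # Toy synchronization on an arbitrary strongly connected digraph (illustrative).
    import numpy as np

    rng = np.random.default_rng(0)
    adj = {0: [1], 1: [2], 2: [3, 0], 3: [4], 4: [5], 5: [0]}  # edges v -> u
    clocks = rng.integers(0, 100, size=6)

    steps = 0
    while len(set(clocks)) > 1:
        new = clocks.copy()
        for v, outs in adj.items():
            for u in outs:
                new[u] = max(new[u], clocks[v])  # adopt the largest incoming clock
        clocks = new + 1                         # every node also ticks
        steps += 1
    print("synchronized after", steps, "steps:", clocks)
    ```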

  16. A Practical Approach to Implementing Real-Time Semantics

    NASA Technical Reports Server (NTRS)

    Luettgen, Gerald; Bhat, Girish; Cleaveland, Rance

    1999-01-01

    This paper investigates implementations of process algebras which are suitable for modeling concurrent real-time systems. It suggests an approach for efficiently implementing real-time semantics using dynamic priorities. For this purpose a process algebra with dynamic priority is defined, whose semantics corresponds one-to-one to traditional real-time semantics. The advantage of the dynamic-priority approach is that it drastically reduces the state-space sizes of the systems in question while preserving all properties of their functional and real-time behavior. The utility of the technique is demonstrated by a case study which deals with the formal modeling and verification of the SCSI-2 bus protocol. The case study is carried out in the Concurrency Workbench of North Carolina, an automated verification tool in which the process algebra with dynamic priority is implemented. It turns out that the state space of the bus-protocol model is about an order of magnitude smaller than the one resulting from real-time semantics. The accuracy of the model is proved by applying model checking for verifying several mandatory properties of the bus protocol.

  17. Rain Check Application: Mobile tool to monitor rainfall in remote parts of Haiti

    NASA Astrophysics Data System (ADS)

    Huang, X.; Baird, J.; Chiu, M. T.; Morelli, R.; de Lanerolle, T. R.; Gourley, J. R.

    2011-12-01

    Rainfall observations performed uniformly and continuously over a period of time are valuable inputs for developing climate models and predicting events such as floods and droughts. Rain-Check is a mobile application developed in the Google App Inventor platform for Android-based smartphones, which allows field researchers to monitor the various rain gauges distributed throughout remote regions of Haiti and to send daily readings via SMS messages for further analysis and long-term trending. Rainfall rate and quantity interact with many other factors to influence erosion, vegetative cover, groundwater recharge, stream water chemistry, and runoff into streams, impacting agriculture and livestock. Rainfall observation from various sites is especially significant in Haiti, where over 80% of the country is mountainous terrain. Data sets from global models and a limited number of ground stations do not capture the fine-scale rainfall patterns necessary to describe local climate. Placement and reading of rain gauges are critical to accurate measurement of rainfall.

  18. Q-learning residual analysis: application to the effectiveness of sequences of antipsychotic medications for patients with schizophrenia.

    PubMed

    Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas

    2016-06-15

    Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
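
    A minimal sketch of the standard two-stage Q-learning fit the paper critiques follows, on simulated data, ending with the stage-1 residuals whose plots the paper argues can be misleading. The data-generating model and regression forms are assumptions made for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    x1 = rng.normal(size=n)                    # baseline covariate
    a1 = rng.choice([-1.0, 1.0], size=n)       # stage-1 treatment
    x2 = 0.5 * x1 + rng.normal(size=n)         # time-varying covariate
    a2 = rng.choice([-1.0, 1.0], size=n)       # stage-2 treatment
    y = x1 + x2 + 0.7 * a2 * x2 + rng.normal(size=n)  # final outcome

    def ols(X, t):
        beta, *_ = np.linalg.lstsq(X, t, rcond=None)
        return beta

    # Stage 2: Q2 = b0 + b1*x1 + b2*x2 + b3*a2 + b4*a2*x2
    X2 = np.column_stack([np.ones(n), x1, x2, a2, a2 * x2])
    b = ols(X2, y)

    # Pseudo-outcome: predicted outcome under the best stage-2 action.
    ytilde = np.column_stack([np.ones(n), x1, x2]) @ b[:3] + np.abs(b[3] + b[4] * x2)

    # Stage 1: regress the pseudo-outcome on stage-1 history and inspect residuals.
    X1 = np.column_stack([np.ones(n), x1, a1, a1 * x1])
    resid = ytilde - X1 @ ols(X1, ytilde)
    print("stage-1 residual sd:", round(resid.std(), 3))
    # A residual-vs-fitted plot here is the standard check the paper modifies.
    ```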

  19. Molecular dynamics of conformational substates for a simplified protein model

    NASA Astrophysics Data System (ADS)

    Grubmüller, Helmut; Tavan, Paul

    1994-09-01

    Extended molecular dynamics simulations covering a total of 0.232 μs have been carried out on a simplified protein model. Despite its simplified structure, the model exhibits properties similar to those of more realistic protein models. In particular, the model was found to undergo transitions between conformational substates at a time scale of several hundred picoseconds. The computed trajectories turned out to be sufficiently long as to permit a statistical analysis of that conformational dynamics. To check whether effective descriptions neglecting memory effects can reproduce the observed conformational dynamics, two stochastic models were studied. A one-dimensional Langevin effective potential model derived by elimination of subpicosecond dynamical processes could not describe the observed conformational transition rates. In contrast, a simple Markov model describing the transitions between, but neglecting dynamical processes within, conformational substates reproduced the observed distribution of first passage times. These findings suggest that protein dynamics generally does not exhibit memory effects at time scales above a few hundred picoseconds, but confirm the existence of memory effects at a picosecond time scale.
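
    The memorylessness at issue can be made concrete with a toy Markov model: for a memoryless hop out of a substate, the conditional survival P(T > s + t | T > s) equals P(T > t). The per-step hop probability below is an assumed value, not one from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    p = 1.0 / 300.0                              # assumed hop probability per 1-ps step
    fpt = rng.geometric(p, size=100000)          # first passage times (in steps)

    s, t = 200, 300
    cond = (fpt[fpt > s] > s + t).mean()         # P(T > s+t | T > s)
    print(round(cond, 3), round((fpt > t).mean(), 3))  # nearly equal: no memory
    ```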

  20. Stochastic modelling of the monthly average maximum and minimum temperature patterns in India 1981-2015

    NASA Astrophysics Data System (ADS)

    Narasimha Murthy, K. V.; Saravana, R.; Vijaya Kumar, K.

    2018-04-01

    The paper investigates the stochastic modelling and forecasting of monthly average maximum and minimum temperature patterns through a suitable seasonal autoregressive integrated moving average (SARIMA) model for the period 1981-2015 in India. The variations and distributions of monthly maximum and minimum temperatures are analyzed through Box plots and cumulative distribution functions. The time series plot indicates that the maximum temperature series contains sharp peaks in almost all years, while this is not true for the minimum temperature series, so the two series are modelled separately. The possible SARIMA model has been chosen by observing the autocorrelation function (ACF), partial autocorrelation function (PACF), and inverse autocorrelation function (IACF) of the logarithmically transformed temperature series. The SARIMA (1, 0, 0) × (0, 1, 1)_12 model is selected for the monthly average maximum and minimum temperature series based on the minimum Bayesian information criterion (BIC). The model parameters are obtained using the maximum-likelihood method with the help of the standard errors of the residuals. The adequacy of the selected model is determined using correlation diagnostic checking through the ACF, PACF, IACF, and p values of the Ljung-Box test statistic of the residuals, and using normality diagnostic checking through the kernel and normal density curves of the histogram and the Q-Q plot. Finally, forecasts of the monthly maximum and minimum temperature patterns of India for the next 3 years are obtained from the selected model.
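
    Assuming statsmodels is available, fitting the selected model takes a few lines with the SARIMAX interface; the synthetic series below stands in for the Indian temperature data, which is not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    idx = pd.date_range("1981-01", periods=420, freq="MS")  # 1981-2015, monthly
    temps = pd.Series(30 + 5 * np.sin(2 * np.pi * idx.month / 12)
                      + np.random.default_rng(0).normal(0, 0.5, 420), index=idx)

    # SARIMA(1,0,0)x(0,1,1)_12 on the log-transformed series, as selected above.
    fit = SARIMAX(np.log(temps), order=(1, 0, 0),
                  seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(fit.aic, fit.bic)
    print(fit.forecast(steps=36))   # 3-year forecast, as in the paper
    ```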

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaks, D; Fletcher, R; Salamon, S

    Purpose: To develop an online framework that tracks a patient’s plan from initial simulation to treatment and that helps automate elements of the physics plan checks usually performed in the record and verify (RV) system and treatment planning system. Methods: We have developed PlanTracker, an online plan tracking system that automatically imports new patient tasks and follows them through treatment planning, physics checks, therapy check, and chart rounds. A survey was designed to collect information about the amount of time spent by medical physicists on non-physics related tasks. We then assessed these non-physics tasks for automation. Using these surveys, we directed our PlanTracker software development towards the automation of intra-plan physics review. We then conducted a systematic evaluation of PlanTracker’s accuracy by generating test plans in the RV system software designed to mimic real plans, in order to test its efficacy in catching errors both real and theoretical. Results: PlanTracker has proven to be an effective improvement to the clinical workflow in a radiotherapy clinic. We present data indicating that roughly one third of the physics plan check can be automated and the workflow optimized, and we show the functionality of PlanTracker. When the full system is in clinical use we will present data on improvement of time use in comparison to survey data prior to PlanTracker implementation. Conclusion: We have developed a framework for plan tracking and automatic checks in radiation therapy. We anticipate using PlanTracker as a basis for further development in clinical/research software. We hope that by eliminating the simplest and most time-consuming checks, medical physicists may be able to spend their time on plan quality and other physics tasks rather than on arithmetic and logic checks. We see this development as part of a broader initiative to advance the clinical/research informatics infrastructure surrounding the radiotherapy clinic. This research project has been financially supported by Varian Medical Systems, Palo Alto, CA, through a Varian MRA.

  2. "The five-minute check-in" intervention to ease the transition into professional education: A descriptive analysis.

    PubMed

    Cox-Davenport, Rebecca A

    2017-03-01

    Students can have problems transitioning into nursing education, and nursing instructors can have an impact on this transition by taking an active coaching role. The objective of this study was to evaluate how an early academic coaching intervention helped students progress during the beginning of their first nursing semester. Student perceptions of the intervention were also explored. This study followed a descriptive non-experimental design. A nonprobability convenience sample was used. The setting was a four-year Bachelor's nursing program at a private college in central Pennsylvania. The sample included 22 first-semester students enrolled in their first nursing course. For the first five weeks of the semester, students were asked to meet with their nursing course instructors for "five minute check-ins". Students were coached on time management, study skills, access to resources, stress management, upcoming assignments, and grades. An online survey was also sent to students regarding their check-in experience. Student coaching needs changed throughout the five-week intervention. At first, students heavily needed time-management coaching. Study-skills coaching was a steady need from the second through the fifth week, and stress-management coaching increased during the last week of data collection, which coincided with their first exams. Students who attended four to five of the weekly visits had higher first test scores and higher overall course grades. The majority of students reported benefits from attending check-in visits, including organization, study skills, and feeling more connected to the instructor. Students reported an overall benefit to attending check-in visits. Course instructors were able to intervene early with students' academic problems and help students gain access to resources. Although the check-ins were intended to be brief visits, there was an impact on instructors' time during the check-in weeks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswaamy, G; Morrow, A; Kim, S

    Purpose: Automate brachytherapy treatment plan quality checks using the Eclipse v13.6 scripting API based on pre-configured rules to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using the Eclipse scripting API. This system checks for critical plan parameters such as channel length, first source position, source step size, and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experience in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated appropriately in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform a TG-43 based second check of the treatment planning system’s dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure. A knowledge database will also be developed for Brachyvision plans, which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
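
    The rules-driven idea can be pictured as a table of named tolerance checks that each return PASS or FAIL. The sketch below is plain Python with invented parameter names and tolerances; it is not the Eclipse scripting API, which the paper uses to pull these values from the planning system.

    ```python
    # Hypothetical rule table for an HDR plan precheck (illustrative values only).
    RULES = {
        "channel_length":   lambda v: abs(v - 1300.0) <= 1.0,   # mm
        "first_source_pos": lambda v: abs(v - 1300.0) <= 1.0,   # mm
        "step_size":        lambda v: v in (2.5, 5.0),          # mm
    }

    def precheck(plan):
        return {name: ("PASS" if ok(plan[name]) else "FAIL")
                for name, ok in RULES.items()}

    plan = {"channel_length": 1300.0, "first_source_pos": 1299.5, "step_size": 5.0}
    print(precheck(plan))
    ```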

  4. Real-time simulation model of the HL-20 lifting body

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Cruz, Christopher I.; Ragsdale, W. A.

    1992-01-01

    A proposed manned spacecraft design, designated the HL-20, has been under investigation at Langley Research Center. Included in that investigation are flight control design and flying qualities studies utilizing a man-in-the-loop real-time simulator. This report documents the current real-time simulation model of the HL-20 lifting body vehicle, known as version 2.0, presently in use at NASA Langley Research Center. Included are data on vehicle aerodynamics, inertias, geometries, guidance and control laws, and cockpit displays and controllers. In addition, trim case and dynamic check case data are provided. The intent of this document is to provide the reader with sufficient information to develop and validate an equivalent simulation of the HL-20 for use in real-time or analytical studies.

  5. Regimes of stability and scaling relations for the removal time in the asteroid belt: a simple kinetic model and numerical tests

    NASA Astrophysics Data System (ADS)

    Cubrovic, Mihailo

    2005-02-01

    We report on our theoretical and numerical results concerning the transport mechanisms in the asteroid belt. We first derive a simple kinetic model of chaotic diffusion and show how it gives rise to some simple correlations (but not laws) between the removal time (the time for an asteroid to experience a qualitative change of dynamical behavior and enter a wide chaotic zone) and the Lyapunov time. The correlations are shown to arise in two different regimes, characterized by exponential and power-law scalings. We also show how the so-called “stable chaos” (exponential regime) is related to anomalous diffusion. Finally, we check our results numerically and discuss their possible applications in analyzing the motion of particular asteroids.

  6. The Significance of Quality Assurance within Model Intercomparison Projects at the World Data Centre for Climate (WDCC)

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.

    2014-12-01

    The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for the data repositories, as they manage data through machine agents. Checks for homogeneity and consistency in early parts of the workflow therefore become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5 and the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, together with the associated quality checks, are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with a focus on temporal aspects: the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such data citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, which consists of five maturity quality levels.

  7. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), for program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), for software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
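
    For readers unfamiliar with SMT solvers, a tiny satisfiability query through Z3's Python bindings (the z3-solver package) looks like the following; the constraints are arbitrary, chosen only to exercise the integer-arithmetic and bit-vector theories mentioned above.

    ```python
    from z3 import Int, BitVec, Solver, sat

    x, y = Int("x"), Int("y")
    b = BitVec("b", 8)

    s = Solver()
    s.add(x > 0, y > x, x + y == 10)   # linear integer arithmetic
    s.add(b & 0x0F == 0x0A)            # bit-vector constraint

    if s.check() == sat:
        print(s.model())               # one satisfying assignment, e.g. x=1, y=9
    ```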

  8. Non-invasive quality evaluation of confluent cells by image-based orientation heterogeneity analysis.

    PubMed

    Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji

    2016-02-01

    In recent years, cell and tissue therapy in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
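
    A rough sketch of the orientation step follows, assuming a Hessian-based per-pixel angle and using circular variance as a stand-in heterogeneity statistic; the paper's actual profiling parameters are not reproduced here.

    ```python
    import numpy as np

    def orientation_heterogeneity(img):
        gy, gx = np.gradient(img.astype(float))
        hxx = np.gradient(gx, axis=1)
        hxy = np.gradient(gx, axis=0)
        hyy = np.gradient(gy, axis=0)
        # Principal-axis angle of the 2x2 Hessian at each pixel (mod pi).
        theta = 0.5 * np.arctan2(2 * hxy, hxx - hyy)
        # Circular variance on doubled angles: 0 = aligned, 1 = fully heterogeneous.
        c = np.mean(np.cos(2 * theta)), np.mean(np.sin(2 * theta))
        return 1.0 - np.hypot(*c)

    img = np.random.default_rng(0).random((64, 64))   # stand-in phase-contrast image
    print(orientation_heterogeneity(img))
    ```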

  9. Efficient Craig Interpolation for Linear Diophantine (Dis)Equations and Linear Modular Equations

    DTIC Science & Technology

    2008-02-01

    Craig interpolants has enabled the development of powerful hardware and software model checking techniques. Efficient algorithms are known for computing...interpolants in rational and real linear arithmetic. We focus on subsets of integer linear arithmetic. Our main results are polynomial time algorithms ...congruences), and linear diophantine disequations. We show the utility of the proposed interpolation algorithms for discovering modular/divisibility predicates

  10. International Workshop on Discrete Time Domain Modelling of Electromagnetic Fields and Networks (2nd) Held in Berlin, Germany on October 28-29, 1993

    DTIC Science & Technology

    1993-10-29

    natural logarithm of the ratio of two maxima a period apart. Both methods are based on the results from the numerical integration. The details of this...check and okay member functions are for software handshaking between the client and server process. Finally, the Forward function is used to initiate a

  11. Checking-up of optical graduated rules by laser interferometry

    NASA Astrophysics Data System (ADS)

    Miron, Nicolae P.; Sporea, Dan G.

    1996-05-01

    The main aspects related to the operating principle, design, and implementation of high-productivity equipment for checking the graduation accuracy of optical graduated rules, used as length references in optical measuring instruments for precision machine tools, are presented. Graduation error checking is done with a Michelson interferometer as the length transducer. The instrument operation is managed by a computer, which controls the equipment, data acquisition, and processing. The evaluation is performed for rule lengths from 100 to 3000 mm, with a checking error of less than 2 micrometers/m. The checking time is about 15 min for a 1000-mm rule, with averaging over four measurements.

  12. Towards Timed Automata and Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Hutzler, G.; Klaudel, H.; Wang, D. Y.

    2004-01-01

    The design of reactive systems must comply with logical correctness (the system does what it is supposed to do) and timeliness (the system has to satisfy a set of temporal constraints) criteria. In this paper, we propose a global approach for the design of adaptive reactive systems, i.e., systems that dynamically adapt their architecture depending on the context. We use the timed automata formalism for the design of the agents' behavior. This allows the properties of the system (regarding logical correctness and timeliness) to be evaluated beforehand, thanks to model-checking and simulation techniques. This model is enhanced with tools that we developed for the automatic generation of code, allowing a running multi-agent prototype satisfying the properties of the model to be produced very quickly.

  13. Checking Dimensionality in Item Response Models with Principal Component Analysis on Standardized Residuals

    ERIC Educational Resources Information Center

    Chou, Yeh-Tai; Wang, Wen-Chung

    2010-01-01

    Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that a first eigenvalue greater than 1.5 signifies a violation of unidimensionality when there…
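
    The check itself is mechanical: run a principal component analysis on the standardized residual matrix and inspect the first eigenvalue. A minimal sketch, with a random matrix standing in for real (observed - expected)/sd residuals:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    std_resid = rng.normal(size=(500, 20))        # persons x items

    corr = np.corrcoef(std_resid, rowvar=False)   # inter-item residual correlations
    first = np.linalg.eigvalsh(corr)[-1]          # largest eigenvalue
    print("first eigenvalue:", round(first, 2))
    if first > 1.5:
        print("flag: possible violation of unidimensionality")
    ```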

  14. Stress analysis of 27% scale model of AH-64 main rotor hub

    NASA Technical Reports Server (NTRS)

    Hodges, R. V.

    1985-01-01

    Stress analysis of an AH-64 27% scale model rotor hub was performed. Component loads and stresses were calculated based upon blade root loads and motions. The static and fatigue analysis indicates positive margins of safety in all components checked. Using the format developed here, the hub can be stress checked for future application.

  15. Involvement of the anterior cingulate cortex in time-based prospective memory task monitoring: An EEG analysis of brain sources using Independent Component and Measure Projection Analysis

    PubMed Central

    Burgos, Pablo; Kilborn, Kerry; Evans, Jonathan J.

    2017-01-01

    Objective Time-based prospective memory (PM), remembering to do something at a particular moment in the future, is considered to depend upon self-initiated strategic monitoring, involving a retrieval mode (sustained maintenance of the intention) plus target checking (intermittent time checks). The present experiment was designed to explore what brain regions and brain activity are associated with these components of strategic monitoring in time-based PM tasks. Method 24 participants were asked to reset a clock every four minutes, while performing a foreground ongoing word categorisation task. EEG activity was recorded and data were decomposed into source-resolved activity using Independent Component Analysis. Common brain regions across participants, associated with retrieval mode and target checking, were found using Measure Projection Analysis. Results Participants decreased their performance on the ongoing task when concurrently performed with the time-based PM task, reflecting an active retrieval mode that relied on withdrawal of limited resources from the ongoing task. Brain activity, with its source in or near the anterior cingulate cortex (ACC), showed changes associated with an active retrieval mode including greater negative ERP deflections, decreased theta synchronization, and increased alpha suppression for events locked to the ongoing task while maintaining a time-based intention. Activity in the ACC was also associated with time-checks and found consistently across participants; however, we did not find an association with time perception processing per se. Conclusion The involvement of the ACC in both aspects of time-based PM monitoring may be related to different functions that have been attributed to it: strategic control of attention during the retrieval mode (distributing attentional resources between the ongoing task and the time-based task) and anticipatory/decision making processing associated with clock-checks. PMID:28863146

  16. Introduction of paramedic led Echo in Life Support into the pre-hospital environment: The PUCA study.

    PubMed

    Reed, Matthew J; Gibson, Louise; Dewar, Alistair; Short, Steven; Black, Polly; Clegg, Gareth R

    2017-03-01

    Can pre-hospital paramedic responders perform satisfactory pre-hospital Echo in Life Support (ELS) during the 10-s pulse check window, and does pre-hospital ELS adversely affect the delivery of cardiac arrest care? Prospective observational study of a cohort of ELS-trained paramedics using saved ultrasound clips and wearable camera videos. Between 23rd June 2014 and 31st January 2016, seven Resuscitation Rapid Response Unit (3RU) paramedics attended 45 patients in Lothian suffering out-of-hospital cardiac arrest (CA) where resuscitation was attempted and ELS was available and performed. 80% of first ELS attempts by paramedics produced an adequate view, which was excellent/good or satisfactory in 68%. 44% of views were obtained within the 10-s pulse check window, with a median time off the chest of 17 (IQR 13-20) seconds. A decision to perform ELS was communicated 67% of the time, and the 10-s pulse check was counted aloud in 60%. A manual pulse check was observed in around a quarter of patients, and the rhythm on the monitor was checked 38% of the time. All decision-changing scans involved a decision to stop resuscitation. Paramedics are able to obtain good ELS views in the pre-hospital environment, but this may lead to longer hands-off-the-chest time and possibly less pulse and monitor checking than is recommended. Future studies will need to demonstrate either improved outcomes or a benefit from identifying patients in whom further resuscitation and transportation is futile, before ELS is widely adopted in most pre-hospital systems. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. New Model Exhaust System Supports Testing in NASA Lewis' 10- by 10-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Roeder, James W., Jr.

    1998-01-01

    In early 1996, the ability to run NASA Lewis Research Center's Abe Silverstein 10- by 10- Foot Supersonic Wind Tunnel (10x10) at subsonic test section speeds was reestablished. Taking advantage of this new speed range, a subsonic research test program was scheduled for the 10x10 in the fall of 1996. However, many subsonic aircraft test models require an exhaust source to simulate main engine flow, engine bleed flows, and other phenomena. This was also true of the proposed test model, but at the time the 10x10 did not have a model exhaust capability. So, through an in-house effort over a period of only 5 months, a new model exhaust system was designed, installed, checked out, and made ready in time to support the scheduled test program.

  18. Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.

    PubMed

    Durdu, Omer Faruk

    2010-10-01

    In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify the trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of the 3-year (2002-2004) observed vs predicted data from the selected best models shows that the boron models from the ARIMA modeling approach can be used safely, since the predicted values preserve the basic statistics of the observed data in terms of the mean. The ARIMA modeling approach is recommended for predicting boron concentration series of a river.
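
    A compact illustration of the identification and diagnostic-checking loop with statsmodels, on a synthetic series standing in for the boron data; the candidate order is an assumption made for the example.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(0)
    y = pd.Series(np.cumsum(rng.normal(size=108)))  # monthly 1996-2004, synthetic

    fit = ARIMA(y, order=(1, 1, 1)).fit()           # a candidate; pick by minimum AIC
    print("AIC:", round(fit.aic, 1))
    print(acorr_ljungbox(fit.resid, lags=[12]))     # residual-independence check
    ```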

  19. Observational analysis on inflammatory reaction to talc pleurodesis: Small and large animal model series review

    PubMed Central

    Vannucci, Jacopo; Bellezza, Guido; Matricardi, Alberto; Moretti, Giulia; Bufalari, Antonello; Cagini, Lucio; Puma, Francesco; Daddi, Niccolò

    2018-01-01

    Talc pleurodesis has been associated with pleuropulmonary damage, particularly long-term damage due to its inert nature. The present model series review aimed to assess the safety of this procedure by examining inflammatory stimulus, biocompatibility and tissue reaction following talc pleurodesis. Talc slurry was performed in rabbits: 200 mg/kg checked at postoperative day 14 (five models), 200 mg/kg checked at postoperative day 28 (five models), 40 mg/kg checked at postoperative day 14 (five models), and 40 mg/kg checked at postoperative day 28 (five models). Talc poudrage was performed in pigs: 55 mg/kg checked at postoperative day 60 (18 models). Tissue inspection and data collection followed the surgical pathology approach currently used in clinical practice. As this was an observational study, no statistical analysis was performed. Regarding the rabbit model (Oryctolagus cuniculus), the extent of adhesions ranged between 0 and 30%, and between 0 and 10%, following 14 and 28 days, respectively. No intraparenchymal granuloma was observed, whereas pleural granulomas were extensively encountered following both talc dosages, with more evidence of visceral pleura granulomas following 200 mg/kg compared with 40 mg/kg. Severe florid inflammation was observed in 2/10 cases following 40 mg/kg. Parathymic and pericardium granulomas and mediastinal lymphadenopathy were evidenced at 28 days. At 60 days, findings ranging from rare adhesions to extended pleurodesis were observed in the pig model (Sus scrofa domesticus). Pleural granulomas were ubiquitous on the visceral and parietal pleurae. Severe spotted inflammation among the adhesions was recorded in 15/18 pigs. Intraparenchymal granulomas were observed in 9/18 lungs. Talc produced unpredictable pleurodesis in both animal models with enduring pleural inflammation, whether it was performed via slurry or poudrage. Furthermore, talc appeared to have triggered extended pleural damage, intraparenchymal nodules (porcine poudrage) and mediastinal migration (rabbit slurry). PMID:29403549

  20. Model Checking for Verification of Interactive Health IT Systems

    PubMed Central

    Butler, Keith A.; Mercer, Eric; Bahrami, Ali; Tao, Cui

    2015-01-01

    Rigorous methods for design and verification of health IT systems have lagged far behind their proliferation. The inherent technical complexity of healthcare, combined with the added complexity of health information technology, makes their resulting behavior unpredictable and introduces serious risk. We propose to mitigate this risk by formalizing the relationship between HIT and the conceptual work that increasingly typifies modern care. We introduce new techniques for modeling clinical workflows and the conceptual products within them that allow established, powerful model checking technology to be applied to interactive health IT systems. The new capability can evaluate the workflows of a new HIT system performed by clinicians and computers to improve safety and reliability. We apply the method to a patient contact system, demonstrating that model checking is effective for interactive systems and that much of it can be automated. PMID:26958166

  1. Novel color additive for chlorine disinfectants corrects deficiencies in spray surface coverage and wet-contact time and checks for correct chlorine concentration.

    PubMed

    Tyan, Kevin; Jin, Katherine; Kang, Jason; Kyle, Aaron M

    2018-04-18

    Bleach sprays suffer from poor surface coverage, dry out before reaching proper contact time, and can be inadvertently over-diluted to ineffective concentrations. Highlight®, a novel color additive for bleach that fades to indicate elapsed contact time, maintained >99.9% surface coverage over the full contact time and checked for correct chlorine concentration. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  2. Longitudinal Associations between Triglycerides and Metabolic Syndrome Components in a Beijing Adult Population, 2007-2012.

    PubMed

    Tao, Li-Xin; Yang, Kun; Liu, Xiang-Tong; Cao, Kai; Zhu, Hui-Ping; Luo, Yan-Xia; Guo, Jin; Wu, Li-Juan; Li, Xia; Guo, Xiu-Hua

    2016-01-01

    Longitudinal associations between triglycerides (TG) and other metabolic syndrome (MetS) components have rarely been reported. The purpose was to investigate the longitudinal associations between TG and other MetS components over time. The longitudinal study was established in 2007 on individuals who attended health check-ups at Beijing Tongren Hospital and Beijing Xiaotangshan Hospital. The data used in this study were based on 7489 participants who had at least three health check-ups over a 5-year follow-up period. A joint model was used to explore longitudinal associations between TG and other MetS components after adjusting for age. There were positive correlations between TG and the other MetS components except for high-density lipoprotein (HDL), and the correlations increased with time. A negative correlation was displayed between TG and HDL, and this correlation also increased with time. Among all five pairs of TG and other MetS components, the marginal correlation between TG and body mass index (BMI) was the largest for both men and women. The marginal correlation between TG and fasting plasma glucose was the smallest for men, while the marginal correlation between TG and diastolic blood pressure was the smallest for women. The longitudinal associations between TG and other MetS components increased with time. Among the five pairs, the longitudinal correlation between TG and BMI was the largest. It is important to closely monitor subjects with high levels of TG and BMI in the health check-up population, especially women, because these two components are closely associated with the development of hypertension, diabetes, cardiovascular disease and other metabolic diseases.

  3. [The effects of social networks on health check-up service use among pre-frail older adults (candidate so-called "specified elderly individuals") compared with older people in general].

    PubMed

    Sugisawa, Hidehiro; Sugihara, Yoko

    2011-09-01

    Nursing care prevention programs cannot accomplish their goals without effective screening of pre-frail older people, and health check-up services provide a valuable opportunity for this purpose. In the present study we examined not only the direct and indirect effects of social networks on check-up service use among candidate pre-frail older people, but also whether these effects differ from those among older people in general. Subjects were respondents to a survey of a probability sample of people aged 65 and over living in a city in Tokyo. Individuals who gave effective responses to the items used in our analysis made up 55.8 percent of the sample. 734 candidate pre-frail older people were selected using the screening criteria provided by the Ministry of Health, Labour and Welfare. The general category of older people numbered 2,057, excluding the candidates and elderly certified for long-term care. Social networks were measured from five aspects: family size; contact with children or relatives living separately; contact with neighbors or friends; involvement in community activities; and seeing a doctor. Our model of the indirect effects of social networks on check-up use included awareness of nursing care prevention programs as a mediating factor. Information about whether the subjects used the health check-up service was provided by the regional government. The magnitude of the effects was evaluated in two ways: using statistical tests and focusing on marginal effects. Although none of the social network indicators had direct significant impacts on check-up use, contact with children or relatives living separately, contact with neighbors or friends, and involvement in community activities demonstrated significant indirect influence. Contact with neighbors or friends, involvement in community activities, and seeing a doctor had direct significant effects on use among the general category of older people, but none of the social network indicators demonstrated significant indirect effects. Involvement in community activities had the strongest total (direct plus indirect) effect on use among the social network indicators for the candidates when viewed in terms of marginal effects. However, it was estimated that the rate of use would rise by only about 5 percent even if the average frequency of involvement in community activities were to increase from less than once to once a month among the candidates. It is suggested that the effects of social networks on health check-up service use among candidate pre-frail older people could be produced by improving awareness of nursing care prevention programs.

  4. Response surface modeling for hot, humid air decontamination of materials contaminated with Bacillus anthracis ∆Sterne and Bacillus thuringiensis Al Hakam spores

    PubMed Central

    2014-01-01

    Response surface methodology using a face-centered cube design was used to describe and predict spore inactivation of Bacillus anthracis ∆Sterne and Bacillus thuringiensis Al Hakam spores after exposure of six spore-contaminated materials to hot, humid air. For each strain/material pair, an attempt was made to fit a first or second order model. All three independent predictor variables (temperature, relative humidity, and time) were significant in the models except that time was not significant for B. thuringiensis Al Hakam on nylon. Modeling was unsuccessful for wiring insulation and wet spores because there was complete spore inactivation in the majority of the experimental space. In cases where a predictive equation could be fit, response surface plots with time set to four days were generated. The survival of highly purified Bacillus spores can be predicted for most materials tested when given the settings for temperature, relative humidity, and time. These predictions were cross-checked with spore inactivation measurements. PMID:24949256
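
    The modeling step amounts to ordinary least squares on a full second-order polynomial in the three predictors. A sketch on synthetic coded data (random settings standing in for the face-centered cube design points, with made-up coefficients):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, RH, t = (rng.uniform(-1, 1, 200) for _ in range(3))  # coded units
    y = 1 + 0.5*T + 0.8*RH + 0.3*t + 0.4*T*RH - 0.6*RH**2 + rng.normal(0, 0.1, 200)

    # Full second-order model: intercept, main effects, interactions, squares.
    X = np.column_stack([np.ones_like(T), T, RH, t,
                         T*RH, T*t, RH*t, T**2, RH**2, t**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(np.round(beta, 2))   # recovers the assumed surface coefficients
    ```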

  5. 77 FR 12450 - Airworthiness Directives; BRP-Powertrain GmbH & Co KG Rotax Reciprocating Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-01

    ... have been tightened to the correct torque value, i.e. not in accordance with the specification. This... AD requires performing a one-time inspection of the oil system for leaks and a torque check of the... performing a one-time inspection of the oil system for leaks and a torque check of the oil pump attachment...

  6. Intelligent Data Visualization for Cross-Checking Spacecraft System Diagnosis

    NASA Technical Reports Server (NTRS)

    Ong, James C.; Remolina, Emilio; Breeden, David; Stroozas, Brett A.; Mohammed, John L.

    2012-01-01

    Any reasoning system is fallible, so crew members and flight controllers must be able to cross-check automated diagnoses of spacecraft or habitat problems by considering alternate diagnoses and analyzing related evidence. Cross-checking improves diagnostic accuracy because people can apply information processing heuristics, pattern recognition techniques, and reasoning methods that the automated diagnostic system may not possess. Over time, cross-checking also enables crew members to become comfortable with how the diagnostic reasoning system performs, so the system can earn the crew's trust. We developed intelligent data visualization software that helps users cross-check automated diagnoses of system faults more effectively. The user interface displays scrollable arrays of timelines and time-series graphs, which are tightly integrated with an interactive, color-coded system schematic to show important spatial-temporal data patterns. Signal processing and rule-based diagnostic reasoning automatically identify alternate hypotheses and data patterns that support or rebut the original and alternate diagnoses. A color-coded matrix display summarizes the supporting or rebutting evidence for each diagnosis, and a drill-down capability enables crew members to quickly view graphs and timelines of the underlying data. This system demonstrates that modest amounts of diagnostic reasoning, combined with interactive, information-dense data visualizations, can accelerate system diagnosis and cross-checking.

  7. The independent relationship between trouble controlling Facebook use, time spent on the site and distress.

    PubMed

    Muench, Fredrick; Hayes, Marie; Kuerbis, Alexis; Shao, Sijing

    2015-09-01

    There is an emerging literature base on the relationship between maladaptive traits and "addiction" to social networking sites. These studies have operationalized addiction as either spending excessive amounts of time on social networking sites (SNS) or trouble controlling SNS use, but have not assessed the unique contribution of each of these constructs to outcomes in the same models. Moreover, these studies have exclusively been conducted with younger people rather than a heterogeneous sample. This study examined the independent relationships of a brief Facebook addiction scale, time spent on Facebook, and Facebook checking with positive and negative social domains, while controlling for self-esteem and social desirability. Participants were recruited using e-mail, SNS posts, and Amazon's MTurk system. The sample included 489 respondents aged 18 to approximately 70, who completed a 10-15 minute survey. Results indicate that neither time spent on Facebook nor Facebook checking was significantly associated with self-esteem, fear of negative social evaluation, or social comparison, while SNS addiction symptoms were each independently associated with these outcomes. Neither time spent on Facebook nor SNS addiction symptoms were associated with positive social relationships. Overall, the results suggest that time on SNS and trouble controlling use should be considered independent constructs and that interventions should target the underlying loss of control as the primary intervention target, above ego-syntonic time spent on the site.

  8. Bayesian model checking: A comparison of tests

    NASA Astrophysics Data System (ADS)

    Lucy, L. B.

    2018-06-01

    Two procedures for checking Bayesian models are compared using a simple test problem based on the local Hubble expansion. Over four orders of magnitude, p-values derived from a global goodness-of-fit criterion for posterior probability density functions agree closely with posterior predictive p-values. The former can therefore serve as an effective proxy for the difficult-to-calculate posterior predictive p-values.
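
    A toy posterior predictive p-value in the sense compared above, for a Gaussian model with known variance; the data, prior, and discrepancy measure are all illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.normal(0.3, 1.0, size=50)                  # "observed" data

    # Posterior for the mean under a flat prior: N(ybar, 1/n).
    draws = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=4000)

    # Discrepancy: the sample variance of replicated vs observed data.
    t_obs = y.var()
    t_rep = np.array([rng.normal(m, 1.0, size=len(y)).var() for m in draws])
    print("posterior predictive p-value:", (t_rep >= t_obs).mean())
    ```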

  9. Secure open cloud in data transmission using reference pattern and identity with enhanced remote privacy checking

    NASA Astrophysics Data System (ADS)

    Vijay Singh, Ran; Agilandeeswari, L.

    2017-11-01

    To handle the large amount of client data in the open cloud, many security issues need to be addressed. A client's private data should not be known to other group members without the data owner's valid permission. Sometimes clients are also prevented from accessing open cloud servers due to restrictions. To overcome these security issues and restrictions related to storage, data sharing in an inter-domain network, and privacy checking, we propose a model in this paper based on identity-based cryptography for data transmission, an intermediate entity that holds the client's reference and identity and controls the handling of data transmission in an open cloud environment, and an extended remote privacy checking technique that operates on the admin side. On behalf of the data owner's authority, the proposed model offers secure cryptography in data transmission and remote privacy checking, configurable as private, public, or instructed. The hardness of the Computational Diffie-Hellman assumption underlying the key exchange makes the proposed model more secure than existing models used for public cloud environments.
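
    The Computational Diffie-Hellman assumption invoked above is the hardness of recovering the shared secret from the public exchange values. A toy Diffie-Hellman exchange with deliberately tiny parameters illustrates the setting; it is insecure as written and is not the paper's identity-based scheme.

    ```python
    import secrets

    p = 2**127 - 1        # a Mersenne prime; far too small for real use
    g = 3

    a = secrets.randbelow(p - 2) + 2      # client secret
    b = secrets.randbelow(p - 2) + 2      # server secret
    A, B = pow(g, a, p), pow(g, b, p)     # values exchanged in the clear

    # Both sides derive the same key; computing it from (g, A, B) alone is CDH.
    assert pow(B, a, p) == pow(A, b, p)
    print("shared key established")
    ```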

  10. Reduced circuit implementation of encoder and syndrome generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trager, Barry M; Winograd, Shmuel

    An error correction method and system includes an encoder and syndrome generator that operate in parallel to reduce the amount of circuitry used to compute check symbols and syndromes for error-correcting codes. The system and method compute the contributions to the syndromes and check symbols 1 bit at a time instead of 1 symbol at a time. As a result, the even syndromes can be computed as powers of the odd syndromes. Further, the system assigns symbol addresses so that there are, for an example code over GF(2^8) with 72 symbols, three (3) blocks of addresses which differ by a cube root of unity, allowing the data symbols to be combined to reduce the size and complexity of the odd-syndrome circuits. Further, the implementation circuit for generating check symbols is derived from the syndrome circuit using the inverse of the part of the syndrome matrix for the check locations.

  11. SU-F-T-558: ArcCheck for Patient Specific QA in Stereotactic Ablative Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramachandran, P; RMIT University, Bundoora; Tajaldeen, A

    2016-06-15

    Purpose: Stereotactic Ablative Radiotherapy (SABR) is one of the most preferred treatment techniques for early stage lung cancer. This technique has been extended to other treatment sites such as spine, liver, scapula, and sternum, which has resulted in increased physics QA time on the machine. In this study, we tested the feasibility of using ArcCheck as an alternative method to replace film dosimetry. Methods: Twelve patients with varied diagnoses of lung, liver, scapula, sternum, and spine undergoing SABR were selected for this study. Pre-treatment QA was performed for all patients, including ionization chamber and film dosimetry. The required gamma criterion for each SABR plan to pass QA and proceed to treatment is 95% at (3%, 1 mm). In addition to this routine process, the treatment plans were exported onto an ArcCheck phantom. The planned and measured doses from the ArcCheck device were compared using four different gamma criteria: 2%/2 mm, 3%/2 mm, 3%/1 mm, and 3%/3 mm. In addition, we introduced errors to the gantry, collimator, and couch angles to assess the sensitivity of the ArcCheck to potential delivery errors. Results: The ArcCheck mean passing rates for all twelve cases were 76.1% ± 9.7% for the 3%/1 mm criterion, 89.5% ± 5.3% for 2%/2 mm, 92.6% ± 4.2% for 3%/2 mm, and 97.6% ± 2.4% for 3%/3 mm. When SABR spine cases are excluded, we observe ArcCheck passing rates higher than 95% for all studied cases at 3%/3 mm, with ArcCheck results in acceptable agreement with the film gamma results. Conclusion: Our ArcCheck results at 3%/3 mm were found to correlate well with our routine non-spine SABR patient-specific QA results (3%/1 mm). We observed a significant reduction in QA time when using ArcCheck for SABR QA. This study shows that ArcCheck could replace film dosimetry for all sites except SABR spine.
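
    To make the percent/millimeter criteria concrete, a simplified one-dimensional gamma index (after Low et al.) is sketched below; clinical tools evaluate this in 3-D with finer search and interpolation, so this is illustration only.

    ```python
    import numpy as np

    def gamma_1d(x, ref, meas, dose_tol=0.03, dist_tol=3.0):
        """Per-point gamma of `meas` against `ref` on positions x (mm), global dose norm."""
        g = np.empty_like(meas)
        for i, (xi, mi) in enumerate(zip(x, meas)):
            dd = (mi - ref) / ref.max() / dose_tol   # dose-difference term
            dx = (xi - x) / dist_tol                 # distance-to-agreement term
            g[i] = np.sqrt(dd**2 + dx**2).min()
        return g

    x = np.linspace(0, 100, 201)
    ref = np.exp(-((x - 50) / 15) ** 2)
    meas = 1.02 * np.exp(-((x - 50.5) / 15) ** 2)    # 2% hotter, shifted 0.5 mm
    g = gamma_1d(x, ref, meas)
    print("3%/3 mm passing rate:", (g <= 1).mean())
    ```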

  12. Spot-checks to measure general hygiene practice.

    PubMed

    Sonego, Ina L; Mosler, Hans-Joachim

    2016-01-01

    A variety of hygiene behaviors are fundamental to the prevention of diarrhea. We used spot-checks in a survey of 761 households in Burundi to examine whether something we could call general hygiene practice is responsible for more specific hygiene behaviors, ranging from handwashing to sweeping the floor. Using structural equation modeling, we showed that clusters of hygiene behavior, such as primary caregivers' cleanliness and household cleanliness, explained the spot-check findings well. Within our model, general hygiene practice as an overall concept explained the more specific clusters of hygiene behavior well. Furthermore, the higher the general hygiene practice, the more likely children were to be categorized as healthy (r = 0.46). General hygiene practice was correlated with commitment to hygiene (r = 0.52), indicating a strong association with psychosocial determinants. The results show that different hygiene behaviors co-occur regularly. Using spot-checks, the general hygiene practice of a household can be rated quickly and easily.

  13. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures from ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can easily be observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, making them candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are strong aspects of the process.

  14. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    This NASA Engineering and Safety Center (NESC) assessment was established to develop a set of time histories for the flight behavior of increasingly complex example aerospacecraft that could be used to partially validate various simulation frameworks. The assessment was conducted by representatives from several NASA Centers and an open-source simulation project. This document contains details on models, implementation, and results.

  15. MO-FG-CAMPUS-TeP1-03: Pre-Treatment Surface Imaging Based Collision Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiant, D; Maurer, J; Liu, H

    2016-06-15

    Purpose: Modern radiotherapy increasingly employs large immobilization devices, gantry attachments, and couch rotations, all of which raise the risk of collisions between the patient and the gantry or couch. Collision detection is often achieved by manually checking each couch position in the treatment room, and it sometimes results in extraneous imaging if collisions are detected after image-based setup has begun. To improve efficiency and avoid extra imaging, we explore the use of a surface-imaging-based collision detection model. Methods: Surfaces acquired from AlignRT (VisionRT, London, UK) were transferred in wavefront format to a custom Matlab (Mathworks, Natick, MA) software package (CCHECK). Computed tomography (CT) scans acquired at the same time were sent to CCHECK in DICOM format. In CCHECK, binary maps of the surfaces were created and overlaid on the CT images based on the fixed relationship between the AlignRT and CT coordinate systems. Isocenters were added through a graphical user interface (GUI). CCHECK then compares the input surfaces to a model of the linear accelerator (linac) to check for collisions at defined gantry and couch positions. Note that CCHECK may be used with or without a CT. Results: The nominal surface image field of view is 650 mm × 900 mm, varying with patient position and size. The accuracy of collision detection depends primarily on the linac model and the surface mapping process. The current linac model and mapping process yield detection accuracies on the order of 5 mm, assuming no change in patient posture between surface acquisition and treatment. Conclusions: CCHECK provides a non-ionizing method to check for collisions without the patient in the treatment room. Collision detection accuracy may be improved with more robust linac modeling. Additional gantry attachments (e.g., conical collimators) can easily be added to the model.
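
    A greatly simplified version of the geometric test such a tool performs: treat the gantry head as sweeping a cylinder of fixed clearance radius about the rotation axis, and flag surface points that fall inside the head's current angular sector. The clearance radius, sector width, and coordinate conventions below are assumptions for illustration, not CCHECK's actual linac model.

        import numpy as np

        def collides(surface_pts, gantry_deg, clearance_mm=380.0, sector_deg=15.0):
            """Toy collision test. surface_pts is an (N, 3) array in an IEC-like
            frame with the isocenter at the origin and the gantry rotating in
            the x-z plane. A point is flagged if it lies outside the clearance
            cylinder AND within the angular sector occupied by the gantry head."""
            x, y, z = surface_pts.T
            r = np.hypot(x, z)                       # radius in the rotation plane
            theta = np.degrees(np.arctan2(x, z))     # 0 deg = gantry at top
            dtheta = (theta - gantry_deg + 180.0) % 360.0 - 180.0
            hit = (r > clearance_mm) & (np.abs(dtheta) < sector_deg)
            return hit.any(), np.flatnonzero(hit)

        pts = np.array([[0.0, 100.0, 200.0],    # near isocenter: safe
                        [50.0, 0.0, 400.0]])    # ~403 mm out, near gantry path
        print(collides(pts, gantry_deg=7.0))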

  16. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limits of the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), various types of phase-shift masks, and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed: methods for generating contours and matching them to target structures, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors while excluding false ones is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also wafer process time. In the typical post-OPC verification of metal-contact/via coverage (CC) checks, the verification tool reports a huge number of errors due to borderless design, making it impractical to review and correct all of them. This can cause OPC engineers to miss real defects and, at the least, delay time to market. In this paper, we study methods for increasing the efficiency of post-OPC verification, especially for CC checks. For metal layers, the final CD after the etch process shows a varying CD bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. By optimizing the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and reduced the review time needed to find real errors. We present this approach, which increases the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of applying an etch model.

  17. Multi-state models for colon cancer recurrence and death with a cured fraction.

    PubMed

    Conlon, A S C; Taylor, J M G; Sargent, D J

    2014-05-10

    In cancer clinical trials, patients often experience a recurrence of disease prior to the outcome of interest, overall survival. Additionally, for many cancers, there is a cured fraction of the population who will never experience a recurrence. There is often interest in how different covariates affect the probability of being cured of disease and the time to recurrence, time to death, and time to death after recurrence. We propose a multi-state Markov model with an incorporated cured fraction to jointly model recurrence and death in colon cancer. A Bayesian estimation strategy is used to obtain parameter estimates. The model can be used to assess how individual covariates affect the probability of being cured and each of the transition rates. Checks for the adequacy of the model fit and for the functional forms of covariates are explored. The methods are applied to data from 12 randomized trials in colon cancer, where we show common effects of specific covariates across the trials. Copyright © 2013 John Wiley & Sons, Ltd.
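
    The model structure can be sketched as a logistic cure probability plus a three-state chain (alive to recurrence to death, with a direct alive-to-death path) simulated with exponential sojourn times. All rates and covariate effects below are invented for illustration; the paper's Bayesian estimation is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_patient(x, beta_cure=-0.5, rates=(0.10, 0.03, 0.25)):
            """One simulated path: x is a covariate acting on the cure logit;
            rates are (alive->recurrence, alive->death, recurrence->death)
            per year, all illustrative. Cured patients never recur but can
            still die of other causes."""
            lam_rec, lam_death, lam_rec_death = rates
            p_cure = 1.0 / (1.0 + np.exp(-(0.3 + beta_cure * x)))
            if rng.random() < p_cure:
                return None, rng.exponential(1.0 / lam_death)
            t_rec = rng.exponential(1.0 / lam_rec)
            t_death = rng.exponential(1.0 / lam_death)
            if t_death < t_rec:                     # death pre-empts recurrence
                return None, t_death
            return t_rec, t_rec + rng.exponential(1.0 / lam_rec_death)

        paths = [simulate_patient(x=0.0) for _ in range(10_000)]
        frac_rec = np.mean([t_rec is not None for t_rec, _ in paths])
        print(f"fraction ever experiencing recurrence: {frac_rec:.2f}")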

  18. XMI2USE: A Tool for Transforming XMI to USE Specifications

    NASA Astrophysics Data System (ADS)

    Sun, Wuliang; Song, Eunjee; Grabow, Paul C.; Simmonds, Devon M.

    The UML-based Specification Environment (USE) tool supports syntactic analysis, type checking, consistency checking, and dynamic validation of invariants and pre-/post conditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful when checking critical non-functional properties such as security policies. However, the USE tool requires one to specify (i.e., "write") a model using its own textual language and does not allow one to import any model specification files created by other UML modeling tools. Hence, to make the best use of existing UML tools, we often create a model with OCL constraints using a modeling tool such as the IBM Rational Software Architect (RSA) and then use the USE tool for model validation. This approach, however, requires a manual transformation between the specifications of two different tool formats, which is error-prone and diminishes the benefit of automated model-level validations. In this paper, we describe our own implementation of a specification transformation engine that is based on the Model Driven Architecture (MDA) framework and currently supports automatic tool-level transformations from RSA to USE.

  19. Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun; Niu, Hongli

    2015-10-01

    Based on an epidemic dynamical system, we construct a new agent-based financial time series model. To check its rationality, we compare the statistical properties of the time series model with two real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For the analysis, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist both in the real returns and in the proposed model. Therefore, the new agent-based financial model can reproduce important features of real stock markets.
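
    Of the statistics applied here, rescaled-range analysis is the simplest to sketch. A classical (unmodified) R/S estimate of the Hurst exponent on a toy return series follows; the paper uses the modified statistic, so treat this as schematic only.

        import numpy as np

        def hurst_rs(series, min_win=16):
            """Classical rescaled-range Hurst estimate: regress log(R/S) on
            log(window size) over dyadic window sizes."""
            n = len(series)
            sizes, rs_vals = [], []
            win = min_win
            while win <= n // 2:
                rs = []
                for start in range(0, n - win + 1, win):
                    w = series[start:start + win]
                    dev = np.cumsum(w - w.mean())
                    r = dev.max() - dev.min()       # range of cumulative deviations
                    s = w.std(ddof=1)
                    if s > 0:
                        rs.append(r / s)
                sizes.append(win)
                rs_vals.append(np.mean(rs))
                win *= 2
            slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
            return slope

        returns = np.random.default_rng(0).standard_normal(4096)
        print(f"H = {hurst_rs(returns):.2f}")   # about 0.5 for i.i.d. noise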

  20. Impact of geographic accessibility on utilization of the annual health check-ups by income level in Japan: A multilevel analysis.

    PubMed

    Fujita, Misuzu; Sato, Yasunori; Nagashima, Kengo; Takahashi, Sho; Hata, Akira

    2017-01-01

    Although both geographic accessibility and socioeconomic status have been identified as important factors in the utilization of health care services, their combined effect has not been evaluated. The aim of this study was to reveal whether the impact of geographic accessibility on the utilization of government-led annual health check-ups differs by income. Existing data collected and provided by Chiba City Hall were analyzed as a retrospective cohort study. The subjects were 166,966 beneficiaries of National Health Insurance in Chiba City, Japan, aged 40 to 74 years. Of all subjects, 54,748 (32.8%) had an annual health check-up in fiscal year 2012. As an optimal index of geographic accessibility has not been established, five measures were calculated: travel time to the nearest health care facility, density of health care facilities (number of facilities within a 30-min walking distance from the district of residence), and three indices based on the two-step floating catchment area method. Three-level logistic regression modeling with random intercepts for household and district of residence was performed. Of the five measures, density of health care facilities fit best according to Akaike's information criterion. Both low density and low income were associated with decreased utilization of the health check-ups. Furthermore, a linear relationship was observed between the density of facilities and utilization of the check-ups in all income groups, and its slope was significantly steeper among subjects with an equivalent income of 0.00 yen than among those with an equivalent income of 1.01-2.00 million yen (p = 0.028) or 2.01 million yen or more (p = 0.040). This indicates that subjects with lower incomes were more susceptible to the effects of geographic accessibility than those with higher incomes. Thus, better geographic accessibility could increase health check-up utilization and decrease the income-related disparity in utilization.
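
    The two-step floating catchment area (2SFCA) method mentioned above follows a simple recipe: a supply-to-demand ratio per facility over its catchment, then a sum of those ratios over the facilities reachable from each district. A minimal sketch with made-up distances and capacities:

        import numpy as np

        def two_step_fca(dist, supply, population, d0=30.0):
            """dist: (districts x facilities) travel times; d0: catchment bound.
            Step 1: R_j = S_j / (population within d0 of facility j).
            Step 2: A_i = sum of R_j over facilities within d0 of district i."""
            within = (dist <= d0).astype(float)
            demand = within.T @ population        # population each facility serves
            R = np.divide(supply, demand,
                          out=np.zeros_like(supply), where=demand > 0)
            return within @ R

        dist = np.array([[10.0, 40.0],            # minutes; 3 districts, 2 sites
                         [25.0, 20.0],
                         [50.0, 15.0]])
        supply = np.array([2.0, 1.0])             # e.g., check-up capacity
        population = np.array([1000.0, 2000.0, 1500.0])
        print(two_step_fca(dist, supply, population))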

  1. Full-Authority Fault-Tolerant Electronic Engine Control System for Variable Cycle Engines.

    DTIC Science & Technology

    1982-04-01

    single internally self-checked VLSI microprocessor. The selected configuration is an externally checked pair of commercially available... Electronic Engine Control; FPMH: Failures per Million Hours; FTMP: Fault Tolerant Multi-Processor; FTSC: Fault Tolerant Spaceborn Computer; GRAMP: Generalized... Removal; MTBR: Mean Time Between Repair; MTTF: Mean Time to Failure; NH: High Pressure Rotor Speed; O&S: Operating...

  2. Xcas as a Programming Environment for Stability Conditions for a Class of Differential Equation Models in Economics

    NASA Astrophysics Data System (ADS)

    Halkos, George E.; Tsilika, Kyriaki D.

    2011-09-01

    In this paper we examine the property of asymptotic stability in several dynamic economic systems, modeled as ordinary differential equations in the time parameter t. Asymptotic stability ensures intertemporal equilibrium for the economic quantity the solution stands for, regardless of the initial conditions. Existence of economic equilibrium in continuous-time models is checked symbolically in the Xcas program editor. Using stability theorems for differential equations as background, a brief overview of the symbolic capabilities of the free software Xcas is given. We present computational experience with a programming style for stability results for ordinary linear and nonlinear differential equations. Numerical experiments on traditional applications of economic dynamics exhibit the simplicity, clarity, and brevity of the input and output of our computer codes.
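
    The kind of check the paper scripts in Xcas can be sketched with SymPy instead: a linear system dx/dt = Ax is asymptotically stable iff every eigenvalue of A has a negative real part. The matrix below is an invented toy system, not one of the paper's models.

        # Symbolic asymptotic-stability check for a linear system dx/dt = A x,
        # in the spirit of the paper's Xcas scripts (SymPy used here instead).
        import sympy as sp

        a = sp.symbols('a', real=True)
        A = sp.Matrix([[-1, a],
                       [ 0, -2]])

        eigs = A.eigenvals()                      # {eigenvalue: multiplicity}
        stable = all(sp.re(lam) < 0 for lam in eigs)
        print(eigs, "asymptotically stable:", stable)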

  3. Model-Driven Safety Analysis of Closed-Loop Medical Systems

    PubMed Central

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2013-01-01

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure. PMID:24177176

  4. Model-Driven Safety Analysis of Closed-Loop Medical Systems.

    PubMed

    Pajic, Miroslav; Mangharam, Rahul; Sokolsky, Oleg; Arney, David; Goldman, Julian; Lee, Insup

    2012-10-26

    In modern hospitals, patients are treated using a wide array of medical devices that are increasingly interacting with each other over the network, thus offering a perfect example of a cyber-physical system. We study the safety of a medical device system for the physiologic closed-loop control of drug infusion. The main contribution of the paper is the verification approach for the safety properties of closed-loop medical device systems. We demonstrate, using a case study, that the approach can be applied to a system of clinical importance. Our method combines simulation-based analysis of a detailed model of the system that contains continuous patient dynamics with model checking of a more abstract timed automata model. We show that the relationship between the two models preserves the crucial aspect of the timing behavior that ensures the conservativeness of the safety analysis. We also describe system design that can provide open-loop safety under network failure.

  5. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.

  6. Analysis of an all-digital maximum likelihood carrier phase and clock timing synchronizer for eight phase-shift keying modulation

    NASA Astrophysics Data System (ADS)

    Degaudenzi, Riccardo; Vanghi, Vieri

    1994-02-01

    An all-digital Trellis-Coded 8PSK (TC-8PSK) demodulator well suited for VLSI implementation, including maximum-likelihood-estimation decision-directed (MLE-DD) carrier phase and clock timing recovery, is introduced and analyzed. By simply removing the trellis decoder, the demodulator can efficiently cope with uncoded 8PSK signals. The proposed MLE-DD synchronization algorithm requires one sample per symbol for the phase loop and two samples per symbol for the timing loop. The joint phase and timing discriminator characteristics are derived analytically, and the numerical results are checked by means of computer simulations. An approximate expression for the steady-state carrier phase and clock timing mean square error has been derived and successfully checked against simulation findings. The synchronizer's deviation from the Cramer-Rao bound is also discussed. The mean acquisition time of the digital synchronizer has been computed and checked using Monte Carlo simulation. Finally, TC-8PSK digital demodulator performance in terms of bit error rate and mean time to lose lock, including digital interpolators and synchronization loops, is presented.
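
    The decision-directed phase branch of such a synchronizer can be sketched in a few lines: slice each derotated sample to the nearest 8PSK point, take the residual angle as the error, and filter it with a first-order loop. The loop gain and noise level below are illustrative, and the timing branch is omitted entirely.

        import numpy as np

        rng = np.random.default_rng(0)
        M = 8
        constellation = np.exp(1j * (2 * np.pi * np.arange(M) / M))

        def dd_phase_recovery(rx, mu=0.05):
            """First-order decision-directed carrier phase loop for 8PSK.
            Error = angle of (derotated sample * conj(nearest symbol))."""
            phase, out = 0.0, np.empty_like(rx)
            for k, z in enumerate(rx):
                y = z * np.exp(-1j * phase)                  # derotate
                d = constellation[np.argmin(np.abs(constellation - y))]
                err = np.angle(y * np.conj(d))               # residual error
                phase += mu * err                            # loop update
                out[k] = y
            return out, phase

        syms = constellation[rng.integers(0, M, 2000)]
        rx = syms * np.exp(1j * 0.3) + 0.05 * (rng.standard_normal(2000)
                                               + 1j * rng.standard_normal(2000))
        _, est = dd_phase_recovery(rx)
        print(f"estimated offset = {est:.3f} rad (true 0.300)")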

  7. Philosophy and the practice of Bayesian statistics

    PubMed Central

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2015-01-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575

  8. Philosophy and the practice of Bayesian statistics.

    PubMed

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  9. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

    We propose and demonstrate a low complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with adaptive puncturing decoding algorithm for elastic optical transmission system. Partial received codes and the relevant column in parity-check matrix can be punctured to reduce the calculation complexity by adaptive parity-check matrix during decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five times iteration.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belley, M; Schmidt, M; Knutson, N

    Purpose: Physics second-checks for external beam radiation therapy are performed, in part, to verify that the machine parameters in the Record-and-Verify (R&V) system that will ultimately be sent to the LINAC exactly match the values initially calculated by the Treatment Planning System (TPS). While performing the second-check, a large portion of the physicist's time is spent navigating and arranging display windows to locate and compare the relevant numerical values (MLC position, collimator rotation, field size, MU, etc.). Here, we describe the development of a software tool that guides the physicist by aggregating and succinctly displaying the machine parameter data relevant to the physics second-check process. Methods: A data retrieval software tool was developed using Python to aggregate data and generate a list of machine parameters that are commonly verified during the physics second-check process. This software tool imported values from (i) the TPS RT Plan DICOM file and (ii) the MOSAIQ R&V Structured Query Language (SQL) database. The machine parameters aggregated for this study included MLC positions, X and Y jaw positions, collimator rotation, gantry rotation, MU, dose rate, wedges and accessories, cumulative dose, energy, machine name, couch angle, and more. Results: A GUI was developed to generate a side-by-side display of the aggregated machine parameter values for each field, presented to the physicist for direct visual comparison. The tool was tested for 3D conformal, static IMRT, sliding window IMRT, and VMAT treatment plans. Conclusion: This software tool streamlined the data collection needed for a second-check, yielding a second-check workflow that was both more user-friendly and more time-efficient. Using the tool, the physicist spent less time searching through the TPS PDF plan document and the R&V system, and could focus the second-check effort on assessing patient-specific plan quality.
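
    The DICOM side of such an aggregation tool is straightforward to sketch with pydicom (the MOSAIQ SQL side is omitted). Attribute paths follow the DICOM RT Plan module, but treat them as assumptions to verify against your own TPS exports.

        # Sketch: aggregate second-check parameters from an RT Plan DICOM file.
        import pydicom

        ds = pydicom.dcmread("rtplan.dcm")

        # MU lives under the fraction group's referenced-beam entries.
        mu_by_beam = {}
        for fg in ds.FractionGroupSequence:
            for rb in fg.ReferencedBeamSequence:
                mu_by_beam[rb.ReferencedBeamNumber] = float(rb.BeamMeterset)

        # Geometry and energy live on each beam's first control point.
        for beam in ds.BeamSequence:
            cp0 = beam.ControlPointSequence[0]
            print(beam.BeamName,
                  "MU:", mu_by_beam.get(beam.BeamNumber),
                  "energy:", cp0.NominalBeamEnergy,
                  "gantry:", cp0.GantryAngle,
                  "collimator:", cp0.BeamLimitingDeviceAngle,
                  "couch:", cp0.PatientSupportAngle)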

  11. Use of a remote computer terminal during field checking of Landsat digital maps

    USGS Publications Warehouse

    Robinove, Charles J.; Hutchinson, C.F.

    1978-01-01

    Field checking of small-scale land classification maps made digitally from Landsat data is facilitated by use of a remote portable teletypewriter terminal linked by telephone to the IDIMS (Interactive Digital Image Manipulation System) at the EDC (EROS Data Center), Sioux Falls, S. Dak. When field checking of maps 20 miles northeast of Baker, Calif., during the day showed that changes in classification were needed, the terminal was used at night to combine image statistical files, remap portions of images, and produce new alphanumeric maps for field checking the next day. The alphanumeric maps can be used in the field without serious location difficulty even though the scale is distorted, and statistical files created during the field check can be used for full image classification and map output at the EDC. This process makes field checking faster than normal, provides interaction with the statistical data while in the field, and reduces to a minimum the number of trips needed to work interactively with the IDIMS at the EDC, saving significant amounts of time and money. The only significant problem is the telephone lines, which at times introduce spurious characters into the printout or prevent the line-feed (paper advance) signal from reaching the terminal, causing overprinting of lines that should be sequential. We recommend that maps for field checking be made with more spectral classes than are expected, because in the field it is much easier to group classes than to reclassify or separate classes when only the remote terminal is available for display.

  12. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection and Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper addresses the consistency-checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
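
    The consistency check described here can be sketched as: synthesize terrain as GNSS altitude minus measured height above ground, difference it against the stored DEM along the track, and alarm on a test statistic. Profiles, noise levels, and the threshold below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)

        def integrity_test(gps_alt, agl_meas, dem_profile, threshold_m=15.0):
            """Compare synthesized terrain (GPS altitude minus measured AGL)
            with the stored DEM profile; alarm if the mean absolute disparity
            exceeds the threshold."""
            synthesized = gps_alt - agl_meas
            stat = np.mean(np.abs(synthesized - dem_profile))
            return stat, stat > threshold_m

        n = 500
        true_terrain = 200 + 30 * np.sin(np.linspace(0, 6, n))
        gps_alt = true_terrain + 500 + rng.normal(0, 2, n)    # ~500 m AGL flight
        agl = (gps_alt - true_terrain) + rng.normal(0, 3, n)  # ranging sensor
        faulty_dem = true_terrain.copy()
        faulty_dem[200:300] -= 100.0                          # injected DEM fault
        for dem in (true_terrain, faulty_dem):
            print(integrity_test(gps_alt, agl, dem))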

  13. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

    This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Use of DEMs, such as the digital terrain elevation data (DTED), for flight-critical terrain displays requires a DEM integrity check and timely integrity alerts to the pilots; otherwise, the DEM may provide hazardously misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.

  14. Prospective Memory in HIV-associated Neurocognitive Disorders (HAND): The Neuropsychological Dynamics of Time Monitoring

    PubMed Central

    Doyle, Katie L.; Loft, Shayne; Morgan, Erin E.; Weber, Erica; Cushman, Clint; Johnston, Elaine; Grant, Igor; Woods, Steven Paul

    2013-01-01

    Strategic monitoring during a delay interval is theorized to be an essential feature of time-based prospective memory (TB PM), the cognitive architecture of which is thought to rely heavily on frontostriatal systems and executive functions. This hypothesis was examined in 55 individuals with HIV-associated neurocognitive disorders (HAND) and 108 seronegative comparison participants who were administered the Memory for Intentions Screening Test (MIST), during which time monitoring (clock checking) behavior was measured. Results revealed a significant interaction between HAND group and the frequency of clock checking, in which individuals with HAND checked the clock significantly less often than the comparison group across the TB PM retention intervals of the MIST. Subsequent analyses in the HAND sample revealed that the frequency of clock checking was positively related to overall TB PM performance, as well as to standard clinical measures of retrospective memory and verbal fluency. These findings add support to a growing body of research elucidating TB PM's reliance on strategic monitoring processes dependent upon intact frontostriatal systems. HIV-associated deficits in strategic time monitoring may manifest in poorer functional outcomes, including medication non-adherence and dependence in activities of daily living. Future research is needed to further delineate the cognitive mechanisms underlying strategic time monitoring in order to inform rehabilitation strategies for reducing HAND-related TB PM deficits. PMID:23465043

  15. Unitals and ovals of symmetric block designs in LDPC and space-time coding

    NASA Astrophysics Data System (ADS)

    Andriamanalimanana, Bruno R.

    2004-08-01

    An approach to the design of LDPC (low density parity check) error-correction and space-time modulation codes involves starting with known mathematical and combinatorial structures, and deriving code properties from structure properties. This paper reports on an investigation of unital and oval configurations within generic symmetric combinatorial designs, not just classical projective planes, as the underlying structure for classes of space-time LDPC outer codes. Of particular interest are the encoding and iterative (sum-product) decoding gains that these codes may provide. Various small-length cases have been numerically implemented in Java and Matlab for a number of channel models.

  16. Time-space modal logic for verification of bit-slice circuits

    NASA Astrophysics Data System (ADS)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at formal verification of bit-slice circuits. The new logic is called time-space modal logic, and its major feature is that it can handle two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is presented. This is applicable to the verification of bit-slice microprocessors of unbounded bit width and 1D systolic arrays of unbounded length. A simple benchmark result shows the effectiveness of the proposed approach.
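
    The logic's two modalities can be illustrated with an explicit-state sketch: each modal operator is just a predecessor computation under its own transition relation. States, relations, and labeling below are toy values, not the paper's symbolic (BDD-based) algorithm.

        # Explicit-state sketch of a two-relation ("time" and "space") modal
        # check: EX_time(phi) and EX_space(phi) are predecessor sets under the
        # respective transition relation.
        def ex(relation, phi):
            """States with some successor (under `relation`) satisfying phi."""
            return {s for (s, t) in relation if t in phi}

        states = {0, 1, 2, 3}
        time_rel  = {(0, 1), (1, 2), (2, 3)}   # next clock cycle
        space_rel = {(0, 2), (1, 3)}           # next bit slice
        phi = {3}                              # states where the property holds

        print("EX_time  phi:", ex(time_rel, phi))    # {2}
        print("EX_space phi:", ex(space_rel, phi))   # {1}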

  17. Design and analysis of DNA strand displacement devices using probabilistic model checking

    PubMed Central

    Lakin, Matthew R.; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew

    2012-01-01

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker prism, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs. PMID:22219398

  18. Further Development of Verification Check-Cases for Six-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Madden, Michael M.; Shelton, Robert; Jackson, A. A.; Castro, Manuel P.; Noble, Deleena M.; Zimmerman, Curtis J.; Shidner, Jeremy D.; White, Joseph P.; Dutta, Doumyo

    2015-01-01

    This follow-on paper describes the principal methods of implementing, and documents the results of exercising, a set of six-degree-of-freedom rigid-body equations of motion and planetary geodetic, gravitation and atmospheric models for simple vehicles in a variety of endo- and exo-atmospheric conditions with various NASA, and one popular open-source, engineering simulation tools. This effort is intended to provide an additional means of verification of flight simulations. The models used in this comparison, as well as the resulting time-history trajectory data, are available electronically for persons and organizations wishing to compare their flight simulation implementations of the same models.

  19. Towards a model of pion generalized parton distributions from Dyson-Schwinger equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moutarde, H.

    2015-04-10

    We compute the pion quark Generalized Parton Distribution H^q and Double Distributions F^q and G^q in a coupled Bethe-Salpeter and Dyson-Schwinger approach. We use simple algebraic expressions inspired by the numerical resolution of the Dyson-Schwinger and Bethe-Salpeter equations. We explicitly check the support and polynomiality properties of our model, and its behavior under charge conjugation and time invariance. We derive analytic expressions for the pion Double Distributions and Generalized Parton Distribution at vanishing pion momentum transfer at a low scale. Our model compares very well to experimental pion form factor and parton distribution function data.
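
    For reference, the polynomiality property being checked is the standard GPD constraint, stated here in common conventions rather than quoted from this record (treat normalization details as assumptions):

        % Polynomiality of Mellin moments of the pion GPD (standard statement):
        \int_{-1}^{1} \mathrm{d}x \; x^{m} \, H^{q}(x,\xi,t)
            \;=\; \sum_{\substack{i=0 \\ i\ \mathrm{even}}}^{m+1} A^{q}_{m,i}(t)\,\xi^{i}

    so the m-th moment is a polynomial in the skewness ξ of degree at most m+1, with only even powers surviving by time-reversal invariance.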

  20. surrosurv: An R package for the evaluation of failure time surrogate endpoints in individual patient data meta-analyses of randomized clinical trials.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan

    2018-03-01

    Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²_indiv or Kendall's τ at the individual level, and the R²_trial at the trial level. We aimed to provide an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization, and the data-generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²_trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models, with individual random effects to measure Kendall's τ and treatment-by-trial interactions to measure the R²_trial. The most common data simulation models described in the literature are based on copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also optionally allows adjusting the second-step linear regression for measurement error. The mixed Poisson approach is implemented with several reduced models in addition to the full model. We present the package functions for estimating the surrogacy models, checking their convergence, performing leave-one-trial-out cross-validation, and plotting the results. We illustrate their use on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Parallel Software Model Checking

    DTIC Science & Technology

    2015-01-08

    checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its focus on formal verification. Generalized PDR: Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...

  2. Stochastic Game Analysis and Latency Awareness for Self-Adaptation

    DTIC Science & Technology

    2014-01-01

    In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the... Additional Key Words and Phrases: proactive adaptation, stochastic multiplayer games, latency... When planning how to adapt, self-adaptive... The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to...

  3. Probabilistic Priority Message Checking Modeling Based on Controller Area Networks

    NASA Astrophysics Data System (ADS)

    Lin, Cheng-Min

    Although the probabilistic model checking tool PRISM has been applied to many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique had not been used on a controller area network (CAN). In this paper, we use PRISM to model the priority-message mechanism of CAN, the mechanism that has allowed CAN to become the leader in serial communication for automobile and industrial control. Modeling CAN makes it easy to analyze its characteristics and thereby further improve the security and efficiency of automobiles. The Markov chain model helps us model the behaviour of priority messages.
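
    The mechanism being modeled, bitwise arbitration in which the lowest identifier always wins the bus, can be sketched as a small per-slot chain: whenever a high-priority frame is pending it transmits, so low-priority latency grows with load. The arrival probability and slot semantics below are illustrative, not the paper's PRISM model.

        import numpy as np

        rng = np.random.default_rng(7)

        def can_latency(p_high=0.3, n_slots=200_000):
            """Per-slot toy simulation of two CAN senders. A high-priority frame
            arrives with probability p_high each slot and always wins arbitration
            (its identifier is dominant); a low-priority frame is always ready,
            and we record how many slots it is blocked before each transmission."""
            waits, pending_high, low_wait = [], 0, 0
            for _ in range(n_slots):
                pending_high += rng.random() < p_high  # new high-priority frame?
                if pending_high:                       # lower ID wins the bus
                    pending_high -= 1
                    low_wait += 1
                else:                                  # low-priority frame goes out
                    waits.append(low_wait)
                    low_wait = 0
            return np.mean(waits), max(waits)

        print("mean / max slots a low-priority frame waits:", can_latency())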

  4. Aerospace Ground Equipment for model 4080 sequence programmer. A standard computer terminal is adapted to provide convenient operator to device interface

    NASA Technical Reports Server (NTRS)

    Nissley, L. E.

    1979-01-01

    The Aerospace Ground Equipment (AGE) provides an interface between a human operator and a complete spaceborne sequence timing device with a memory storage program. The AGE provides a means for composing, editing, syntax checking, and storing timing device programs. The AGE is implemented with a standard Hewlett-Packard 2649A terminal system and a minimum of special hardware. The terminal's dual tape interface is used to store timing device programs and to read in special AGE operating system software. To compose a new program for the timing device the keyboard is used to fill in a form displayed on the screen.

  5. An empirical comparison of statistical tests for assessing the proportional hazards assumption of Cox's model.

    PubMed

    Ng'andu, N H

    1997-03-30

    In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics to check the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and the following types of departures from the proportional hazard assumption: increasing relative hazards; decreasing relative hazards; crossing hazards; diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test and the linear correlation test have equally good power for detection of non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.
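
    Two of the better-performing checks in this comparison, the weighted (Schoenfeld) residuals score test and time-dependent covariate diagnostics, are available off the shelf. A sketch using the lifelines package; the proportional_hazard_test API is my assumption of the current signature, so verify it against your installed version.

        # Sketch: checking the PH assumption on simulated data. Assumes the
        # lifelines API (CoxPHFitter, proportional_hazard_test).
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.statistics import proportional_hazard_test

        rng = np.random.default_rng(42)
        n = 500
        x = rng.binomial(1, 0.5, n)
        t = rng.exponential(1.0 / np.exp(0.7 * x))   # PH holds by construction
        e = rng.random(n) < 0.8                      # ~20% randomly censored
        obs = np.where(e, t, t * rng.random(n))      # censoring precedes the event
        df = pd.DataFrame({"T": obs, "E": e, "x": x})

        cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        res = proportional_hazard_test(cph, df, time_transform="rank")
        res.print_summary()   # large p-value expected: no evidence against PH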

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbee, D; McCarthy, A; Galavis, P

    Purpose: Errors found during initial physics plan checks frequently require replanning and reprinting, resulting in decreased departmental efficiency. Additionally, errors may be missed during physics checks, resulting in potential treatment errors or interruptions. This work presents a process control created using the Eclipse Scripting API (ESAPI) that enables dosimetrists and physicists to detect potential errors in the Eclipse treatment planning system before any plan approvals or printing. Methods: Potential failure modes for five categories were generated based on available ESAPI (v11) patient object properties: Images, Contours, Plans, Beams, and Dose. An Eclipse script plugin (PlanCheck) was written in C# to check the errors most frequently observed clinically in each category. The PlanCheck algorithms were devised to check technical aspects of plans, such as deliverability (e.g., minimum EDW MUs), in addition to ensuring that policies and procedures relating to planning were being followed. The effect on clinical workflow efficiency was measured by tracking the plan document error rate and plan revision/retirement rates in the Aria database over monthly intervals. Results: The number of potential failure modes the PlanCheck script is currently capable of checking, by category: Images (6), Contours (7), Plans (8), Beams (17), and Dose (4). Prior to implementation of the PlanCheck plugin, the observed error rates in errored plan documents and revised/retired plans in the Aria database were 20% and 22%, respectively. Error rates decreased gradually over time as adoption of the script improved. Conclusion: A process control created using the Eclipse Scripting API enabled plan checks to occur within the planning system, reducing error rates and improving efficiency. Future work includes initiating a full FMEA for the planning workflow, extending the categories to include additional checks outside of ESAPI via Aria database queries, and eventually automating plan checks.

  7. Cycle time reduction by Html report in mask checking flow

    NASA Astrophysics Data System (ADS)

    Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon

    2017-07-01

    The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer, DRC-like check evolved from the mask rule check (MRC). The MDCC uses the extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an HTML report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file into a result database (RDB) file using standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screenshots of the check results. All these processes are triggered automatically as soon as the MDCC process finishes. Users just have to open the HTML report to get the information they need: for example, a check summary, captured images of results, and their coordinates.

  8. A novel cognitive intervention for compulsive checking: Targeting maladaptive beliefs about memory.

    PubMed

    Alcolado, Gillian M; Radomsky, Adam S

    2016-12-01

    Compulsive checking is one of the most common symptoms of obsessive-compulsive disorder (OCD). Recently it has been proposed that those who check compulsively may believe their memory is poor, rather than having an actual memory impairment. The current study sought to develop and assess a brief cognitive intervention focused on improving maladaptive beliefs about memory, as they pertain to both checking symptoms and memory performance. Participants (N = 24) with a diagnosis of OCD and clinical levels of checking symptomatology were randomly assigned either to receive two weekly 1-hour therapy sessions or to self-monitor during a similar waitlist period. Time spent checking, checking symptoms, maladaptive beliefs about memory, and visuospatial memory were assessed both pre- and post-treatment/waitlist. Results showed that compared to the waitlist condition, individuals in the treatment condition displayed significant decreases in their maladaptive beliefs about memory and checking symptoms from pre- to post-intervention. They also exhibited increased recall performance on a measure of visuospatial memory. Changes in beliefs about memory were predictors of reduced post-intervention checking, but were not predictive of increased post-intervention memory scores. The lack of long term follow-up data and use of a waitlist control leave questions about the stability and specificity of the intervention. Findings provide preliminary evidence that strategies targeting beliefs about memory may be worthy of inclusion in cognitive-behavioural approaches to treating compulsive checking. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Super Mario's prison break —A proposal of object-intelligent-feedback-based classical Zeno and anti-Zeno effects

    NASA Astrophysics Data System (ADS)

    Gu, Shi-Jian

    2009-10-01

    Super Mario is imprisoned by a demon in a finite potential well on his way to save Princess Peach. He can escape from the well with the help of a flight of magic stairs floating in space. However, the hateful demon occasionally checks his status. At each check, Mario must judge whether to jump to the ground inside the well immediately, to avoid revealing his escape intention, or to speed up his escape. Therefore, if the demon checks him so frequently that he has no chance of reaching the top of the barrier, he will remain inside the well forever, and a classical Zeno effect occurs. On the other hand, if the time interval between two consecutive checks is large enough that he has a high probability of already being beyond the demon's controllable range, the demon's checks actually speed up his escape, and a classical anti-Zeno effect takes place.

  10. Probabilistic choice between symmetric disparities in motion stereo matching for a lateral navigation system

    NASA Astrophysics Data System (ADS)

    Ershov, Egor; Karnaukhov, Victor; Mozerov, Mikhail

    2016-02-01

    Two consecutive frames of a lateral navigation camera video sequence can be considered a good approximation to epipolar stereo. To overcome edge-aware inaccuracy caused by occlusion, we propose a model that matches the current frame both to the next frame and to the previous one. The positive disparity of the match to the previous frame has a symmetric negative disparity to the next frame. The proposed algorithm makes a probabilistic choice, for each matched pixel, between the positive-disparity cost and its symmetric-disparity cost. A disparity map obtained by optimization over the cost volume composed of these probabilistic choices is more accurate than the traditional left-to-right and right-to-left disparity-map cross-check, and our algorithm needs half as many computational operations per pixel as the cross-check technique. The effectiveness of our approach is demonstrated on synthetic data and real video sequences with ground-truth values.
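
    The core idea, each pixel choosing between the cost of the forward match at disparity d and the backward match at the symmetric disparity, can be sketched as a per-pixel soft choice over two cost volumes. The softmax blend below is an illustrative stand-in for the paper's probabilistic choice rule.

        import numpy as np

        def fuse_symmetric_costs(cost_next, cost_prev, beta=5.0):
            """cost_next[d, y, x]: matching current->next frame at disparity d;
            cost_prev[d, y, x]: matching current->previous frame at the
            symmetric disparity -d. Blend the two hypotheses per pixel and
            disparity with softmax weights so the lower-cost one dominates."""
            w = np.exp(-beta * cost_next)
            w /= w + np.exp(-beta * cost_prev)
            return w * cost_next + (1.0 - w) * cost_prev

        rng = np.random.default_rng(0)
        cn = rng.random((32, 4, 4))     # toy volumes: 32 disparities, 4x4 pixels
        cp = rng.random((32, 4, 4))
        fused = fuse_symmetric_costs(cn, cp)
        disparity_map = fused.argmin(axis=0)   # winner-take-all over fused costs
        print(disparity_map)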

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berezhiani, Lasha; Khoury, Justin; Wang, Junpu, E-mail: lashaber@gmail.com, E-mail: jkhoury@sas.upenn.edu, E-mail: jwang217@jhu.edu

    Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of single-field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.
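
    The lowest-order (zeroth in the soft momentum) member of the hierarchy being checked is the familiar squeezed-limit relation, stated here in standard conventions as a reminder rather than quoted from the paper:

        % Leading consistency relation for the curvature perturbation zeta;
        % primes denote correlators stripped of the momentum-conserving delta.
        \lim_{q \to 0} \frac{\langle \zeta_{\mathbf{q}}\, \zeta_{\mathbf{k}}\, \zeta_{-\mathbf{k}} \rangle'}{P_{\zeta}(q)}
            \;=\; -\Big(3 + \mathbf{k} \cdot \frac{\partial}{\partial \mathbf{k}}\Big) P_{\zeta}(k)
            \;=\; (1 - n_{s})\, P_{\zeta}(k)

    The checks in this paper extend such identities to quadratic and cubic order in the soft momentum and to tensor modes.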

  12. Tank 30 and 37 Supernatant Sample Cross-Check and Evaporator Feed Qualification Analysis-2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oji, L. N.

    2013-03-07

    This report summarizes the analytical data reported by the F/H and Savannah River National Laboratories for the 2012 cross-check analysis of high-level waste supernatant liquid samples from SRS Tanks 30 and 37. The intent of the Tank 30 and 37 sample analysis was to perform cross-checks against routine F/H Laboratory analyses (corrosion and evaporator feed qualification programs) using samples collected at the same time from both tanks, as well as split samples from the tanks.

  13. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  14. What is a good health check? An interview study of health check providers' views and practices.

    PubMed

    Stol, Yrrah H; Asscher, Eva C A; Schermer, Maartje H N

    2017-10-02

    Health checks identify (risk factors for) disease in people without symptoms. They may be offered by the government through population screenings and by other providers to individual users as 'personal health checks'. Health check providers' perspectives on 'good' health checks may further the debate on the ethical evaluation and possible regulation of these personal health checks. In 2015, we interviewed twenty Dutch health check providers on criteria for 'good' health checks and the role these criteria play in their practices. Providers unanimously formulate a number of minimal criteria: checks must focus on (risk factors for) treatable or preventable disease; tests must be reliable and clinically valid; participation must be informed and voluntary; checks should provide more benefits than harms; and governmental screenings should be cost-effective. Aspirational criteria mentioned were: follow-up care should be provided; providers should be skilled and experienced professionals who put the benefit of (potential) users first; and providers should take time and give attention. Some criteria were contested: people should be free to test for any (risk factor for) disease; health checks should only be performed in people at high risk for disease who are likely to implement health advice; and follow-up care of privately funded tests should not drain collective resources. Providers do not always fulfil their own criteria. Their reasons reveal conflicts between criteria, conflicts between criteria and other ethical values, and point to components in the (Dutch) organisation of health care that hinder the ethical provision of health checks. Moreover, providers consider informed consent a criterion that is hard to establish in practice. According to providers, personal health checks should meet the same criteria as population screenings, with the exception of cost-effectiveness. Results indicate that in thinking about the ethics of health checks, potential conflicts between criteria and underlying values should be explicated, guidance in weighing the criteria should be provided, and the larger context should be taken into account: actors other than providers need to take up responsibility, and ideally the benefits and harms of health checks should be weighed against other measures targeting (risk factors for) disease.

  15. Two-dimensional solitons in conservative and parity-time-symmetric triple-core waveguides with cubic-quintic nonlinearity

    NASA Astrophysics Data System (ADS)

    Feijoo, David; Zezyulin, Dmitry A.; Konotop, Vladimir V.

    2015-12-01

    We analyze a system of three two-dimensional nonlinear Schrödinger equations coupled by linear terms and with the cubic-quintic (focusing-defocusing) nonlinearity. We consider two versions of the model: conservative and parity-time (PT) symmetric. These models describe triple-core nonlinear optical waveguides, with balanced gain and losses in the PT-symmetric case. We obtain families of soliton solutions and discuss their stability. The latter study is performed using a linear stability analysis and verified with direct numerical simulations of the full evolution equations. Stable solitons are found in both the conservative and PT-symmetric cases. Interactions and collisions between the conservative and PT-symmetric solitons are also briefly investigated.
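
    The abstract does not reproduce the governing equations; for orientation, a generic triple-core cubic-quintic NLS system of this kind (our reconstruction, with hypothetical coupling constant κ and gain/loss coefficient γ, whose normalization may differ from the authors') reads:

    ```latex
    % Propagation along z; \gamma = 0 recovers the conservative model.
    \begin{aligned}
    i\,\partial_z \psi_1 &= -\nabla_\perp^2 \psi_1 - \kappa\,\psi_2
      - |\psi_1|^2\psi_1 + |\psi_1|^4\psi_1 + i\gamma\,\psi_1,\\
    i\,\partial_z \psi_2 &= -\nabla_\perp^2 \psi_2 - \kappa\,(\psi_1+\psi_3)
      - |\psi_2|^2\psi_2 + |\psi_2|^4\psi_2,\\
    i\,\partial_z \psi_3 &= -\nabla_\perp^2 \psi_3 - \kappa\,\psi_2
      - |\psi_3|^2\psi_3 + |\psi_3|^4\psi_3 - i\gamma\,\psi_3.
    \end{aligned}
    ```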

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Denbleyker, Alan; Liu, Yuzhi; Meurice, Y.

    We consider the sign problem for classical spin models at complex β = 1/g₀² on L×L lattices. We show that the tensor renormalization group (TRG) method allows reliable calculations for larger Im β than the reweighting Monte Carlo method. For the Ising model with complex β we compare our results with the exact Onsager-Kaufman solution at finite volume. The Fisher zeros can be determined precisely with the TRG method. We check the convergence of the TRG method for the O(2) model on L×L lattices as the number of states D_s increases. We show that the finite-size scaling of the calculated Fisher zeros agrees very well with the Kosterlitz-Thouless transition assumption and predict their locations for larger volumes. The locations of these zeros agree with reweighting Monte Carlo calculations for small volume. The application of the method to the O(2) model with a chemical potential is briefly discussed.

  17. The independent relationship between trouble controlling Facebook use, time spent on the site and distress

    PubMed Central

    Muench, Fredrick; Hayes, Marie; Kuerbis, Alexis; Shao, Sijing

    2015-01-01

    Background and Aims There is an emerging literature base on the relationship between maladaptive traits and “addiction” to social networking sites. These studies have operationalized addiction as either spending excessive amounts of time on social networking sites (SNS) or trouble controlling SNS use, but have not assessed the unique contribution of each of these constructs on outcomes in the same models. Moreover, these studies have exclusively been conducted with younger people rather than a heterogeneous sample. This study examined the independent relationship of a brief Facebook addiction scale, time spent on Facebook, and Facebook checking on positive and negative social domains, while controlling for self-esteem and social desirability. Methods Participants were recruited using e-mail, SNS posts and through Amazon’s MTurk system. The sample included 489 respondents aged 18 to approximately 70, who completed a 10–15 minute survey. Results Results indicate that neither time spent on Facebook nor Facebook checking was significantly associated with self-esteem, fear of negative social evaluation, or social comparison, while SNS addiction symptoms were independently associated with each of these variables. Neither time spent on Facebook nor SNS addiction symptoms were associated with positive social relationships. Discussion Overall results suggest that time on SNS and trouble controlling use should be considered independent constructs and that interventions should target the underlying loss of control rather than ego-syntonic time spent on the site. PMID:26551906

  18. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
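
    As an illustration of the idea, the following numpy-only sketch (our own simplification of the multiplier-resampling scheme used in this literature) cumulates residuals over a covariate and compares the supremum of the observed process with that of simulated zero-mean realizations:

    ```python
    import numpy as np

    def cumres_check(x, resid, n_sim=1000, seed=0):
        """Cumulative-residual model check (illustrative sketch).

        Cumulate residuals over the sorted covariate x and compare the
        observed process with zero-mean 'multiplier' realizations in which
        each residual is perturbed by an independent N(0,1) variable.
        Returns the observed process and an approximate p-value based on
        the supremum statistic.
        """
        x = np.asarray(x)
        resid = np.asarray(resid)
        rng = np.random.default_rng(seed)
        r = resid[np.argsort(x)]
        n = len(r)
        observed = np.cumsum(r) / np.sqrt(n)
        sup_obs = np.max(np.abs(observed))
        sups = np.empty(n_sim)
        for b in range(n_sim):
            g = rng.standard_normal(n)    # multipliers
            sups[b] = np.max(np.abs(np.cumsum(r * g) / np.sqrt(n)))
        p_value = np.mean(sups >= sup_obs)
        return observed, p_value
    ```

    A small observed supremum relative to the simulated ones (large p-value) indicates that the trend in the residual plot is compatible with natural variation.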

  19. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    PubMed

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
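
    A minimal sketch of the search loop follows, assuming a user-supplied stochastic model whose single run reports whether the property held; the paper's sequential hypothesis testing and CUDA parallelism are replaced here by a fixed-size Monte Carlo estimate, and all names are illustrative.

    ```python
    import math
    import random

    def estimate_prob(model, theta, n_runs=200):
        """Monte Carlo stand-in for statistical model checking: fraction of
        stochastic simulations in which the property holds."""
        return sum(model(theta) for _ in range(n_runs)) / n_runs

    def anneal(model, theta0, target, steps=500, t0=1.0, seed=0):
        """Simulated annealing over parameters theta; the objective is the
        distance between the estimated satisfaction probability and the
        experimentally observed target probability."""
        rng = random.Random(seed)
        theta, best = theta0, abs(estimate_prob(model, theta0) - target)
        for k in range(steps):
            temp = t0 / (1 + k)
            cand = [t + rng.gauss(0, 0.1) for t in theta]
            err = abs(estimate_prob(model, cand) - target)
            if err < best or rng.random() < math.exp(-(err - best) / max(temp, 1e-9)):
                theta, best = cand, err
        return theta, best

    # Example: one stochastic run of a toy birth-death process; the
    # 'property' holds if the population stays below 30 for 50 steps.
    def toy_model(theta):
        n, r = 5, random.random
        for _ in range(50):
            n += (r() < theta[0]) - (r() < theta[1])
            if n >= 30:
                return False
        return True

    print(anneal(toy_model, [0.5, 0.5], target=0.9))
    ```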

  20. Performance optimization of internet firewalls

    NASA Astrophysics Data System (ADS)

    Chiueh, Tzi-cker; Ballman, Allen

    1997-01-01

    Internet firewalls control the data traffic into and out of an enterprise network by checking network packets against a set of rules that embodies an organization's security policy. Because rule checking is computationally more expensive than routing-table look-up, it can become a bottleneck when scaling up the performance of IP routers, which typically implement firewall functions in software. In this paper, we analyze the performance problems associated with firewalls, particularly packet filters, propose a connection cache that amortizes the costly security check over the packets of a connection, and report preliminary results of a trace-driven simulation showing that the average packet-check time can be reduced by a factor of at least 2.5.
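
    A minimal sketch of such a connection cache follows (illustrative field names, default policy, and TTL-based eviction; not the paper's implementation): the full linear scan of the rule set is paid once per flow, and subsequent packets of the same flow hit the cache.

    ```python
    import time

    class ConnectionCache:
        """Amortize firewall rule checking over the packets of a connection."""

        def __init__(self, rules, ttl=60.0):
            self.rules = rules    # list of callables: pkt -> "accept"/"deny"/None
            self.cache = {}       # flow 5-tuple -> (verdict, expiry time)
            self.ttl = ttl

        def check(self, pkt):
            flow = (pkt["src"], pkt["sport"], pkt["dst"], pkt["dport"], pkt["proto"])
            now = time.monotonic()
            hit = self.cache.get(flow)
            if hit is not None and hit[1] > now:
                return hit[0]                     # fast path: cache hit
            verdict = "deny"                      # default policy
            for rule in self.rules:               # slow path: linear rule scan
                v = rule(pkt)
                if v is not None:
                    verdict = v
                    break
            self.cache[flow] = (verdict, now + self.ttl)
            return verdict
    ```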

  1. Generalized Symbolic Execution for Model Checking and Testing

    NASA Technical Reports Server (NTRS)

    Khurshid, Sarfraz; Pasareanu, Corina; Visser, Willem; Kofmeyer, David (Technical Monitor)

    2003-01-01

    Modern software systems, which are often concurrent and manipulate complex data structures, must be extremely reliable. We present a novel framework, based on symbolic execution, for automated checking of such systems. We provide a two-fold generalization of traditional symbolic-execution-based approaches: first, we define a program instrumentation that enables standard model checkers to perform symbolic execution; second, we give a novel symbolic execution algorithm that handles dynamically allocated structures (e.g., lists and trees), method preconditions (e.g., acyclicity of lists), data (e.g., integers and strings), and concurrency. The program instrumentation enables a model checker to automatically explore program heap configurations (using a systematic treatment of aliasing) and to manipulate logical formulae on program data values (using a decision procedure). We illustrate two applications of our framework: checking correctness of multi-threaded programs that take inputs from unbounded domains with complex structure, and generation of non-isomorphic test inputs that satisfy a testing criterion. Our implementation for Java uses the Java PathFinder model checker.
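
    The following toy illustrates the constraint-solving half of the approach, assuming the z3-solver Python package; the path conditions of a three-path program are written out by hand here, whereas the paper's instrumentation derives them automatically inside a model checker.

    ```python
    from z3 import Int, Solver, And, Not, sat

    # Program under test (concrete version):
    #   def f(x):
    #       if x > 10:
    #           if x * 2 == 28:
    #               raise Bug
    # Symbolic execution enumerates path conditions and asks a decision
    # procedure for inputs that cover each path.
    x = Int('x')
    paths = [
        And(x > 10, x * 2 == 28),        # path that reaches the bug
        And(x > 10, Not(x * 2 == 28)),   # inner branch false
        Not(x > 10),                     # outer branch false
    ]

    for pc in paths:
        s = Solver()
        s.add(pc)
        if s.check() == sat:
            print(pc, '-> test input x =', s.model()[x])
        else:
            print(pc, '-> infeasible path')
    ```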

  2. Model Checking Degrees of Belief in a System of Agents

    NASA Technical Reports Server (NTRS)

    Raimondi, Franco; Primero, Giuseppe; Rungta, Neha

    2014-01-01

    Reasoning about degrees of belief has been investigated in the past by a number of authors and has many practical applications. In this paper we present a unified framework to model and verify degrees of belief in a system of agents. In particular, we describe an extension of the temporal-epistemic logic CTLK and we introduce a semantics based on interpreted systems for this extension. In this way, degrees of belief do not need to be provided externally but can be derived automatically from the possible executions of the system, thereby providing a computationally grounded formalism. We leverage the semantics to (a) construct a model checking algorithm, (b) investigate its complexity, (c) provide a Java implementation of the model checking algorithm, and (d) evaluate our approach using the standard benchmark of the dining cryptographers. Finally, we provide a detailed case study: using our framework and our implementation, we assess and verify the situational awareness of the pilot of Air France 447 flying in off-nominal conditions.

  3. 46 CFR 91.25-20 - Fire-extinguishing equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... each inspection for certification, periodic inspection and at other times necessary, the inspector will... certification and periodic inspection, the inspector will check fire-extinguishing equipment with the following... systems shall be checked as noted in Table 91.25-20(a)(1). In addition, the hand portable fire...

  4. 46 CFR 91.25-20 - Fire extinguishing equipment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... each inspection for certification, periodic inspection and at other times necessary, the inspector will... certification and periodic inspection, the inspector will check fire-extinguishing equipment with the following... systems shall be checked as noted in Table 91.25-20(a)(1). In addition, the hand portable fire...

  5. 46 CFR 91.25-20 - Fire extinguishing equipment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... each inspection for certification, periodic inspection and at other times necessary, the inspector will... certification and periodic inspection, the inspector will check fire-extinguishing equipment with the following... systems shall be checked as noted in Table 91.25-20(a)(1). In addition, the hand portable fire...

  6. 46 CFR 91.25-20 - Fire-extinguishing equipment.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... each inspection for certification, periodic inspection and at other times necessary, the inspector will... certification and periodic inspection, the inspector will check fire-extinguishing equipment with the following... systems shall be checked as noted in Table 91.25-20(a)(1). In addition, the hand portable fire...

  7. Evaluation of Check in/Check out for Students with Internalizing Behavior Problems

    ERIC Educational Resources Information Center

    Hunter, Katherine K.; Chenier, Jeffrey S.; Gresham, Frank M.

    2014-01-01

    Internalizing behaviors are directed inward toward the child and are frequently overlooked in classrooms compared with externalizing behaviors. When internalizing behaviors are identified, cognitive-behavioral interventions (CBIs) are typically the intervention of choice; however, CBIs are time-consuming and require considerable skill and…

  8. The NASA Lewis integrated propulsion and flight control simulator

    NASA Technical Reports Server (NTRS)

    Bright, Michelle M.; Simon, Donald L.

    1991-01-01

    A new flight simulation facility was developed at NASA-Lewis. The purpose of this flight simulator is to allow integrated propulsion-control and flight-control algorithm development and evaluation in real time. As a preliminary check of the simulator facility's capabilities and of the correct integration of its components, the control design and physics models for a short take-off and vertical landing fighter aircraft are described, with their associated system integration and architecture, pilot-vehicle interfaces, and display symbology. The initial testing and evaluation results show that this fixed-base flight simulator can provide real-time feedback and display of both airframe and propulsion variables for validation of integrated flight and propulsion control systems. Additionally, through the use of this flight simulator, various control design methodologies and cockpit mechanizations can be tested and evaluated in a real-time environment.

  9. Trust Based Evaluation of Wikipedia's Contributors

    NASA Astrophysics Data System (ADS)

    Krupa, Yann; Vercouter, Laurent; Hübner, Jomi Fred; Herzig, Andreas

    Wikipedia is an encyclopedia whose content anybody can change. Some users, self-proclaimed "patrollers", regularly check recent changes in order to delete or correct those that damage article integrity. The huge quantity of updates means that some articles remain polluted for a certain time before being corrected. In this work, we show how a multiagent trust model can help patrollers in their task of monitoring Wikipedia. To direct the patrollers' verification towards suspicious contributors, our work relies on a formalisation of Castelfranchi & Falcone's social trust theory, representing their trust model in a cognitive way.

  10. Effects of and preference for pay for performance: an analogue analysis.

    PubMed

    Long, Robert D; Wilder, David A; Betz, Alison; Dutta, Ami

    2012-01-01

    We examined the effects of 2 payment systems on the rate of check processing and time spent on task by participants in a simulated work setting. Three participants experienced individual pay-for-performance (PFP) without base pay and pay-for-time (PFT) conditions. In the last phase, we asked participants to choose which system they preferred. For all participants, the PFP condition produced higher rates of check processing and more time spent on task than did the PFT condition, but choice of payment system varied both within and across participants.

  11. Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering

    NASA Technical Reports Server (NTRS)

    Bolton, Matthew L.; Bass, Ellen J.

    2009-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems with safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to use model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient controlled analgesia pump, the initial model proved to be difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.

  12. Test and Evaluation Report of the IVAC (Trademark) Vital Check Monitor Model 4000AEE

    DTIC Science & Technology

    1992-02-01

    USAARL Report No. 92-14, Test and Evaluation Report of the IVAC® Vital Check Monitor Model 4000AEE. Citation of trade names in this report does not constitute an official Department of the Army endorsement or approval of the use of such commercial items. Reviewed: Dennis F. Shanahan, LTC, MC. The spectrum (to 12.4 GHz) was scanned for radiated emissions; the IVAC® Model 4000AEE was operated with both AC and battery power during radiated susceptibility testing.

  13. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  14. User's manual for computer program BASEPLOT

    USGS Publications Warehouse

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  15. Tempo: A Toolkit for the Timed Input/Output Automata Formalism

    DTIC Science & Technology

    2008-01-30

    Tempo is a toolkit for the Timed Input/Output Automata formalism (F.4.3 [Formal Languages]; D.3 [Programming Languages]), supporting the specification of distributed systems, which involve a combination of timing and distributed behavior, and the generation of distributed code from specifications. A check(i) transition is enabled when process i's program counter is set appropriately, and the simulator can be required to check assertions after every single step. The Tempo simulator addresses nondeterminism by putting the modeler in charge of resolving it.

  16. A Quantum Computing Approach to Model Checking for Advanced Manufacturing Problems

    DTIC Science & Technology

    2014-07-01

    …in a reasonable amount of time. In summary, the tool we developed succeeded in allowing us to produce good solutions for optimization problems that did not fit … We compared the value of the objective obtained in each run with the known optimal value, and used this information to compute the probability of success for each given instance. We then used this information to compute the expected number of repetitions (or runs) needed to obtain the optimal …

  17. [Health for refugees - the Bremen model].

    PubMed

    Mohammadzadeh, Zahra; Jung, Felicitas; Lelgemann, Monika

    2016-05-01

    The Bremen model recognizes that refugee health care has to go beyond merely checking for the prevalence of contagious diseases. Elementary health care offered in the reception centre and transitory facilities is based on voluntary acceptance by the refugees. At the same time, legal requirements for the medical reception of refugees are observed. In addition, doctors performing the initial medical examination are enabled to cover acute care on the spot. During the preliminary phase of immigration, refugees are allowed to see a doctor in their facility repeatedly. After a certain time, they are provided with a health card permitting limited access to regular care outside of their facility. The current rise in refugee numbers affects the situation of Bremen health care for adult as well as juvenile refugees. In spite of the increase, health care standards are maintained by means of the health card. From 2011 to 2014, "Factors influencing health status and contact with health services" averaged 29.6 % in the health check data. Diseases of the respiratory system (18.1 %) and "symptoms, signs and abnormal findings not elsewhere classified" (16.9 %) ranked second and third, respectively. Diseases of the digestive system (6.1 %), of the musculoskeletal system (6 %), and of the skin and subcutaneous tissue (3.6 %) followed. Infectious diseases such as HIV infection, hepatitis or tuberculosis were rare.

  18. Full-chip level MEEF analysis using model based lithography verification

    NASA Astrophysics Data System (ADS)

    Kim, Juhwan; Wang, Lantian; Zhang, Daniel; Tang, Zongwu

    2005-11-01

    MEEF (Mask Error Enhancement Factor) has become a critical factor in CD uniformity control since optical lithography moved into the sub-resolution era. Many studies have quantified the impact of mask CD (Critical Dimension) errors on wafer CD errors [1, 2]. However, the benefits of those studies were restricted to small pattern areas of the full-chip data because of long simulation times. Since fast turnaround can be achieved for complicated verifications on very large data sets with linearly scalable distributed processing, model-based lithography verification becomes feasible for various applications, such as post-mask-synthesis data sign-off for mask tape-out in production and lithography process development with full-chip data [3-5]. In this study, we introduce two useful methodologies for full-chip level verification of the mask error impact on the wafer lithography patterning process. One methodology checks the MEEF distribution, in addition to the CD distribution, through the process window; this can be used for RET/OPC optimization at the R&D stage. The other checks mask error sensitivity on potential pinch and bridge hotspots through lithography process variation; the outputs can be passed on to mask CD metrology to add CD measurements at those hotspot locations. Two different OPC data sets were compared using the two methodologies in this study.

  19. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
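
    A minimal sketch of the time-series side of this equivalence, using statsmodels on synthetic data: the seasonal sine/cosine pair stands in for the smooth functions of time used in air-pollution studies, and a time-stratified case-crossover fit with conditional logistic regression would target the same exposure coefficient. All data and names are illustrative.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    days = 730
    t = np.arange(days)
    exposure = 10 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, days)
    log_mu = 0.5 + 0.03 * exposure + 0.2 * np.sin(2 * np.pi * t / 365)
    y = rng.poisson(np.exp(log_mu))        # daily event counts

    # Log-linear Poisson regression of daily counts on exposure plus
    # a seasonal confounder term.
    X = sm.add_constant(np.column_stack([exposure,
                                         np.sin(2 * np.pi * t / 365),
                                         np.cos(2 * np.pi * t / 365)]))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print("exposure effect:", fit.params[1])   # true value is 0.03
    ```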

  20. Utilisation of preventative health check-ups in the UK: findings from individual-level repeated cross-sectional data from 1992 to 2008

    PubMed Central

    Labeit, Alexander; Peinemann, Frank; Baker, Richard

    2013-01-01

    Objectives To analyse and compare the determinants of screening uptake for different National Health Service (NHS) health check-ups in the UK. Design Individual-level analysis of repeated cross-sectional surveys with balanced panel data. Setting The UK. Participants Individuals taking part in the British Household Panel Survey (BHPS), 1992–2008. Outcome measure Uptake of NHS health check-ups for cervical cancer screening, breast cancer screening, blood pressure checks, cholesterol tests, dental screening and eyesight tests. Methods Dynamic panel data models (random effects panel probit with initial conditions). Results Having had a health check-up 1 year before, and previously in accordance with the recommended schedule, was associated with higher uptake of health check-ups. Individuals who visited a general practitioner (GP) had a significantly higher uptake in 5 of the 6 health check-ups. Uptake was highest in the recommended age group for breast and cervical cancer screening. For all health check-ups, age had a non-linear relationship. Lower self-rated health status was associated with increased uptake of blood pressure checks and cholesterol tests; smoking was associated with decreased uptake of 4 health check-ups. The effects of socioeconomic variables differed for the different health check-ups. Ethnicity did not have a significant influence on any health check-up. Permanent household income had an influence only on eyesight tests and dental screening. Conclusions Common determinants for having health check-ups are age, screening history and a GP visit. Policy interventions to increase uptake should consider the central role of the GP in promoting screening examinations and in preserving a high level of uptake. Possible economic barriers to access for prevention exist for dental screening and eyesight tests, and could be a target for policy intervention. Trial registration This observational study was not registered. PMID:24366576

  1. How can machine-learning methods assist in virtual screening for hyperuricemia? A healthcare machine-learning approach.

    PubMed

    Ichikawa, Daisuke; Saito, Toki; Ujita, Waka; Oyama, Hiroshi

    2016-12-01

    Our purpose was to develop a new machine-learning approach (a virtual health check-up) toward identification of those at high risk of hyperuricemia. Applying the system to general health check-ups is expected to reduce medical costs compared with administering an additional test. Data were collected during annual health check-ups performed in Japan between 2011 and 2013 (inclusive). We prepared training and test datasets from the health check-up data to build prediction models; these were composed of 43,524 and 17,789 persons, respectively. Gradient-boosting decision tree (GBDT), random forest (RF), and logistic regression (LR) approaches were trained using the training dataset and were then used to predict hyperuricemia in the test dataset. Undersampling was applied to build the prediction models to deal with the imbalanced class dataset. The results showed that the RF and GBDT approaches afforded the best performances in terms of sensitivity and specificity, respectively. The area under the curve (AUC) values of the models, which reflected the total discriminative ability of the classification, were 0.796 [95% confidence interval (CI): 0.766-0.825] for the GBDT, 0.784 [95% CI: 0.752-0.815] for the RF, and 0.785 [95% CI: 0.752-0.819] for the LR approaches. No significant differences were observed between pairs of each approach. Small changes occurred in the AUCs after applying undersampling to build the models. We developed a virtual health check-up that predicted the development of hyperuricemia using machine-learning methods. The GBDT, RF, and LR methods had similar predictive capability. Undersampling did not remarkably improve predictive power.
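
    A sketch of the modeling protocol on synthetic data (the study's health check-up records are not public), assuming scikit-learn; it reproduces the train/test split, majority-class undersampling, and AUC comparison of the three classifiers:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for check-up records: a few routine measurements
    # and a rare positive class.
    rng = np.random.default_rng(0)
    n = 20000
    X = rng.normal(size=(n, 6))
    p = 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 3)))   # imbalanced outcome
    y = rng.binomial(1, p)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    # Undersample the majority class in the training set, as in the paper.
    pos = np.where(y_tr == 1)[0]
    neg = rng.choice(np.where(y_tr == 0)[0], size=len(pos), replace=False)
    idx = np.concatenate([pos, neg])

    for name, clf in [("GBDT", GradientBoostingClassifier()),
                      ("RF", RandomForestClassifier()),
                      ("LR", LogisticRegression(max_iter=1000))]:
        clf.fit(X_tr[idx], y_tr[idx])
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")
    ```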

  2. Reopen parameter regions in two-Higgs doublet models

    NASA Astrophysics Data System (ADS)

    Staub, Florian

    2018-01-01

    The stability of the electroweak potential is a very important constraint for models of new physics. At the moment, it is standard for two-Higgs doublet models (THDM) and for singlet or triplet extensions of the standard model to perform these checks at tree level. However, these models are often studied in the presence of very large couplings, so radiative corrections to the potential can be expected to be important. We study these effects using the example of the type-II THDM and find that loop corrections can revive more than 50% of the phenomenologically viable points that are ruled out by tree-level vacuum stability checks. Similar effects are expected for other extensions of the standard model.

  3. Harvesting river water through small dams promote positive environmental impact.

    PubMed

    Agoramoorthy, Govindasamy; Chaudhary, Sunita; Chinnasamy, Pennan; Hsu, Minna J

    2016-11-01

    While deliberations on the negative consequences of large dams for the environment continue to dominate world attention, the positive benefits provided by small dams, also known as check dams, go largely unnoticed. Moreover, little is known about the potential of check dams to mitigate global warming impacts, because few data are available. Small dams are usually commissioned to private contractors who have no clear mandate from their employers to post their work online for public scrutiny. As a result, statistics on the design, cost, and materials used to build check dams are not available in the public domain. This review paper presents data, for the first time, on the often ignored potential of check dams to mitigate climate-induced hydrological threats. We hope that the scientific analysis presented in this paper will promote further research on check dams worldwide to better comprehend their eco-friendly significance for society.

  4. Automatic monitoring of the alignment and wear of vibration welding equipment

    DOEpatents

    Spicer, John Patrick; Cai, Wayne W.; Chakraborty, Debejyo; Mink, Keith

    2017-05-23

    A vibration welding system includes vibration welding equipment having a welding horn and anvil, a host machine, a check station, and a welding robot. At least one displacement sensor is positioned with respect to the welding equipment or the check station. When a threshold condition is met, i.e., a predetermined amount of time has elapsed or a predetermined number of welds have been completed, the robot moves the horn and anvil via an arm to the check station, activates the at least one displacement sensor at the check station, and determines a status condition of the welding equipment by processing the received signals. The status condition may be the alignment of the vibration welding equipment or the wear or degradation of the vibration welding equipment.

  5. 40 CFR 63.11224 - What are my monitoring, installation, operation, and maintenance requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... assurance or control activities (including, as applicable, calibration checks and required zero and span... applicable, calibration checks and required zero and span adjustments) do not constitute monitoring... required zero and span adjustments), you must conduct all monitoring in continuous operation at all times...

  6. 40 CFR 63.11224 - What are my monitoring, installation, operation, and maintenance requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assurance or control activities (including, as applicable, calibration checks and required zero and span... applicable, calibration checks and required zero and span adjustments) do not constitute monitoring... required zero and span adjustments), you must conduct all monitoring in continuous operation at all times...

  7. Modeling seasonal variation of hip fracture in Montreal, Canada.

    PubMed

    Modarres, Reza; Ouarda, Taha B M J; Vanasse, Alain; Orzanco, Maria Gabriela; Gosselin, Pierre

    2012-04-01

    The investigation of the association of climate variables with hip fracture incidence is important for public health. This study examined and modeled the seasonal variation of the monthly population-based hip fracture rate (HFr) time series. The seasonal ARIMA (SARIMA) time series modeling approach is used to model the monthly HFr incidence of female and male patients aged 40-74 and 75+ in Montreal, Québec, Canada, over the period 1993-2004. The correlation coefficients between meteorological variables, such as temperature, snow depth, rainfall depth, and day length, and HFr are significant. The nonparametric Mann-Kendall test for trend assessment and the nonparametric Levene's and Wilcoxon's tests for checking the difference in HFr before and after a change point are also used. The seasonality in HFr shows a sharp difference between winter and summer. The trend assessment showed decreasing trends in HFr for the female and male groups. The nonparametric test also indicated a significant change in mean HFr. A seasonal ARIMA model was applied to HFr time series without trend, and a time-trend ARIMA model (TT-ARIMA) was developed and fitted to HFr time series with a significant trend. The multi-criteria evaluation showed the adequacy of the SARIMA and TT-ARIMA models for modeling seasonal hip fracture time series with and without significant trend. In the time series analysis of HFr in the Montreal region, the effects of the seasonal variation of climate variables on hip fracture are clear. The seasonal ARIMA model is useful for modeling HFr time series without trend; for time series with a significant trend, the TT-ARIMA model should be applied.
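
    For illustration, a seasonal ARIMA fit of this kind can be set up with statsmodels; the synthetic monthly series and the orders below are placeholders (not the study's fitted model), and the deterministic trend term plays the role of the TT-ARIMA component:

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Synthetic monthly rate with annual seasonality and a weak trend,
    # standing in for the hip-fracture series.
    rng = np.random.default_rng(42)
    t = np.arange(144)
    y = 10 + 2 * np.sin(2 * np.pi * t / 12) - 0.01 * t + rng.normal(0, 0.5, len(t))
    series = pd.Series(y, index=pd.date_range("1993-01", periods=len(t), freq="MS"))

    # Seasonal ARIMA; trend="t" adds a deterministic time trend for series
    # with a significant trend (the TT-ARIMA idea).
    model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12), trend="t")
    fit = model.fit(disp=False)
    print(fit.summary().tables[1])
    print(fit.forecast(steps=12))   # one-year-ahead forecast
    ```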

  8. Verification of Compartmental Epidemiological Models using Metamorphic Testing, Model Checking and Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramanathan, Arvind; Steed, Chad A; Pullum, Laura L

    Compartmental models in epidemiology are widely used as a means to model disease spread mechanisms and to understand how one can best control the disease in case an outbreak of a widespread epidemic occurs. However, a significant challenge within the community is the development of approaches that can rigorously verify and validate these models. In this paper, we present an approach to rigorously examine and verify the behavioral properties of compartmental epidemiological models under several common modeling scenarios, including birth/death rates and multi-host/pathogen species. Using metamorphic testing, a novel visualization tool, and model checking, we build a workflow that provides insights into the functionality of compartmental epidemiological models. Our initial results indicate that metamorphic testing can be used to verify the implementation of these models and provide insights into special conditions where these mathematical models may fail. The visualization front-end allows the end user to scan through a variety of parameters commonly used in these models to elucidate the conditions under which an epidemic can occur. Further, specifying these models using a process algebra allows one to automatically construct behavioral properties that can be rigorously verified using model checking. Taken together, our approach allows for detecting implementation errors as well as handling conditions under which compartmental epidemiological models may fail to provide insights into disease spread dynamics.
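
    A minimal example of metamorphic testing applied to a compartmental model (our own toy SIR implementation and relations, not the authors' workflow): instead of an oracle for the exact trajectory, we check relations that any correct implementation must satisfy.

    ```python
    import numpy as np

    def sir(beta, gamma, s0, i0, r0, steps=200, dt=0.1):
        """Forward-Euler SIR integration (toy implementation under test)."""
        s, i, r = float(s0), float(i0), float(r0)
        n = s + i + r
        out = []
        for _ in range(steps):
            new_inf = beta * s * i / n * dt
            new_rec = gamma * i * dt
            s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
            out.append((s, i, r))
        return np.array(out)

    # Relation 1: scaling the initial population scales every compartment
    # trajectory by the same factor (frequency-dependent transmission).
    base = sir(0.3, 0.1, 990, 10, 0)
    scaled = sir(0.3, 0.1, 9900, 100, 0)
    assert np.allclose(scaled, 10 * base), "metamorphic relation violated"

    # Relation 2: with beta = 0 the infected count never increases.
    no_contact = sir(0.0, 0.1, 990, 10, 0)
    assert np.all(np.diff(no_contact[:, 1]) <= 1e-12)
    print("metamorphic checks passed")
    ```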

  9. The DaveMLTranslator: An Interface for DAVE-ML Aerodynamic Models

    NASA Technical Reports Server (NTRS)

    Hill, Melissa A.; Jackson, E. Bruce

    2007-01-01

    It can take weeks or months to incorporate a new aerodynamic model into a vehicle simulation and validate the performance of the model. The Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML) has been proposed as a means to reduce the time required to accomplish this task by defining a standard format for typical components of a flight dynamic model. The purpose of this paper is to describe an object-oriented C++ implementation of a class that interfaces a vehicle subsystem model specified in DAVE-ML and a vehicle simulation. Using the DaveMLTranslator class, aerodynamic or other subsystem models can be automatically imported and verified at run-time, significantly reducing the elapsed time between receipt of a DAVE-ML model and its integration into a simulation environment. The translator performs variable initializations, data table lookups, and mathematical calculations for the aerodynamic build-up, and executes any embedded static check-cases for verification. The implementation is efficient, enabling real-time execution. Simple interface code for the model inputs and outputs is the only requirement to integrate the DaveMLTranslator as a vehicle aerodynamic model. The translator makes use of existing table-lookup utilities from the Langley Standard Real-Time Simulation in C++ (LaSRS++). The design and operation of the translator class is described and comparisons with existing, conventional, C++ aerodynamic models of the same vehicle are given.

  10. Discrete time rescaling theorem: determining goodness of fit for discrete time statistical models of neural spiking.

    PubMed

    Haslinger, Robert; Pipa, Gordon; Brown, Emery

    2010-10-01

    One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time-rescaling theorem provides a goodness-of-fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model's spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov-Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies on assumptions of continuously defined time and instantaneous events. However, spikes have finite width, and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time-rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time-rescaling theorem that analytically corrects for the effects of finite resolution. This allows us to define a rescaled time that is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting generalized linear models to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false-positive rate of the KS test and greatly increasing the reliability of model evaluation based on the time-rescaling theorem.
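
    A sketch of the naive (uncorrected) discrete-time rescaling check, using numpy and scipy: q_t = -log(1 - p_t) approximates the integrated intensity per bin, and with coarse bins the KS test can reject even when the model generating the spikes is exact, which is precisely the bias the paper corrects. The simulation below is our own illustration.

    ```python
    import numpy as np
    from scipy.stats import kstest

    def rescaled_isis(p, spikes):
        """Rescale interspike intervals for a discrete-time model.

        p      : per-bin spike probabilities from the fitted model
        spikes : binary spike train of the same length
        The summed q between spikes plays the role of the integrated
        conditional intensity; the result should be ~Exponential(1) only
        in the fine-bin limit.
        """
        q = -np.log1p(-np.asarray(p, float))
        cum = np.cumsum(q)
        idx = np.flatnonzero(spikes)
        return np.diff(cum[idx])

    # Exact model: constant spike probability per bin.
    rng = np.random.default_rng(3)
    p = np.full(10000, 0.05)
    spikes = rng.random(10000) < p
    taus = rescaled_isis(p, spikes)
    print(kstest(taus, 'expon'))   # may reject despite the exact model
    ```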

  11. 10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...

  12. 10 CFR 35.2642 - Records of periodic spot-checks for teletherapy units.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ....2642 Section 35.2642 Energy NUCLEAR REGULATORY COMMISSION MEDICAL USE OF BYPRODUCT MATERIAL Records... must include— (1) The date of the spot-check; (2) The manufacturer's name, model number, and serial... device; (6) The determined accuracy of each distance measuring and localization device; (7) The...

  13. 12 CFR Appendix A to Part 205 - Model Disclosure Clauses and Forms

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... your checking account using information from your check to: (i) Pay for purchases. (ii) Pay bills. (3... disclose information to third parties about your account or the transfers you make: (i) Where it is...) Disclosure by government agencies of information about obtaining account balances and account histories...

  14. Using computer models to design gully erosion control structures for humid northern Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....

  15. Building Program Verifiers from Compilers and Theorem Provers

    DTIC Science & Technology

    2015-05-14

    Checking with SMT: UFO is an LLVM-based front-end (partially reused in SeaHorn) that combines abstract interpretation with interpolation-based model checking. Counter-examples are long, and it is hard to determine (from main) which assertions are relevant. (Gurfinkel, Building Verifiers from Compilers and SMT, 2015)

  16. Identifying inequities in maternal and child health through risk stratification to inform health systems strengthening in Northern Togo

    PubMed Central

    McCarthy, Katharine J.; Braganza, Sandra; Fiori, Kevin; Gbeleou, Christophe; Kpakpo, Vivien; Lopez, Andrew; Schechter, Jennifer; Singham Goodwin, Alicia; Jones, Heidi E.

    2017-01-01

    Objective In Togo, substantial progress in maternal and child health is needed to reach global development goals. To better inform clinic and community-based health services, this study identifies factors associated with maternal and child health care utilization in the Kara region of Northern Togo. Methods We conducted a population-representative household survey of four health clinic catchment areas of 1,075 women of reproductive age in 2015. Multivariable logistic regression was used to model individual and structural factors associated with utilization of four maternal and child health services. Key outcomes were: facility-based delivery, maternal postnatal health check by a health professional within the first six weeks of birth, childhood vaccination, and receipt of malaria medication for febrile children under age five within 72 hours of symptom onset. Results 83 percent of women who gave birth in the last 2 years delivered at a health facility. In adjusted models, the strongest predictor of facility delivery in the rural catchment areas was proximity to a health center, with women living under three kilometers having 3.7 (95% CI 1.7, 7.9) times the odds of a facility birth. Only 11 percent of women received a health check by a health provider at any time in the postnatal period. Postnatal health checks were less likely for women in the poorest households and for women who resided in rural areas. Children of polygamous mothers had half the odds of receiving malaria medication for fever within 72 hours of symptom onset, while children with increased household wealth status had increased odds of childhood vaccination and receiving treatment for malaria. Conclusion Our analysis highlights the importance of risk stratification analysis to inform the delivery and scope of maternal and child health programs needed to reach those with the least access to care. PMID:28301539

  17. MS Voss checks out his EVA space tools

    NASA Image and Video Library

    2001-03-09

    STS102-E-5032 (9 March 2001) --- On Discovery's mid deck, astronauts James S. Voss and Susan J. Helms (partially visible at right edge), STS-102 mission specialists, check gear associated with a scheduled space walk to perform work on the International Space Station (ISS). At the time this Flight Day 1 digital still camera image was exposed, the Discovery was on a time line to catch the orbital outpost and link with it during Flight Day 2.

  18. Arbitrary-order corrections for finite-time drift and diffusion coefficients

    NASA Astrophysics Data System (ADS)

    Anteneodo, C.; Riera, R.

    2009-09-01

    We address a standard class of diffusion processes with linear drift and quadratic diffusion coefficients. These contributions to the dynamic equations can be drawn directly from data time series. However, real data are constrained to finite sampling rates, and it is therefore crucial to establish a suitable mathematical description of the required finite-time corrections. Based on Itô-Taylor expansions, we present the exact corrections to the finite-time drift and diffusion coefficients. These results allow the reconstruction of the real hidden coefficients from the empirical estimates. We also derive higher-order finite-time expressions for the third and fourth conditional moments, which furnish extra theoretical checks for this class of diffusion models. The analytical predictions are compared with the numerical outcomes of representative artificial time series.
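
    For context, the finite-time quantities in question are the conditional moments of the increments. A numpy sketch of the naive estimators, applied to a simulated Ornstein-Uhlenbeck path with known drift and diffusion (the paper's Itô-Taylor corrections then map such finite-dt estimates back to the hidden coefficients):

    ```python
    import numpy as np

    def finite_time_coefficients(x, dt, bins=30):
        """Naive finite-time drift/diffusion estimators from a sampled path:
        conditional first and second moments of the increments per bin."""
        dx = np.diff(x)
        xc = x[:-1]
        edges = np.linspace(xc.min(), xc.max(), bins + 1)
        centers = 0.5 * (edges[:-1] + edges[1:])
        which = np.digitize(xc, edges[1:-1])
        d1 = np.full(bins, np.nan)
        d2 = np.full(bins, np.nan)
        for b in range(bins):
            sel = dx[which == b]
            if sel.size:
                d1[b] = sel.mean() / dt
                d2[b] = (sel ** 2).mean() / (2 * dt)
        return centers, d1, d2

    # Ornstein-Uhlenbeck test path: true drift -x, true diffusion 0.5.
    rng = np.random.default_rng(9)
    dt, n = 0.05, 100_000
    x = np.empty(n)
    x[0] = 0.0
    for k in range(n - 1):
        x[k + 1] = x[k] - x[k] * dt + np.sqrt(2 * 0.5 * dt) * rng.standard_normal()

    centers, d1, d2 = finite_time_coefficients(x, dt)
    j = np.argmin(np.abs(centers - 1.0))      # inspect the bin near x = 1
    print(f"x~{centers[j]:.2f}: D1~{d1[j]:.2f} (true {-centers[j]:.2f}), "
          f"D2~{d2[j]:.2f} (true 0.50)")
    ```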

  19. Exploration of Effective Persuasive Strategies Used in Resisting Product Advertising: A Case Study of Adult Health Check-Ups.

    PubMed

    Tien, Han-Kuang; Chung, Wen

    2018-05-10

    This research addressed adults' health check-ups through the lens of Role Transportation Theory. This theory is applied to narrative advertising that lures adults into seeking health check-ups by causing audiences to empathize with the advertisement's character. This study explored the persuasive mechanism behind narrative advertising and reinforced the Protection Motivation Theory model. We added two key perturbation variables: optimistic bias and truth avoidance. To complete the verification hypothesis, we performed two experiments. In Experiment 1, we recruited 77 respondents online for testing. We used analyses of variance to verify the effectiveness of narrative and informative advertising. Then, in Experiment 2, we recruited 228 respondents to perform offline physical experiments and conducted a path analysis through structural equation modelling. The findings showed that narrative advertising positively impacted participants' disease prevention intentions. The use of Role Transportation Theory in advertising enables the audience to be emotionally connected with the character, which enhances persuasiveness. In Experiment 2, we found that the degree of role transference can interfere with optimistic bias, improve perceived health risk, and promote behavioral intentions for health check-ups. Furthermore, truth avoidance can interfere with perceived health risks, which, in turn, reduce behavioral intentions for health check-ups.

  20. Time series analysis of gold production in Malaysia

    NASA Astrophysics Data System (ADS)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medical applications. In Malaysia, gold mining is limited to several areas, such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia; the model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, prediction accuracy is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitting model, with a MAPE of 3.704%, indicating that the predictions are very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
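
    The Box-Jenkins workflow described here translates directly into statsmodels; the sketch below runs the estimation, diagnostic, and forecasting steps with the paper's reported ARIMA(3,1,1) order on a synthetic series (the gold production data are not reproduced).

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(7)
    y = pd.Series(np.cumsum(rng.normal(0.5, 2.0, 120)))   # integrated series

    train, test = y[:108], y[108:]
    fit = ARIMA(train, order=(3, 1, 1)).fit()             # identification/estimation
    print(fit.summary().tables[1])                        # diagnostic checking aid

    forecast = fit.forecast(steps=len(test))              # forecasting
    mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
    print(f"MAPE = {mape:.3f}%")
    ```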

  1. Infilling and quality checking of discharge, precipitation and temperature data using a copula based approach

    NASA Astrophysics Data System (ADS)

    Anwar, Faizan; Bárdossy, András; Seidel, Jochen

    2017-04-01

    Estimating missing values in the time series of a hydrological variable is an everyday task for a hydrologist. Existing methods such as inverse distance weighting, multivariate regression, and kriging, though simple to apply, provide no indication of the quality of the estimated value and depend mainly on the values of neighboring stations at a given time step. Copulas have the advantage of representing the pure dependence structure between two or more variables (provided the relationship between them is monotonic), removing the need to transform the data before use or to specify functions that model the relationship between the variables. A copula-based approach is suggested for infilling discharge, precipitation, and temperature data. As a first step the normal copula is used; subsequently, the necessity of non-normal, non-symmetric dependence is investigated. Discharge and temperature are regular continuous variables and can be used without preprocessing for infilling and quality checking. Because precipitation values have a mixed distribution, they have to be treated differently: a discrete probability is assigned to the zeros and the rest is treated as a continuous distribution. Building on the work of others, the normal copula is also utilized, alongside infilling, to identify values in a time series that might be erroneous. This is done by treating the available value as missing, infilling it using the normal copula, and checking whether it lies within a confidence band (5 to 95% in our case) of the obtained conditional distribution. Hydrological data from two catchments, the Upper Neckar River (Germany) and the Santa River (Peru), are used to demonstrate the application to datasets of different data quality. The Python code used here is also made available on GitHub. The required input is the time series of a given variable at different stations.
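
    A compact sketch of Gaussian-copula infilling with a confidence band, using numpy/scipy on synthetic data (the authors' actual GitHub code is not reproduced here; function names and the back-transform are our own illustration):

    ```python
    import numpy as np
    from scipy.stats import norm, rankdata

    def to_normal_scores(x):
        """Empirical probability integral transform to standard normal scores."""
        u = rankdata(x) / (len(x) + 1.0)
        return norm.ppf(u)

    def conditional_normal(corr, target, obs_idx, z_obs):
        """Mean and sd of z_target given z_obs under a Gaussian copula."""
        s12 = corr[target, obs_idx]
        s22 = corr[np.ix_(obs_idx, obs_idx)]
        w = np.linalg.solve(s22, s12)
        mean = w @ z_obs
        var = 1.0 - s12 @ w
        return mean, np.sqrt(max(var, 1e-12))

    # Synthetic (time, station) matrix with skewed, discharge-like values.
    rng = np.random.default_rng(5)
    latent = rng.multivariate_normal([0, 0, 0],
                                     [[1, .8, .6], [.8, 1, .7], [.6, .7, 1]], 500)
    data = np.exp(latent)

    z = np.column_stack([to_normal_scores(data[:, j]) for j in range(3)])
    corr = np.corrcoef(z, rowvar=False)

    # Infill station 0 at time 100 from stations 1 and 2, with a 5-95% band.
    m, s = conditional_normal(corr, 0, np.array([1, 2]), z[100, [1, 2]])
    lo, hi = norm.ppf([0.05, 0.95], loc=m, scale=s)
    estimate = np.quantile(data[:, 0], norm.cdf(m))    # back-transform
    print("estimate:", estimate)
    print("observed value inside 5-95% band:", lo <= z[100, 0] <= hi)
    ```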

  2. Assessing the effects of check dams on sediment dynamics in a debris-flow catchment through SfM technique

    NASA Astrophysics Data System (ADS)

    Cucchiaro, Sara; Beinat, Alberto; Calsamiglia, Aleix; Cavalli, Marco; Cazorzi, Federico; Crema, Stefano; Marchi, Lorenzo

    2017-04-01

    The Moscardo Torrent (eastern Italian Alps) is a small rugged catchment (drainage area 4.1 km2, elevation range 890-2043 m) frequently affected by debris flows that deliver large amounts of sediment to the receiving stream and cause concern for infrastructure located on the alluvial fan and near the confluence. Over the last decades, hydraulic control works were implemented in the main channel to limit bed erosion and to stabilize channel banks. Although the objectives of the training works have been only partly achieved, check dams and hillslope stabilization works have affected the sediment transfer from hillslopes to the channels and along the main channel. The effects of the hydraulic control works were investigated by means of multi-temporal Structure from Motion (SfM) surveys based on images taken from the ground and from a UAV. The ground- and air-based surveys were carried out over a channel reach in which two check dams had recently been built. SfM surveys were taken before and after three debris-flow events (occurring between June and July 2016), allowing the generation of four high-resolution Digital Elevation Models (DEMs). Geomorphic changes caused by the debris-flow events were assessed by producing DEMs of Difference (DoDs, 0.2 m spatial resolution), which allowed estimating erosion and deposition volumes in the study area. Furthermore, a debris-flow monitoring system has been in operation in the Moscardo Torrent; the analysis of the videos and of the hydrographs recorded by ultrasonic sensors permitted assessment of the debris-flow volumes. These estimates were used to characterize the magnitude of the events in support of the topographic analysis. By examining the changing pattern of erosion and deposition over time, it was possible to understand the check dams' effects on sediment dynamics. The results show that the new check dams effectively stored sediment transported by the three debris flows. However, once the check dams had been completely filled, they lost their functionality, letting sediment flow downstream along paths drawn accidentally by the torrent control works and by the morphology of debris-flow deposits. Moreover, debris-flow lobes deposited upstream of the check dams could act as sediment sources, further increasing downstream debris-flow magnitude.
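
    The DoD computation at the core of this analysis is straightforward; a numpy sketch with an illustrative noise threshold follows (the study's treatment of SfM uncertainty is more involved than a single cutoff):

    ```python
    import numpy as np

    def dem_of_difference(dem_new, dem_old, cell_size=0.2, threshold=0.05):
        """DEM of Difference: per-cell elevation change between two surveys.

        Changes smaller than `threshold` (m) are treated as noise. Returns
        the DoD grid plus deposition and erosion volumes in cubic metres.
        """
        dod = dem_new - dem_old
        dod[np.abs(dod) < threshold] = 0.0
        cell_area = cell_size ** 2
        deposition = dod[dod > 0].sum() * cell_area
        erosion = -dod[dod < 0].sum() * cell_area
        return dod, deposition, erosion

    # Example with synthetic 0.2 m grids:
    rng = np.random.default_rng(2)
    old = rng.normal(1000, 0.5, (200, 200))
    new = old + rng.normal(0, 0.03, old.shape)
    new[50:80, 60:90] += 0.4               # a deposit upstream of a check dam
    dod, dep, ero = dem_of_difference(new, old)
    print(f"deposition: {dep:.1f} m3, erosion: {ero:.1f} m3")
    ```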

  3. Discrete Time Rescaling Theorem: Determining Goodness of Fit for Discrete Time Statistical Models of Neural Spiking

    PubMed Central

    Haslinger, Robert; Pipa, Gordon; Brown, Emery

    2010-01-01

    One approach for understanding the encoding of information by spike trains is to fit statistical models and then test their goodness of fit. The time rescaling theorem provides a goodness of fit test consistent with the point process nature of spike trains. The interspike intervals (ISIs) are rescaled (as a function of the model’s spike probability) to be independent and exponentially distributed if the model is accurate. A Kolmogorov Smirnov (KS) test between the rescaled ISIs and the exponential distribution is then used to check goodness of fit. This rescaling relies upon assumptions of continuously defined time and instantaneous events. However spikes have finite width and statistical models of spike trains almost always discretize time into bins. Here we demonstrate that finite temporal resolution of discrete time models prevents their rescaled ISIs from being exponentially distributed. Poor goodness of fit may be erroneously indicated even if the model is exactly correct. We present two adaptations of the time rescaling theorem to discrete time models. In the first we propose that instead of assuming the rescaled times to be exponential, the reference distribution be estimated through direct simulation by the fitted model. In the second, we prove a discrete time version of the time rescaling theorem which analytically corrects for the effects of finite resolution. This allows us to define a rescaled time which is exponentially distributed, even at arbitrary temporal discretizations. We demonstrate the efficacy of both techniques by fitting Generalized Linear Models (GLMs) to both simulated spike trains and spike trains recorded experimentally in monkey V1 cortex. Both techniques give nearly identical results, reducing the false positive rate of the KS test and greatly increasing the reliability of model evaluation based upon the time rescaling theorem. PMID:20608868

  4. Population Pharmacokinetics of Hydroxychloroquine in Japanese Patients With Cutaneous or Systemic Lupus Erythematosus.

    PubMed

    Morita, Shigemichi; Takahashi, Toshiya; Yoshida, Yasushi; Yokota, Naohisa

    2016-04-01

    Hydroxychloroquine (HCQ) is an effective treatment for patients with cutaneous lupus erythematosus (CLE) or systemic lupus erythematosus (SLE) and has been used for these patients in more than 70 nations. However, in Japan, HCQ has not been approved for CLE or SLE. To establish an appropriate therapeutic regimen and to clarify the pharmacokinetics (PK) of HCQ in Japanese patients with CLE with or without SLE (CLE/SLE), a population pharmacokinetic (PopPK) analysis was performed. In a clinical study of Japanese patients with a diagnosis of CLE, irrespective of the presence of SLE, blood and plasma concentration-time data from patients receiving multiple oral doses of HCQ sulfate (200-400 mg daily) were analyzed using nonlinear mixed-effects modeling software. The blood and plasma concentrations of HCQ were measured using a high-performance liquid chromatography tandem mass spectrometry method. Model evaluation and validation were performed using goodness-of-fit (GOF) plots, a visual predictive check, and a bootstrap. The PopPK of HCQ in the blood and plasma of 90 Japanese patients with CLE/SLE was well described by a 1-compartment model with first-order absorption and an absorption lag time. Body weight was a significant (P < 0.001) covariate of the oral clearance of HCQ. The final model was assessed using GOF plots, a bootstrap, and a visual predictive check, and was found to be appropriate. Simulations based on the final model suggested that the recommended daily doses of HCQ sulfate (200-400 mg), based on ideal body weight, resulted in similar concentration ranges across Japanese patients with CLE/SLE. The PopPK models derived from both blood and plasma HCQ concentrations of Japanese patients with CLE/SLE were developed and validated. Based on this study, the dosage regimens of HCQ sulfate for Japanese patients with CLE/SLE should be calculated using the individual ideal body weight.
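
    A minimal sketch of the structural model named above, a 1-compartment model with first-order absorption and an absorption lag time; the closed-form concentration expression is the textbook solution, and all parameter values are illustrative placeholders rather than the study's population estimates.

    ```python
    # 1-compartment oral model with first-order absorption and a lag time.
    # Parameter values below are illustrative placeholders only.
    import numpy as np

    def conc_1cpt_oral(t, dose, ka, cl, v, tlag, f=1.0):
        """Concentration after a single oral dose (first-order absorption, lag)."""
        ke = cl / v                          # elimination rate constant
        te = np.maximum(t - tlag, 0.0)       # time after the absorption lag
        return (f * dose * ka) / (v * (ka - ke)) * (np.exp(-ke * te) - np.exp(-ka * te))

    t = np.linspace(0, 24, 97)               # hours
    c = conc_1cpt_oral(t, dose=200.0, ka=0.8, cl=10.0, v=600.0, tlag=0.5)
    print(f"Cmax ~ {c.max():.3f} mg/L at t ~ {t[c.argmax()]:.1f} h")
    ```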

  5. The GOLM-database standard- a framework for time-series data management based on free software

    NASA Astrophysics Data System (ADS)

    Eichler, M.; Francke, T.; Kneis, D.; Reusser, D.

    2009-04-01

    Monitoring and modelling projects usually involve time series data originating from different sources, whose file formats, temporal resolution and meta-data documentation rarely adhere to a common standard. As a result, much effort is spent on converting, harmonizing, merging, checking, resampling and reformatting these data. Moreover, in work groups or over the course of time, these tasks tend to be carried out redundantly and repeatedly, especially when new data become available. The resulting duplication of data in various formats consumes additional resources. We propose a database structure and complementary scripts for facilitating these tasks. The GOLM (General Observation and Location Management) framework allows for import and storage of time series data of different types while assisting in meta-data documentation, plausibility checking and harmonization. The imported data can be visually inspected, and their coverage among locations and variables may be visualized. Supplementing scripts provide options for data export for selected stations and variables and for resampling of the data to the desired temporal resolution. These tools can, for example, be used for generating model input files or reports. Since GOLM fully supports network access, the system can be used efficiently by distributed working groups accessing the same data over the internet. GOLM's database structure and the complementary scripts can easily be customized to specific needs. All involved software, such as MySQL, R, PHP and OpenOffice, as well as the scripts for building and using the database, including documentation, are free for download. GOLM was developed out of the practical requirements of the OPAQUE project. It has been tested and further refined in the ERANET-CRUE and SESAM projects, all of which used GOLM to manage meteorological, hydrological and/or water quality data.
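
    A minimal sketch of the kind of relational layout such a framework implies (locations, variables, and measurements keyed by timestamp); the table and column names are illustrative guesses, not the published GOLM schema, and SQLite stands in here for MySQL.

    ```python
    # Toy time-series schema: one row per (location, variable, timestamp).
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE location (id INTEGER PRIMARY KEY, name TEXT, lat REAL, lon REAL);
        CREATE TABLE variable (id INTEGER PRIMARY KEY, name TEXT, unit TEXT);
        CREATE TABLE measurement (
            location_id INTEGER REFERENCES location(id),
            variable_id INTEGER REFERENCES variable(id),
            timestamp   TEXT,   -- ISO 8601 keeps sorting and resampling simple
            value       REAL,
            PRIMARY KEY (location_id, variable_id, timestamp)
        );
    """)
    con.execute("INSERT INTO location VALUES (1, 'gauge_A', 52.4, 13.0)")
    con.execute("INSERT INTO variable VALUES (1, 'precipitation', 'mm/h')")
    con.execute("INSERT INTO measurement VALUES (1, 1, '2009-04-01T12:00:00', 0.4)")
    print(con.execute("SELECT COUNT(*) FROM measurement").fetchone()[0])
    ```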

  6. Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1987-01-01

    The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural networks is investigated. Under the approximations used, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models, independent of the particular model or the order of the model. The approximations are checked numerically. The same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and that the addition of time-delayed connections allows the retrieval of context-dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns.
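
    For readers unfamiliar with the model, the following is a minimal sketch of Kanerva's SDM write/read cycle: hard locations with random addresses are activated within a Hamming radius, and bipolar counters accumulate stored patterns. The dimensions and the activation radius are illustrative choices, not values from the capacity analysis.

    ```python
    # Toy Sparse Distributed Memory: autoassociative write/read of binary words.
    import numpy as np

    rng = np.random.default_rng(1)
    n, m, radius = 256, 2000, 112            # word length, hard locations, radius
    addresses = rng.integers(0, 2, (m, n))   # random hard-location addresses
    counters = np.zeros((m, n), dtype=int)

    def activate(addr):
        """Indices of hard locations within the Hamming radius of addr."""
        return np.flatnonzero(np.count_nonzero(addresses != addr, axis=1) <= radius)

    def write(addr, data):
        counters[activate(addr)] += 2 * data - 1   # +1 for bit 1, -1 for bit 0

    def read(addr):
        return (counters[activate(addr)].sum(axis=0) >= 0).astype(int)

    pattern = rng.integers(0, 2, n)
    write(pattern, pattern)                  # autoassociative storage
    noisy = pattern.copy()
    noisy[rng.choice(n, 20, replace=False)] ^= 1   # flip 20 bits of the cue
    print("bits recovered:", np.count_nonzero(read(noisy) == pattern), "/", n)
    ```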

  7. Capacity for patterns and sequences in Kanerva's SDM as compared to other associative memory models. [Sparse, Distributed Memory

    NASA Technical Reports Server (NTRS)

    Keeler, James D.

    1988-01-01

    The information capacity of Kanerva's Sparse Distributed Memory (SDM) and Hopfield-type neural networks is investigated. Under the approximations used here, it is shown that the total information stored in these systems is proportional to the number of connections in the network. The proportionality constant is the same for the SDM and Hopfield-type models, independent of the particular model or the order of the model. The approximations are checked numerically. The same analysis can be used to show that the SDM can store sequences of spatiotemporal patterns, and that the addition of time-delayed connections allows the retrieval of context-dependent temporal patterns. A minor modification of the SDM can be used to store correlated patterns.

  8. Good health checks according to the general public; expectations and criteria: a focus group study.

    PubMed

    Stol, Yrrah H; Asscher, Eva C A; Schermer, Maartje H N

    2018-06-22

    Health checks or health screenings identify (risk factors for) disease in people without a specific medical indication. So far, the perspective of (potential) health check users has remained underexposed in discussions about the ethics and regulation of health checks. In 2017, we conducted a qualitative study with lay people from the Netherlands (four focus groups). We asked what participants consider characteristics of good and bad health checks, and whether they saw a role for the Dutch government. Participants consider a good predictive value the most important characteristic of a good health check. Information before, during and after the test, knowledgeable and reliable providers, tests for treatable (risk factors for) disease, respect for privacy, no unnecessary health risks and accessibility are also mentioned as criteria for good health checks. Participants make many assumptions about health check offers. They assume health checks provide certainty about the presence or absence of disease, that health checks offer opportunities for health benefits and that the privacy of health check data is guaranteed. In their choice of provider and test, they tend to rely more on heuristics than on information. Participants trust physicians to put the interest of potential health check users first and expect the Dutch government to intervene if providers other than physicians fail to do so by offering tests with a low predictive value, or tests that may harm people, or by infringing the privacy of users. The assumptions of participants are not always justified, but they may influence the choice to participate. This is problematic because choices for checks with a low predictive value that do not provide health benefits may create uncertainty and may cause harm to health; an outcome diametrically opposite to the one intended. It may also impair the relationship of trust with physicians and the Dutch government. To promote and protect autonomous choice and to maintain trust, we recommend the following measures to adjust false expectations in a timely manner: advertisements that give an accurate impression of health check offers, and the installation of a quality mark.

  9. Thermodynamic feature of a Brownian heat engine operating between two heat baths.

    PubMed

    Asfaw, Mesfin

    2014-01-01

    A generalized theory of nonequilibrium thermodynamics for a Brownian motor operating between two different heat baths is presented. Via a simple paradigmatic model, we not only explore the thermodynamic features of the engine in the nonequilibrium steady-state regime but also study the short-time behavior of the system, for either the isothermal case with load or, in general, the nonisothermal case with or without load. Many elegant thermodynamic theories can be checked via the present model. Furthermore, the dependence of the velocity, the efficiency, and the performance of the refrigerator on time t is examined. Our study reveals a current reversal as a function of time t. In the early relaxation period, the model works neither as a heat engine nor as a refrigerator; only after a certain period of time does it start functioning as a heat engine or as a refrigerator. The performance of the engine also improves with time, and at steady state the engine manifests a higher efficiency or refrigerator performance. Furthermore, the effect of energy exchange via the kinetic energy on the performance of the heat engine is explored.

  10. Profile local linear estimation of generalized semiparametric regression model for longitudinal data.

    PubMed

    Sun, Yanqing; Sun, Liuquan; Zhou, Jie

    2013-07-01

    This paper studies the generalized semiparametric regression model for longitudinal data where the covariate effects are constant for some covariates and time-varying for others. Different link functions can be used to allow more flexible modelling of longitudinal data. The nonparametric components of the model are estimated using a local linear estimating equation, and the parametric components are estimated through a profile estimating function. The method automatically adjusts for heterogeneity of sampling times, allowing the sampling strategy to depend on the past sampling history as well as on possibly time-dependent covariates, without specifically modelling such dependence. A [Formula: see text]-fold cross-validation bandwidth selection is proposed as a working tool for locating an appropriate bandwidth. A criterion for selecting the link function is proposed to provide a better fit of the data. Large sample properties of the proposed estimators are investigated. Large sample pointwise and simultaneous confidence intervals for the regression coefficients are constructed. Formal hypothesis testing procedures are proposed to check for the covariate effects and whether the effects are time-varying. A simulation study is conducted to examine the finite sample performance of the proposed estimation and hypothesis testing procedures. The methods are illustrated with a data example.

  11. Site investigation and modelling at "La Maina" landslide (Carnian Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Mantovani, M.; Pasuto, A.; Silvano, S.; Tagliavini, F.; Zabuski, L.; Zannoni, A.

    2006-01-01

    The Sauris reservoir is a hydroelectric basin closed downstream by a 136 m high, double-arch concrete dam. The dam is firmly anchored to consistent rock (Dolomia dello Schlern), but the Lower Triassic clayey formations, cropping out especially in the lower part of the slopes, have made the whole catchment basin increasingly prone to landslides. In recent years, the "La Maina" landslide has opened up several joints over a surface of about 100 000 m2, displacing about 1 500 000 m3 of material. Particular attention is now being given to the evolution of the unstable area, as the reservoir is located at the foot of the landslide. Under commission of the Regional Authority for Civil Protection, a numerical modelling simulation of the slope in a pseudo-time condition was developed in order to understand the risk to transport infrastructure, some houses and the reservoir, and to take urgent measures to stabilize the slope. A monitoring system consisting of four inclinometers, three wire extensometers and ten GPS benchmark pillars was immediately set up to check surface and deep displacements. The data collected and the geological and geomorphological evidence were used to carry out a numerical simulation. The reliability of the results was checked by comparing the model with the morphological evidence of the movement. The mitigation measures were designed and realised following the indications provided by the model.

  12. SU-E-T-377: Inaccurate Positioning Might Introduce Significant MapCheck Calibration Error in Flatten Filter Free Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, S; Chao, C; Columbia University, NY, NY

    2014-06-01

    Purpose: This study investigates the calibration error of detector sensitivity for MapCheck due to inaccurate positioning of the device, which is not taken into account by the current commercial iterative calibration algorithm. We hypothesize that the calibration is more vulnerable to positioning error for flattening filter free (FFF) beams than for conventional flattened beams. Methods: MapCheck2 was calibrated with 10MV conventional and FFF beams, with careful alignment and with a 1cm positioning error during calibration, respectively. Open fields of 37cmx37cm were delivered to gauge the impact of the resultant calibration errors. The local calibration error was modeled as a detector-independent multiplication factor, with which the propagation error was estimated for positioning errors from 1mm to 1cm. The calibrated sensitivities, without positioning error, were compared between the conventional and FFF beams to evaluate the dependence on beam type. Results: The 1cm positioning error leads to 0.39% and 5.24% local calibration error in the conventional and FFF beams, respectively. After propagating to the edges of MapCheck, the calibration errors become 6.5% and 57.7%, respectively. The propagation error increases almost linearly with the positioning error. The difference in sensitivities between the conventional and FFF beams was small (0.11 ± 0.49%). Conclusion: The results demonstrate that positioning error is not handled by the current commercial calibration algorithm of MapCheck. In particular, the calibration errors for FFF beams are ~9 times greater than those for conventional beams with identical positioning error, and a small 1mm positioning error might lead to up to 8% calibration error. Since the sensitivities are only slightly dependent on beam type and the conventional beam is less affected by positioning error, it is advisable to cross-check the sensitivities between the conventional and FFF beams to detect potential calibration errors due to inaccurate positioning. This work was partially supported by DOD Grant No. DOD W81XWH1010862.

  13. Impact of geographic accessibility on utilization of the annual health check-ups by income level in Japan: A multilevel analysis

    PubMed Central

    Fujita, Misuzu; Hata, Akira

    2017-01-01

    Although both geographic accessibility and socioeconomic status have been indicated as important factors for the utilization of health care services, their combined effect has not been evaluated. The aim of this study was to reveal whether an income-dependent difference in the impact of geographic accessibility on the utilization of government-led annual health check-ups exists. Existing data collected and provided by Chiba City Hall were employed and analyzed as a retrospective cohort study. The subjects were 166,966 beneficiaries of National Health Insurance in Chiba City, Japan, aged 40 to 74 years. Of all subjects, 54,748 (32.8%) had an annual health check-up in fiscal year 2012. As an optimal index of geographic accessibility has not been established, five measures were calculated: travel time to the nearest health care facility, density of health care facilities (number of facilities within a 30-min walking distance from the district of residence), and three indices based on the two-step floating catchment area method. Three-level logistic regression modeling with random intercepts for household and district of residence was performed. Of the five measures, density of health care facilities was the most compatible according to Akaike’s information criterion. Both low density and low income were associated with decreased utilization of the health check-ups. Furthermore, a linear relationship was observed between the density of facilities and utilization of the health check-ups in all income groups, and its slope was significantly steeper among subjects with an equivalent income of 0.00 yen than among those with an equivalent income of 1.01–2.00 million yen (p = 0.028) or 2.01 million yen or more (p = 0.040). This result indicated that subjects with lower incomes were more susceptible to the effects of geographic accessibility than were those with higher incomes. Thus, better geographic accessibility could increase health check-up utilization and also decrease the income-related disparity of utilization. PMID:28486522
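
    A minimal sketch of the two-step floating catchment area (2SFCA) idea behind three of the five accessibility measures: step 1 computes each facility's supply-to-demand ratio within its catchment, and step 2 sums those ratios over the facilities reachable from each district. The travel times, populations and capacities below are toy values, not study data.

    ```python
    # Toy 2SFCA accessibility calculation with a 30-minute catchment.
    import numpy as np

    # travel[i, j]: minutes from district i to facility j (toy values)
    travel = np.array([[10.0, 40.0], [25.0, 15.0], [50.0, 20.0]])
    population = np.array([5000.0, 8000.0, 3000.0])   # demand per district
    capacity = np.array([2.0, 3.0])                   # e.g. physicians per facility
    threshold = 30.0                                  # catchment: 30-min walk

    reachable = travel <= threshold
    # Step 1: each facility's ratio of capacity to the population it can serve.
    ratio = capacity / (reachable * population[:, None]).sum(axis=0)
    # Step 2: each district's accessibility = sum of ratios of reachable facilities.
    access = (reachable * ratio[None, :]).sum(axis=1)
    print(np.round(access * 1000, 3))   # accessibility per 1,000 residents
    ```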

  14. A rule-based approach to model checking of UML state machines

    NASA Astrophysics Data System (ADS)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper, a new approach to formal verification of control process specifications expressed by means of UML state machines (version 2.x) is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases assurance that the implemented system meets the user-defined requirements.
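
    The following is not the paper's nuXmv flow, just a generic illustration of what invariant model checking does: exhaustive breadth-first exploration of a toy state machine's reachable states, checking a user-defined requirement in every state. The state machine and the invariant are illustrative.

    ```python
    # Explicit-state invariant checking of a tiny (mode, counter) state machine.
    from collections import deque

    def successors(state):
        mode, n = state
        if mode == "idle":
            yield ("busy", n)
        elif mode == "busy":
            yield ("busy", n + 1) if n < 3 else ("idle", 0)
            yield ("idle", n)

    def invariant(state):
        return state[1] <= 3          # requirement: counter never exceeds 3

    def check(initial):
        seen, queue = {initial}, deque([initial])
        while queue:                  # breadth-first exploration of all states
            s = queue.popleft()
            if not invariant(s):
                return f"violation in state {s}"
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return f"invariant holds over {len(seen)} reachable states"

    print(check(("idle", 0)))
    ```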

  15. Introduction of Virtualization Technology to Multi-Process Model Checking

    NASA Technical Reports Server (NTRS)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  16. Body checking is associated with weight- and body-related shame and weight- and body-related guilt among men and women.

    PubMed

    Solomon-Krakus, Shauna; Sabiston, Catherine M

    2017-12-01

    This study examined whether body checking was a correlate of weight- and body-related shame and guilt for men and women. Participants were 537 adults (386 women) between the ages of 17 and 74 (M age = 28.29, SD = 14.63). Preliminary analyses showed women reported significantly more body checking (p < .001), weight- and body-related shame (p < .001), and weight- and body-related guilt (p < .001) than men. In sex-stratified hierarchical linear regression models, body checking was significantly and positively associated with weight- and body-related shame (R² = .29 and .43, p < .001) and weight- and body-related guilt (R² = .34 and .45, p < .001) for men and women, respectively. Based on these findings, body checking is associated with negative weight- and body-related self-conscious emotions. Intervention and prevention efforts aimed at reducing negative weight- and body-related self-conscious emotions should consider focusing on body checking for adult men and women. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    The epidemic-type aftershock sequence (ETAS) models have been used experimentally to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake, the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region reproduced typical features of the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
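
    For reference, a textbook form of the ETAS conditional intensity (parameterisations vary, and this is not necessarily the exact form used by OEF-Italy):

    ```latex
    % Standard ETAS conditional intensity (reference form only):
    \[
      \lambda(t, x, y) \;=\; \mu(x, y) \;+\; \sum_{i:\, t_i < t}
      \frac{K \, e^{\alpha (M_i - M_0)}}{(t - t_i + c)^{p}} \,
      f(x - x_i,\, y - y_i;\, M_i)
    \]
    % \mu: background rate; t_i, M_i: past event times and magnitudes;
    % K, \alpha, c, p: Omori-Utsu productivity and decay parameters;
    % f: spatial aftershock kernel; M_0: reference magnitude.
    ```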

  18. Towards component-based validation of GATE: aspects of the coincidence processor

    PubMed Central

    Moraes, Eder R.; Poon, Jonathan K.; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D.

    2014-01-01

    GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to “ground truth” obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the “multiple window method”), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the “single window method”). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. PMID:25240897
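
    A minimal sketch of the "single window method" described above: a window opens at the first single after the previous window closes, and the events inside it are checked for a coincidence. The timestamps, window width, and the policy of discarding multiple coincidences are illustrative simplifications, not the GATE, TMRU, or UC Davis implementations.

    ```python
    # Single-window coincidence sorting over a sorted list of singles.
    def single_window_coincidences(times, tau):
        """times: sorted singles timestamps; tau: coincidence window width."""
        pairs, i = [], 0
        while i < len(times):
            window_end = times[i] + tau
            j = i + 1
            inside = []
            while j < len(times) and times[j] <= window_end:
                inside.append(times[j])
                j += 1
            if len(inside) == 1:      # exactly two singles form a coincidence;
                pairs.append((times[i], inside[0]))
            # multiples are discarded here; real scanners use various policies
            i = j                     # the next window opens after this one
        return pairs

    singles = [0.0, 1.2, 5.0, 5.3, 9.0, 9.1, 9.2]   # microseconds, toy data
    print(single_window_coincidences(singles, tau=0.5))
    ```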

  19. A simple solution for improving reliability of cardiac arrest equipment provision in hospital.

    PubMed

    Davies, Michelle; Couper, Keith; Bradley, Julie; Baker, Annalie; Husselbee, Natalie; Woolley, Sarah; Davies, Robin P; Perkins, Gavin D

    2014-11-01

    Effective and safe cardiac arrest care in the hospital setting is reliant on the immediate availability of emergency equipment. The patient safety literature highlights deficiencies in current approaches to resuscitation equipment provision, underlining the need for innovative solutions to this problem. We conducted a before-after study at a large NHS trust to evaluate the effect of a sealed tray system and database on resuscitation equipment provision. The system was evaluated by a series of unannounced inspections to assess resuscitation trolley compliance with local policy prior to and following system implementation. The time taken to check trolleys was assessed by timing clinicians checking both types of trolley in a simulation setting. The sealed tray system was implemented in 2010 and led to a significant increase in the number of resuscitation trolleys without missing, surplus, or expired items (2009: n=1 (4.76%) vs 2011: n=37 (100%), p<0.001). It also significantly reduced the time required to check each resuscitation trolley in the simulation setting (12.86 min (95% CI: 10.02-15.71) vs 3.15 min (95% CI: 1.19-4.51), p<0.001), but had no effect on the number of resuscitation trolleys checked every day over the previous month (2009: n=8 (38.10%) vs 2011: n=11 (29.73%), p=0.514). The implementation of a sealed tray system led to a significant and sustained improvement in resuscitation equipment provision, but had no effect on resuscitation trolley checking frequency. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Regulatory Conformance Checking: Logic and Logical Form

    ERIC Educational Resources Information Center

    Dinesh, Nikhil

    2010-01-01

    We consider the problem of checking whether an organization conforms to a body of regulation. Conformance is studied in a runtime verification setting. The regulation is translated to a logic, from which we synthesize monitors. The monitors are evaluated as the state of an organization evolves over time, raising an alarm if a violation is…

  1. 5 CFR 1655.17 - Prepayment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... participant may repay a loan in full, without a penalty, at any time before the declaration of a taxable... means receipt by the TSP record keeper of a payment, by personal check or guaranteed funds made payable... returns a loan check to the TSP record keeper, it will be treated as a repayment; however, additional...

  2. Time Series Forecasting of the Number of Malaysia Airlines and AirAsia Passengers

    NASA Astrophysics Data System (ADS)

    Asrah, N. M.; Nor, M. E.; Rahim, S. N. A.; Leng, W. K.

    2018-04-01

    The standard practice in forecasting involves fitting a model and then analysing the residuals. Knowing the distributional behaviour of the time series data helps in directly addressing model identification, parameter estimation, and model checking. In this paper, we compare the distributional behaviour of the numbers of Malaysia Airlines (MAS) and AirAsia passengers. Previous research found that the AirAsia passenger series is governed by geometric Brownian motion (GBM): the data were normally distributed, stationary and independent. GBM was therefore used to forecast the number of AirAsia passengers. The same methods were applied to the MAS data and the results compared. The MAS data, however, were not governed by GBM, so the standard approach to time series forecasting was applied to them. From this comparison, we conclude that AirAsia passenger numbers are consistently at peak-season levels, unlike those of MAS.
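
    A minimal sketch of GBM-based forecasting as applied to the AirAsia series; the drift and volatility values are placeholders that would normally be estimated from the log-returns of the historical passenger series.

    ```python
    # Monte Carlo GBM forecast: S(t) = S0 * exp((mu - sigma^2/2) t + sigma W(t)).
    import numpy as np

    rng = np.random.default_rng(42)
    s0, mu, sigma = 1_000_000.0, 0.05, 0.10    # last observation, drift, volatility
    horizon, n_paths = 12, 1000                # months ahead, simulated paths

    dt = 1.0 / 12.0
    z = rng.standard_normal((n_paths, horizon))
    log_steps = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    paths = s0 * np.exp(np.cumsum(log_steps, axis=1))

    forecast = paths.mean(axis=0)              # pointwise mean forecast
    print(f"12-month-ahead mean forecast: {forecast[-1]:,.0f} passengers")
    ```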

  3. Practical Results from the Application of Model Checking and Test Generation from UML/SysML Models of On-Board Space Applications

    NASA Astrophysics Data System (ADS)

    Faria, J. M.; Mahomad, S.; Silva, N.

    2009-05-01

    The deployment of complex safety-critical applications requires rigorous techniques and powerful tools for both the development and V&V stages. Model-based technologies are increasingly being used to develop safety-critical software and, arguably, turning to them can bring significant benefits to such processes, along with new challenges. This paper presents the results of a research project in which we extended current V&V methodologies to UML/SysML models, aiming to answer the demands related to validation issues. Two quite different but complementary approaches were investigated: (i) model checking and (ii) the extraction of robustness test cases from the same models. These two approaches do not overlap, and when combined they provide a wider-reaching model/design validation ability than either one alone, thus offering improved safety assurance. Results are very encouraging, even though they either fell short of the desired outcome, as for model checking, or do not yet appear fully matured, as for robustness test case extraction. In the case of model checking, we verified that the automatic model validation process can become fully operational, and even expanded in scope, once tool vendors help (inevitably) to improve the XMI standard interoperability situation. The robustness test case extraction methodology produced interesting early results but needs further systematisation and consolidation in order to produce results in a more predictable fashion and to reduce reliance on experts' heuristics. Finally, further improvements and follow-on research directions were immediately apparent for both investigated approaches, pointing on one hand to circumventing current limitations in XMI interoperability, and on the other to bringing test case specification onto the same graphical level as the models themselves and then automating the generation of executable test cases from standard UML notation.

  4. 75 FR 52482 - Airworthiness Directives; PILATUS Aircraft Ltd. Model PC-7 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-26

    ..., check the airplane maintenance records to determine if the left and/or right aileron outboard bearing... an entry is found during the airplane maintenance records check required in paragraph (f)(1) of this...-0849; Directorate Identifier 2010-CE-043-AD] RIN 2120-AA64 Airworthiness Directives; PILATUS Aircraft...

  5. 77 FR 50644 - Airworthiness Directives; Cessna Airplane Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... airplanes that have P/N 1134104-1 or 1134104-5 A/C compressor motor installed; an aircraft logbook check for... following: (1) Inspect the number of hours on the A/C compressor hour meter; and (2) Check the aircraft.... Do the replacement following Cessna Aircraft Company Model 525 Maintenance Manual, Revision 23, dated...

  6. CCM-C, Collins checks the middeck experiment

    NASA Image and Video Library

    1999-07-24

    S93-E-5016 (23 July 1999) --- Astronaut Eileen M. Collins, mission commander, checks on an experiment on Columbia's middeck during Flight Day 1 activity. The experiment is called the Cell Culture Model, Configuration C. Its objectives are to validate cell culture models for muscle, bone and endothelial cell biochemical and functional loss induced by microgravity stress; to evaluate cytoskeleton, metabolism, membrane integrity and protease activity in target cells; and to test tissue-loss pharmaceuticals for efficacy. The photo was recorded with an electronic still camera (ESC).

  7. Markov Mixed Effects Modeling Using Electronic Adherence Monitoring Records Identifies Influential Covariates to HIV Preexposure Prophylaxis.

    PubMed

    Madrasi, Kumpal; Chaturvedula, Ayyappa; Haberer, Jessica E; Sale, Mark; Fossler, Michael J; Bangsberg, David; Baeten, Jared M; Celum, Connie; Hendrix, Craig W

    2017-05-01

    Adherence is a major factor in the effectiveness of preexposure prophylaxis (PrEP) for HIV prevention. Modeling patterns of adherence helps to identify influential covariates of different types of adherence as well as to enable clinical trial simulation so that appropriate interventions can be developed. We developed a Markov mixed-effects model to understand the covariates influencing adherence patterns to daily oral PrEP. Electronic adherence records (date and time of medication bottle cap opening) from the Partners PrEP ancillary adherence study with a total of 1147 subjects were used. This study included once-daily dosing regimens of placebo, oral tenofovir disoproxil fumarate (TDF), and TDF in combination with emtricitabine (FTC), administered to HIV-uninfected members of serodiscordant couples. One-coin and first- to third-order Markov models were fit to the data using NONMEM® 7.2. Model selection criteria included objective function value (OFV), Akaike information criterion (AIC), visual predictive checks, and posterior predictive checks. Covariates were included based on forward addition (α = 0.05) and backward elimination (α = 0.001). Markov models better described the data than 1-coin models. A third-order Markov model gave the lowest OFV and AIC, but the simpler first-order model was used for covariate model building because no additional benefit on prediction of target measures was observed for higher-order models. Female sex and older age had a positive impact on adherence, whereas Sundays, sexual abstinence, and sex with a partner other than the study partner had a negative impact on adherence. Our findings suggest adherence interventions should consider the role of these factors. © 2016, The American College of Clinical Pharmacology.
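
    A minimal sketch of a first-order Markov adherence model of the kind described: today's dose-taking probability depends only on yesterday's outcome. The transition probabilities are illustrative, not the study's estimates, and covariate effects are omitted.

    ```python
    # Two-state (take/miss) first-order Markov simulation of daily dosing.
    import numpy as np

    rng = np.random.default_rng(7)
    p_take_after_take = 0.95     # P(take today | took yesterday), illustrative
    p_take_after_miss = 0.60     # P(take today | missed yesterday), illustrative

    def simulate_adherence(days):
        taken, prev = [], 1                   # assume the day-0 dose was taken
        for _ in range(days):
            p = p_take_after_take if prev else p_take_after_miss
            prev = int(rng.random() < p)
            taken.append(prev)
        return np.array(taken)

    record = simulate_adherence(365)
    print(f"overall adherence: {record.mean():.1%}")
    ```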

  8. Robust Linear Models for Cis-eQTL Analysis.

    PubMed

    Rantalainen, Mattias; Lindgren, Cecilia M; Holmes, Christopher C

    2015-01-01

    Expression Quantitative Trait Loci (eQTL) analysis enables characterisation of functional genetic variation influencing expression levels of individual genes. In outbred populations, including humans, eQTLs are commonly analysed using the conventional linear model, adjusting for relevant covariates, assuming an allelic dosage model and a Gaussian error term. However, gene expression data generally have noise that induces heavy-tailed errors relative to the Gaussian distribution and often include atypical observations, or outliers. Such departures from modelling assumptions can lead to an increased rate of type II errors (false negatives) and, to some extent, also type I errors (false positives). Careful model checking can reduce the risk of type I errors but often not type II errors, since it is generally too time-consuming to carefully check all models with a non-significant effect in large-scale and genome-wide studies. Here we propose the application of a robust linear model for eQTL analysis to reduce adverse effects of deviations from the assumption of Gaussian residuals. We present results from a simulation study as well as results from the analysis of real eQTL data sets. Our findings suggest that in many situations robust models have the potential to provide more reliable eQTL results compared to conventional linear models, particularly with respect to reducing type II errors due to non-Gaussian noise. Post-genomic data, such as those generated in genome-wide eQTL studies, are often noisy and frequently contain atypical observations. Robust statistical models have the potential to provide more reliable results and increased statistical power under non-Gaussian conditions. The results presented here suggest that robust models should be considered routinely alongside other commonly used methodologies for eQTL analysis.
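
    A minimal sketch contrasting OLS with a Huber-type robust fit under the heavy-tailed noise scenario described above, using statsmodels; the simulated allelic-dosage design and effect size are illustrative, not drawn from the paper's data.

    ```python
    # OLS vs. robust (Huber) regression on heavy-tailed simulated eQTL data.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    genotype = rng.integers(0, 3, 200).astype(float)   # allelic dosage 0/1/2
    noise = rng.standard_t(df=2, size=200)             # heavy-tailed errors
    expression = 0.5 * genotype + noise                # true effect = 0.5

    X = sm.add_constant(genotype)
    ols = sm.OLS(expression, X).fit()
    rlm = sm.RLM(expression, X, M=sm.robust.norms.HuberT()).fit()
    print(f"OLS effect: {ols.params[1]:.3f}, robust effect: {rlm.params[1]:.3f}")
    ```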

  9. An Advanced Computational Approach to System of Systems Analysis & Architecting Using Agent-Based Behavioral Model: Phase 2

    DTIC Science & Technology

    2013-11-18

    for each valid interface between the systems. The factor is proportional to the count of feasible interfaces in the meta-architecture framework... proportional to the square root of the sector area being covered by each type of system, plus some time for transmitting data to, and double checking by, the...[22] J.-H. Ahn, "An Architecture Description Method for Acknowledged Systems of Systems based on Federated Architecture," in Advanced Science and

  10. Ammunition Resupply Model. Volume II. Programmers Manual.

    DTIC Science & Technology

    1980-03-01

    pointer tables. If the placement is successful the flag (ICHECK) is set equal to 1. COMMON BLOCKS: EVENTS CALLS: NONE IS CALLED BY: SCHED CALLING PARAMETERS...decimal portion of the event time multiplied by 3600. ICHECK - 0 if no room on the file, 1 if there is room on the file. LOCAL ARRAYS: JFORE (1024...'EVT, ITH, IHS, ICHECK) C PUTEVT PLACES AN EVENT RECORD IN THE QUEUE IN CHRONOLOGICAL C ORDER AND UPDATES THE QUEUE DIRECTORY. ICHECK FLAG SET C IF

  11. Stakeout surveys for check dams in gullied areas by using the FreeXSap photogrammetric method

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; Marín-Moreno, Víctor; Taguas, Encarnación V.

    2017-04-01

    Prior to any check dam construction work, it is necessary to carry out field stakeout surveys to define the layout of the dam series according to spacing criteria. While accurate measurement techniques (e.g. differential GPS) might be justified in expensive and complex settings, simpler methodologies can be more cost-efficient for the small to medium-sized check dams typical of areas affected by gully erosion. Innovative 3D photogrammetric techniques based on Structure-from-Motion (SfM) algorithms have proved useful across different geomorphological applications and have been successfully applied to gully assessment. In this communication, we present an efficient methodology consisting of the application of a free interface for photogrammetric reconstruction (FreeXSap), combined with simple distance measurements, to obtain channel cross-sections and thereby determine the width and height of the check dam for a particular cross-section. We illustrate its use for a hundred-meter-long gully under conventional agriculture in Córdoba (Spain). FreeXSap is an easy-to-use graphical user interface written in Matlab code (Mathworks, 2016) for the reconstruction of 3D models from image sets taken with consumer-grade digital cameras. The SfM algorithms are based on MicMac scripts (Pierrot-Deseilligny and Cléry, 2011), along with routines specifically developed for the orientation, determination and geometrical analysis of cross-sections. It only requires the camera operator to collect a few pictures of a channel cross-section (normally fewer than 5) to build an accurate 3D model, while a second operator holds a pole in a vertical position (with the help of a bubble level attached to the pole) to provide orientation and scale for further processing. The spacing between check dams was determined using the head-to-toe rule with a clinometer app on a smartphone. In this work we evaluate the results of the application of this methodology in terms of time and cost requirements, and the capabilities and operation procedure of FreeXSap are presented. This tool will be available for free download. REFERENCES: Pierrot-Deseilligny, M. and Cléry, I. (2011). APERO, an Open Source Bundle Adjustment Software for Automatic Calibration and Orientation of a Set of Images. Proceedings of the ISPRS Commission V Symposium, Image Engineering and Vision Metrology, Trento, Italy, 2-4 March 2011.
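
    A minimal sketch of the head-to-toe spacing rule mentioned above, in which the deposit behind one dam (at its deposition gradient) reaches the crest of the next dam upstream; the formula is a standard geometric approximation and the values are illustrative, not FreeXSap code.

    ```python
    # Head-to-toe check dam spacing: spacing = H / (tan(channel) - tan(deposit)).
    import math

    def check_dam_spacing(height_m, channel_slope_deg, deposit_slope_deg):
        """Horizontal spacing so one dam's deposit reaches the next crest.

        Assumes the channel is steeper than the deposit (s > g)."""
        s = math.tan(math.radians(channel_slope_deg))
        g = math.tan(math.radians(deposit_slope_deg))
        return height_m / (s - g)

    # e.g. 2 m effective dam height, 12 deg channel, 4 deg deposition gradient
    print(f"spacing ~ {check_dam_spacing(2.0, 12.0, 4.0):.1f} m")
    ```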

  12. Using LDPC Code Constraints to Aid Recovery of Symbol Timing

    NASA Technical Reports Server (NTRS)

    Jones, Christopher; Villasnor, John; Lee, Dong-U; Vales, Esteban

    2008-01-01

    A method of utilizing information available in the constraints imposed by a low-density parity-check (LDPC) code has been proposed as a means of aiding the recovery of symbol timing in the reception of a binary-phase-shift-keying (BPSK) signal representing such a code in the presence of noise, timing error, and/or Doppler shift between the transmitter and the receiver. This method and the receiver architecture in which it would be implemented belong to a class of timing-recovery methods and corresponding receiver architectures characterized as pilotless in that they do not require transmission and reception of pilot signals. Acquisition and tracking of a signal of the type described above have traditionally been performed upstream of, and independently of, decoding and have typically involved utilization of a phase-locked loop (PLL). However, the LDPC decoding process, which is iterative, provides information that can be fed back to the timing-recovery receiver circuits to improve performance significantly over that attainable in the absence of such feedback. Prior methods of coupling LDPC decoding with timing recovery had focused on the use of output code words produced as the iterations progress. In contrast, in the present method, one exploits the information available from the metrics computed for the constraint nodes of an LDPC code during the decoding process. In addition, the method involves the use of a waveform model that captures, better than do the waveform models of the prior methods, distortions introduced by receiver timing errors and transmitter/ receiver motions. An LDPC code is commonly represented by use of a bipartite graph containing two sets of nodes. In the graph corresponding to an (n,k) code, the n variable nodes correspond to the code word symbols and the n-k constraint nodes represent the constraints that the code places on the variable nodes in order for them to form a valid code word. The decoding procedure involves iterative computation of values associated with these nodes. A constraint node represents a parity-check equation using a set of variable nodes as inputs. A valid decoded code word is obtained if all parity-check equations are satisfied. After each iteration, the metrics associated with each constraint node can be evaluated to determine the status of the associated parity check. Heretofore, normally, these metrics would be utilized only within the LDPC decoding process to assess whether or not variable nodes had converged to a codeword. In the present method, it is recognized that these metrics can be used to determine accuracy of the timing estimates used in acquiring the sampled data that constitute the input to the LDPC decoder. In fact, the number of constraints that are satisfied exhibits a peak near the optimal timing estimate. Coarse timing estimation (or first-stage estimation as described below) is found via a parametric search for this peak. The present method calls for a two-stage receiver architecture illustrated in the figure. The first stage would correct large time delays and frequency offsets; the second stage would track random walks and correct residual time and frequency offsets. In the first stage, constraint-node feedback from the LDPC decoder would be employed in a search algorithm in which the searches would be performed in successively narrower windows to find the correct time delay and/or frequency offset. 
The second stage would include a conventional first-order PLL with a decision-aided timing-error detector that would utilize, as its decision aid, decoded symbols from the LDPC decoder. The method has been tested by means of computational simulations in cases involving various timing and frequency errors. The results of the simulations were compared with those obtained in the ideal case of perfect timing in the receiver.
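
    A minimal sketch of the constraint-node metric idea, reduced to a frame-alignment analogue: the number of satisfied parity-check equations of hard-decision bits peaks when the decoder's window is aligned with the codeword. The small systematic parity-check matrix is a toy stand-in for a real LDPC code, and the bit-level offset search stands in for the symbol-timing search described above.

    ```python
    # Count satisfied parity checks at each candidate alignment; pick the peak.
    import numpy as np

    rng = np.random.default_rng(5)
    k, r = 12, 12                                # message bits, parity checks
    A = rng.integers(0, 2, (r, k))
    H = np.hstack([A, np.eye(r, dtype=int)])     # systematic H = [A | I]

    m = rng.integers(0, 2, k)
    codeword = np.concatenate([m, (A @ m) % 2])  # valid codeword: H @ c = 0 mod 2

    # Embed the codeword at an unknown offset in a stream of random bits.
    true_offset = 7
    stream = np.concatenate([rng.integers(0, 2, true_offset), codeword,
                             rng.integers(0, 2, 9)])

    scores = [int(np.sum((H @ stream[o:o + k + r]) % 2 == 0))
              for o in range(len(stream) - (k + r) + 1)]
    print(f"estimated offset: {int(np.argmax(scores))} (true: {true_offset})")
    ```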

  13. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models produced during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (the enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using a specific modeling language.

  14. Global sensitivity analysis, probabilistic calibration, and predictive assessment for the data assimilation linked ecosystem carbon model

    DOE PAGES

    Safta, C.; Ricciuto, Daniel M.; Sargsyan, Khachik; ...

    2015-07-01

    In this paper we propose a probabilistic framework for an uncertainty quantification (UQ) study of a carbon cycle model and focus on the comparison between steady-state and transient simulation setups. A global sensitivity analysis (GSA) study indicates the parameters and parameter couplings that are important at different times of the year for quantities of interest (QoIs) obtained with the data assimilation linked ecosystem carbon (DALEC) model. We then employ a Bayesian approach and a statistical model error term to calibrate the parameters of DALEC using net ecosystem exchange (NEE) observations at the Harvard Forest site. The calibration results are employed in the second part of the paper to assess the predictive skill of the model via posterior predictive checks.
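
    A minimal sketch of a posterior predictive check in the spirit described above: posterior parameter draws are pushed through a toy stand-in model plus the calibrated error term, and the observation is located within the resulting predictive distribution. All numbers are illustrative, not DALEC outputs.

    ```python
    # Posterior predictive check: where does the observation fall among replicates?
    import numpy as np

    rng = np.random.default_rng(2)
    theta_post = rng.normal(1.2, 0.1, 2000)      # stand-in posterior draws
    sigma_err = 0.3                               # calibrated model-error scale

    def model(theta, t):
        return theta * np.exp(-0.1 * t)           # toy stand-in for a NEE output

    t_obs, y_obs = 5.0, 0.65
    y_rep = model(theta_post, t_obs) + rng.normal(0, sigma_err, theta_post.size)
    p_value = np.mean(y_rep >= y_obs)             # posterior predictive p-value
    print(f"posterior predictive p-value: {p_value:.2f}")
    ```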

  15. Normalized coffin-manson plot in terms of a new life function based on stress relaxation under creep-fatigue conditions

    NASA Astrophysics Data System (ADS)

    Jeong, Chang Yeol; Nam, Soo Woo; Lim, Jong Dae

    2003-04-01

    A new life prediction function, based on a model formulated in terms of stress relaxation during hold time under creep-fatigue conditions, is proposed. From the idea that the reduction in fatigue life with hold time is due to the creep effect of stress relaxation, which results in additional energy dissipation in the hysteresis loop, it is suggested that the relaxed stress range may serve as a creep-fatigue damage function. Creep-fatigue data from the present and other investigators are used to check the validity of the proposed life prediction equation. It is shown that the data satisfy the applicability of the life relation model. Accordingly, using this life prediction model, one may see that all the Coffin-Manson plots at various levels of hold time in strain-controlled creep-fatigue tests can be normalized onto one straight line.
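
    For reference, the standard Coffin-Manson relation that the normalized plots are built on (this is the textbook form, not the authors' new hold-time life function):

    ```latex
    % Standard Coffin-Manson low-cycle fatigue relation (reference form only;
    % the paper's life function modifies this using the relaxed stress range).
    \[
      \frac{\Delta \varepsilon_p}{2} \;=\; \varepsilon_f' \,(2N_f)^{c}
    \]
    % \Delta\varepsilon_p: plastic strain range; N_f: cycles to failure;
    % \varepsilon_f': fatigue ductility coefficient; c: fatigue ductility exponent.
    ```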

  16. Transforming hemoglobin measurement in trauma patients: noninvasive spot check hemoglobin.

    PubMed

    Joseph, Bellal; Pandit, Viraj; Aziz, Hassan; Kulvatunyou, Narong; Zangbar, Bardiya; Tang, Andrew; O' Keeffe, Terence; Jehangir, Qasim; Snyder, Kara; Rhee, Peter

    2015-01-01

    Technological advances now allow for noninvasive Hgb measurements. Previous studies have reported on the efficacy of continuous noninvasive Hgb devices. Recently, a new device, Pronto-7, a spot check pulse CO-oximeter, has become available. The aim of our study was to assess noninvasive Hgb measurement in trauma patients. We performed a prospective cohort analysis of all trauma patients presenting to our Level I trauma center. Invasive Hgb and spot check Hgb measurements were obtained simultaneously at presentation. Spot check Hgb was measured 2 times with each invasive Hgb value. Normal Hgb was defined as >8 g/dL. Spearman correlation and Bland-Altman analysis were performed. A total of 525 patients had attempted spot check Hgb measurements, with a success rate of 86% (n = 450). A total of 450 invasive and 1,350 spot check Hgb measurements were obtained. Mean ± SD age was 41 ± 21 years, 74% were male, and mean Injury Severity Score was 21 ± 13. Thirty-eight percent (n = 173) of patients had Hgb ≤8 g/dL at presentation. Mean invasive Hgb was 11.5 ± 4.36 g/dL, mean spot check Hgb was 11.1 ± 3.60 g/dL, and the mean difference was 0.3 ± 1.3 g/dL. Spot check Hgb values had strong correlation with invasive Hgb measurements (R² = 0.77; R = 0.86; p = 0.04), with 76% accuracy and 95.4% sensitivity. Spot check Hgb monitoring had excellent correlation with invasive Hgb measurements. Spot check has greater clinical utility compared with previous continuous Hgb monitoring. This novel technology allows immediate and accurate Hgb measurements in trauma patients. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  17. The current and potential health benefits of the National Health Service Health Check cardiovascular disease prevention programme in England: A microsimulation study

    PubMed Central

    Jackson, Christopher; Steinacher, Arno; Goodman, Anna; Langenberg, Claudia; Griffin, Simon

    2018-01-01

    Background The National Health Service (NHS) Health Check programme was introduced in 2009 in England to systematically assess all adults in midlife for cardiovascular disease risk factors. However, its current benefit and impact on health inequalities are unknown. It is also unclear whether feasible changes in how it is delivered could result in increased benefits. It is one of the first such programmes in the world. We sought to estimate the health benefits and effect on inequalities of the current NHS Health Check programme and the impact of making feasible changes to its implementation. Methods and findings We developed a microsimulation model to estimate the health benefits (incident ischaemic heart disease, stroke, dementia, and lung cancer) of the NHS Health Check programme in England. We simulated a population of adults in England aged 40–45 years and followed until age 100 years, using data from the Health Survey of England (2009–2012) and the English Longitudinal Study of Aging (1998–2012), to simulate changes in risk factors for simulated individuals over time. We used recent programme data to describe uptake of NHS Health Checks and of 4 associated interventions (statin medication, antihypertensive medication, smoking cessation, and weight management). Estimates of treatment efficacy and adherence were based on trial data. We estimated the benefits of the current NHS Health Check programme compared to a healthcare system without systematic health checks. This counterfactual scenario models the detection and treatment of risk factors that occur within ‘routine’ primary care. We also explored the impact of making feasible changes to implementation of the programme concerning eligibility, uptake of NHS Health Checks, and uptake of treatments offered through the programme. We estimate that the NHS Health Check programme prevents 390 (95% credible interval 290 to 500) premature deaths before 80 years of age and results in an additional 1,370 (95% credible interval 1,100 to 1,690) people being free of disease (ischaemic heart disease, stroke, dementia, and lung cancer) at age 80 years per million people aged 40–45 years at baseline. Over the life of the cohort (i.e., followed from 40–45 years to 100 years), the changes result in an additional 10,000 (95% credible interval 8,200 to 13,000) quality-adjusted life years (QALYs) and an additional 9,000 (6,900 to 11,300) years of life. This equates to approximately 300 fewer premature deaths and 1,000 more people living free of these diseases each year in England. We estimate that the current programme is increasing QALYs by 3.8 days (95% credible interval 3.0–4.7) per head of population and increasing survival by 3.3 days (2.5–4.1) per head of population over the 60 years of follow-up. The current programme has a greater absolute impact on health for those living in the most deprived areas compared to those living in the least deprived areas (4.4 [2.7–6.5] days of additional quality-adjusted life per head of population versus 2.8 [1.7–4.0] days; 5.1 [3.4–7.1] additional days lived per head of population versus 3.3 [2.1–4.5] days). Making feasible changes to the delivery of the existing programme could result in a sizable increase in the benefit. 
For example, a strategy that combines extending eligibility to those with preexisting hypertension, extending the upper age of eligibility to 79 years, increasing uptake of health checks by 30%, and increasing treatment rates 2.5-fold amongst eligible patients (i.e., ‘maximum potential’ scenario) results in at least a 3-fold increase in benefits compared to the current programme (1,360 premature deaths versus 390; 5,100 people free of 1 of the 4 diseases versus 1,370; 37,000 additional QALYs versus 10,000; 33,000 additional years of life versus 9,000). Ensuring those who are assessed and eligible for statins receive statins is a particularly important strategy to increase benefits. Estimates of overall benefit are based on current incidence and management, and future declines in disease incidence or improvements in treatment could alter the actual benefits observed in the long run. We have focused on the cardiovascular element of the NHS Health Check programme. Some important noncardiovascular health outcomes (e.g., chronic obstructive pulmonary disease [COPD] prevention from smoking cessation and cancer prevention from weight loss) and other parts of the programme (e.g., brief interventions to reduce harmful alcohol consumption) have not been modelled. Conclusions Our model indicates that the current NHS Health Check programme is contributing to improvements in health and reducing health inequalities. Feasible changes in the organisation of the programme could result in more than a 3-fold increase in health benefits. PMID:29509767

  18. The current and potential health benefits of the National Health Service Health Check cardiovascular disease prevention programme in England: A microsimulation study.

    PubMed

    Mytton, Oliver T; Jackson, Christopher; Steinacher, Arno; Goodman, Anna; Langenberg, Claudia; Griffin, Simon; Wareham, Nick; Woodcock, James

    2018-03-01

    The National Health Service (NHS) Health Check programme was introduced in 2009 in England to systematically assess all adults in midlife for cardiovascular disease risk factors. However, its current benefit and impact on health inequalities are unknown. It is also unclear whether feasible changes in how it is delivered could result in increased benefits. It is one of the first such programmes in the world. We sought to estimate the health benefits and effect on inequalities of the current NHS Health Check programme and the impact of making feasible changes to its implementation. We developed a microsimulation model to estimate the health benefits (incident ischaemic heart disease, stroke, dementia, and lung cancer) of the NHS Health Check programme in England. We simulated a population of adults in England aged 40-45 years and followed until age 100 years, using data from the Health Survey of England (2009-2012) and the English Longitudinal Study of Aging (1998-2012), to simulate changes in risk factors for simulated individuals over time. We used recent programme data to describe uptake of NHS Health Checks and of 4 associated interventions (statin medication, antihypertensive medication, smoking cessation, and weight management). Estimates of treatment efficacy and adherence were based on trial data. We estimated the benefits of the current NHS Health Check programme compared to a healthcare system without systematic health checks. This counterfactual scenario models the detection and treatment of risk factors that occur within 'routine' primary care. We also explored the impact of making feasible changes to implementation of the programme concerning eligibility, uptake of NHS Health Checks, and uptake of treatments offered through the programme. We estimate that the NHS Health Check programme prevents 390 (95% credible interval 290 to 500) premature deaths before 80 years of age and results in an additional 1,370 (95% credible interval 1,100 to 1,690) people being free of disease (ischaemic heart disease, stroke, dementia, and lung cancer) at age 80 years per million people aged 40-45 years at baseline. Over the life of the cohort (i.e., followed from 40-45 years to 100 years), the changes result in an additional 10,000 (95% credible interval 8,200 to 13,000) quality-adjusted life years (QALYs) and an additional 9,000 (6,900 to 11,300) years of life. This equates to approximately 300 fewer premature deaths and 1,000 more people living free of these diseases each year in England. We estimate that the current programme is increasing QALYs by 3.8 days (95% credible interval 3.0-4.7) per head of population and increasing survival by 3.3 days (2.5-4.1) per head of population over the 60 years of follow-up. The current programme has a greater absolute impact on health for those living in the most deprived areas compared to those living in the least deprived areas (4.4 [2.7-6.5] days of additional quality-adjusted life per head of population versus 2.8 [1.7-4.0] days; 5.1 [3.4-7.1] additional days lived per head of population versus 3.3 [2.1-4.5] days). Making feasible changes to the delivery of the existing programme could result in a sizable increase in the benefit. 
For example, a strategy that combines extending eligibility to those with preexisting hypertension, extending the upper age of eligibility to 79 years, increasing uptake of health checks by 30%, and increasing treatment rates 2.5-fold amongst eligible patients (i.e., 'maximum potential' scenario) results in at least a 3-fold increase in benefits compared to the current programme (1,360 premature deaths versus 390; 5,100 people free of 1 of the 4 diseases versus 1,370; 37,000 additional QALYs versus 10,000; 33,000 additional years of life versus 9,000). Ensuring those who are assessed and eligible for statins receive statins is a particularly important strategy to increase benefits. Estimates of overall benefit are based on current incidence and management, and future declines in disease incidence or improvements in treatment could alter the actual benefits observed in the long run. We have focused on the cardiovascular element of the NHS Health Check programme. Some important noncardiovascular health outcomes (e.g., chronic obstructive pulmonary disease [COPD] prevention from smoking cessation and cancer prevention from weight loss) and other parts of the programme (e.g., brief interventions to reduce harmful alcohol consumption) have not been modelled. Our model indicates that the current NHS Health Check programme is contributing to improvements in health and reducing health inequalities. Feasible changes in the organisation of the programme could result in more than a 3-fold increase in health benefits.

  19. A simple model for strong ground motions and response spectra

    USGS Publications Warehouse

    Safak, Erdal; Mueller, Charles; Boatwright, John

    1988-01-01

    A simple model for the description of strong ground motions is introduced. The model shows that response spectra can be estimated by using only four parameters of the ground motion, the RMS acceleration, effective duration and two corner frequencies that characterize the effective frequency band of the motion. The model is windowed band-limited white noise, and is developed by studying the properties of two functions, cumulative squared acceleration in the time domain, and cumulative squared amplitude spectrum in the frequency domain. Applying the methods of random vibration theory, the model leads to a simple analytical expression for the response spectra. The accuracy of the model is checked by using the ground motion recordings from the aftershock sequences of two different earthquakes and simulated accelerograms. The results show that the model gives a satisfactory estimate of the response spectra.
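
    As a rough illustration of the four-parameter idea (a hedged sketch, not the authors' exact formulation: the flat band-limited spectrum, the peak-factor expression, and all input values below are assumptions), a response spectrum can be approximated by computing the RMS response of a damped oscillator to band-limited white noise and scaling it with a random-vibration peak factor:

        import numpy as np

        def response_spectrum_rvt(a_rms, t_eff, f1, f2, f_osc, zeta=0.05):
            """Peak oscillator response estimated from the four ground-motion
            parameters (RMS acceleration, effective duration, two corner
            frequencies) via a simplified random-vibration peak factor."""
            # Flat one-sided acceleration PSD between the corner frequencies,
            # scaled so its integral equals the mean-square acceleration.
            s0 = a_rms**2 / (f2 - f1)
            f = np.linspace(0.01, 2.0 * f2, 4000)
            psd = np.where((f >= f1) & (f <= f2), s0, 0.0)
            sa = []
            for f0 in f_osc:
                r = f / f0
                # |H|^2 of a SDOF oscillator (absolute acceleration response)
                h2 = (1 + (2*zeta*r)**2) / ((1 - r**2)**2 + (2*zeta*r)**2)
                sigma = np.sqrt(np.trapz(h2 * psd, f))  # RMS oscillator response
                n = max(f0 * t_eff, 1.36)               # expected number of peaks
                pf = np.sqrt(2*np.log(n)) + 0.5772/np.sqrt(2*np.log(n))
                sa.append(pf * sigma)
            return np.array(sa)

        # Hypothetical inputs: 0.5 m/s^2 RMS, 10 s duration, 0.5-10 Hz band
        print(response_spectrum_rvt(0.5, 10.0, 0.5, 10.0, np.logspace(-0.5, 1.3, 20)))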

  20. Testing the Grossman model of medical spending determinants with macroeconomic panel data.

    PubMed

    Hartwig, Jochen; Sturm, Jan-Egbert

    2018-02-16

    Michael Grossman's human capital model of the demand for health has been argued to be one of the major achievements in theoretical health economics. Attempts to test this model empirically have been sparse, however, and have produced mixed results. These attempts have so far relied on micro data, mostly cross-sectional, from household surveys. For the first time in the literature, we bring in macroeconomic panel data for 29 OECD countries over the period 1970-2010 to test the model. To check the robustness of the results for the determinants of medical spending identified by the model, we include additional covariates in an extreme bounds analysis (EBA) framework. The preferred model specifications (including the robust covariates) do not lend much empirical support to the Grossman model. This is in line with the mixed results of earlier studies.

  1. [Proposal and preliminary validation of a check-list for the assessment of occupational exposure to repetitive movements of the upper limbs].

    PubMed

    Colombini, D; Occhipinti, E; Cairoli, S; Baracco, A

    2000-01-01

    Over the last few years the Authors developed and implemented a specific check-list for a "rapid" assessment of occupational exposure to repetitive movements and exertion of the upper limbs, after verifying the lack of such a tool that was also coherent with the latest data in the specialized literature. The check-list model and the relevant application procedures are presented and discussed. The check-list was applied by trained factory technicians in 46 different working tasks in which the OCRA method previously proposed by the Authors was also applied by independent observers. Since 46 pairs of observation data were available (OCRA index and check-list score), it was possible to verify, via parametric and nonparametric statistical tests, the level of association between the two variables and to find the best simple regression function (exponential in this case) of the OCRA index from the check-list score. By means of this function, which was highly significant (R2 = 0.98, p < 0.0001), the values of the check-list score that best corresponded to the critical values (for exposure assessment) of the OCRA index were identified. Correspondence values between the OCRA index and the check-list score were then established with a view to classifying exposure levels. The check-list "critical" scores were established considering the need to obtain, in borderline cases, a potential overestimation of the exposure level. On the basis of practical application experience and the preliminary validation results, recommendations are made and the caution needed in the use of the check-list is suggested.
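
    A minimal sketch of the regression step (the 46 real observation pairs are not reproduced here; the paired values below are invented solely to show the mechanics of fitting the exponential function and recovering R2):

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical paired observations (check-list score, OCRA index)
        score = np.array([5, 8, 11, 14, 17, 20, 24, 28, 32])
        ocra = np.array([1.1, 1.6, 2.3, 3.2, 4.4, 6.1, 9.0, 13.5, 19.8])

        def exp_model(x, a, b):
            # OCRA = a * exp(b * score), the exponential form the paper reports
            return a * np.exp(b * x)

        (a, b), _ = curve_fit(exp_model, score, ocra, p0=(1.0, 0.1))
        pred = exp_model(score, a, b)
        ss_res = np.sum((ocra - pred)**2)
        ss_tot = np.sum((ocra - ocra.mean())**2)
        print(f"a={a:.3f}, b={b:.3f}, R^2={1 - ss_res/ss_tot:.3f}")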

  2. Orbital Signature Analyzer (OSA): A spacecraft health/safety monitoring and analysis tool

    NASA Technical Reports Server (NTRS)

    Weaver, Steven; Degeorges, Charles; Bush, Joy; Shendock, Robert; Mandl, Daniel

    1993-01-01

    Fixed or static limit sensing is employed in control centers to ensure that spacecraft parameters remain within a nominal range. However, many critical parameters, such as power system telemetry, are time-varying and, as such, their 'nominal' range is necessarily time-varying as well. Predicted data, manual limits checking, and widened limit-checking ranges are often employed in an attempt to monitor these parameters without generating excessive limits violations. Generating predicted data and manual limits checking are both resource intensive, while broadening limit ranges for time-varying parameters is clearly inadequate to detect all but catastrophic problems. OSA provides a low-cost solution by using analytically selected data as a reference upon which to base its limits. These limits are always defined relative to the time-varying reference data, rather than as fixed upper and lower limits. In effect, OSA provides individual limits tailored to each value throughout all the data. A side benefit of using relative limits is that they automatically adjust to new reference data. In addition, OSA provides a wealth of analytical by-products in its execution.
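
    The core idea, limits defined relative to time-varying reference data rather than as fixed bounds, can be sketched as follows (a hedged illustration; the tolerance scheme and telemetry values are assumptions, not OSA's actual algorithm):

        import numpy as np

        def check_relative_limits(telemetry, reference, rel_tol=0.05, abs_tol=0.1):
            """Flag samples that deviate from the time-varying reference by
            more than a relative band, instead of fixed upper/lower limits."""
            band = np.maximum(rel_tol * np.abs(reference), abs_tol)
            return np.flatnonzero(np.abs(telemetry - reference) > band)

        # Hypothetical power-system telemetry against one-orbit reference data
        t = np.linspace(0, 2*np.pi, 200)
        reference = 5 + 3*np.sin(t)                     # nominal orbital signature
        telemetry = reference + np.random.normal(0, 0.05, t.size)
        telemetry[120] += 1.5                           # injected anomaly
        print(check_relative_limits(telemetry, reference))

    Note that the limits automatically track any new reference data, which is the side benefit the abstract describes.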

  3. Improving treatment plan evaluation with automation.

    PubMed

    Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M

    2016-11-08

    The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted with PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. © 2016 The Authors.
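
    A toy sketch of the automated-check idea (the field names and record layouts below are hypothetical; a real tool would query the TPS and TMS through their programming interfaces, as PCT does):

        # Hypothetical plan records pulled from each system
        tps_plan = {"patient_id": "12345", "total_dose_cGy": 6000,
                    "fractions": 30, "energy": "6X"}
        tms_plan = {"patient_id": "12345", "total_dose_cGy": 6000,
                    "fractions": 28, "energy": "6X"}

        def automated_checks(tps, tms, fields):
            """Compare plan parameters between the planning and management
            systems; return a checklist of (field, status) for the reviewer."""
            return [(f, "PASS" if tps.get(f) == tms.get(f) else "FAIL")
                    for f in fields]

        for field, status in automated_checks(tps_plan, tms_plan,
                ["patient_id", "total_dose_cGy", "fractions", "energy"]):
            print(f"{field:15s} {status}")   # flags the fraction mismatch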

  4. Automated survey of 8000 plan checks at eight facilities.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming; Bernard, Damian A; Chu, James C H; Kirk, Michael C; Hamilton, Russell J; Lei, Yu; Driewer, Joseph

    2016-09-01

    To identify policy- and system-related weaknesses in treatment planning and plan check workflows. The authors' web-deployed plan check automation solution, PlanCheck, which works with all major planning and record-and-verify systems (demonstrated here for mosaiq only), allows them to compute violation rates for a large number of plan checks across many facilities without requiring the manual data entry involved with incident filings. Workflows and failure modes are heavily influenced by the type of record-and-verify system used. Rather than tackle multiple record-and-verify systems at once, the authors restricted the present survey to mosaiq facilities. Violations were investigated by sending inquiries to physicists running the program. Frequent violations included inadequate tracking in the record-and-verify system of total and prescription doses. Infrequent violations included incorrect setting of patient orientation in the record-and-verify system. Peaks in the distribution, over facilities, of violation frequencies pointed to suboptimal policies at some of these facilities. Correspondence with physicists often revealed incomplete knowledge of settings at their facility necessary to perform thorough plan checks. The survey leads to the identification of specific and important policy and system deficiencies, including suboptimal timing of initial plan checks, lack of communication or agreement on conventions surrounding prescription definitions, and lack of automation in the transfer of some parameters.

  5. Double checking medicines: defence against error or contributory factor?

    PubMed

    Armitage, Gerry

    2008-08-01

    The double checking of medicines in health care is a contestable procedure. It occupies an obvious position in health care practice and is understood to be an effective defence against medication error but the process is variable and the outcomes have not been exposed to testing. This paper presents an appraisal of the process using data from part of a larger study on the contributory factors in medication errors and their reporting. Previous research studies are reviewed; data are analysed from a review of 991 drug error reports and a subsequent series of 40 in-depth interviews with health professionals in an acute hospital in northern England. The incident reports showed that errors occurred despite double checking but that action taken did not appear to investigate the checking process. Most interview participants (34) talked extensively about double checking but believed the process to be inconsistent. Four key categories were apparent: deference to authority, reduction of responsibility, automatic processing and lack of time. Solutions to the problems were also offered, which are discussed with several recommendations. Double checking medicines should be a selective and systematic procedure informed by key principles and encompassing certain behaviours. Psychological research may be instructive in reducing checking errors but the aviation industry may also have a part to play in increasing error wisdom and reducing risk.

  6. One method for life time estimation of a bucket wheel machine for coal moving

    NASA Astrophysics Data System (ADS)

    Vîlceanu, Fl; Iancu, C.

    2016-08-01

    Rehabilitation of outdated equipment whose lifetime has expired, or is in its final period of life, together with the high investment cost of replacement, makes it rational to try to extend equipment life. Rehabilitation involves checking operational safety, based on relevant expertise of the load-bearing metal structures, and assessing the residual lifetime. Bucket wheel machines for coal are basic machines within the coal deposits of power plants. The remaining life can be estimated by checking the loading on the most stressed subassembly by Finite Element Analysis of a welding detail. The paper presents, step by step, the method of calculus applied in order to establish the residual lifetime of a bucket wheel machine for coal moving, using non-destructive methods of study (fatigue cracking analysis + FEA). In order to establish the actual state of the machine and the areas subject to study, FEA of this mining equipment was performed on the geometric model of the analyzed mechanical structures, with powerful CAD/FEA programs. By applying the method, the residual lifetime can be calculated by extending the results from the most stressed area of the equipment to the entire machine, thus saving time and money on expensive replacements.
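
    The paper's step-by-step calculus is specific to the machine and its welding detail; as a generic stand-in, a Palmgren-Miner damage-accumulation estimate of residual life (a common fatigue bookkeeping rule, named explicitly here because the abstract does not state which rule the authors use) looks like this, with an invented load spectrum:

        def residual_life_years(stress_blocks, service_years):
            """Palmgren-Miner rule: damage D = sum(n_i / N_i), where n_i is
            cycles per year in stress block i and N_i is cycles to failure
            from the S-N curve of the welded detail."""
            damage_per_year = sum(n / N for n, N in stress_blocks)
            damage_to_date = damage_per_year * service_years
            if damage_to_date >= 1.0:
                return 0.0                      # design fatigue life exhausted
            return (1.0 - damage_to_date) / damage_per_year

        # Hypothetical spectrum for the most stressed welding detail
        blocks = [(2.0e5, 5.0e7), (5.0e4, 8.0e6), (1.0e3, 4.0e5)]
        print(f"residual life: {residual_life_years(blocks, 30):.1f} years")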

  7. Searching for memories, Sudoku, implicit check bits, and the iterative use of not-always-correct rapid neural computation.

    PubMed

    Hopfield, J J

    2008-05-01

    The algorithms that simple feedback neural circuits representing a brain area can rapidly carry out are often adequate to solve easy problems but for more difficult problems can return incorrect answers. A new excitatory-inhibitory circuit model of associative memory displays the common human problem of failing to rapidly find a memory when only a small clue is present. The memory model and a related computational network for solving Sudoku puzzles produce answers that contain implicit check bits in the representation of information across neurons, allowing a rapid evaluation of whether the putative answer is correct or incorrect through a computation related to visual pop-out. This fact may account for our strong psychological feeling of right or wrong when we retrieve a nominal memory from a minimal clue. This information allows more difficult computations or memory retrievals to be done in a serial fashion by using the fast but limited capabilities of a computational module multiple times. The mathematics of the excitatory-inhibitory circuits for associative memory and for Sudoku, both of which are understood in terms of energy or Lyapunov functions, is described in detail.

  8. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  9. Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.

    PubMed

    Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester

    2016-11-01

    Implementation of a locally developed evidence-based nursing shift handover blueprint with a bedside safety check to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside safety check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside safety check successfully identified discrepancies in drains, intravenous medications, bandages, and general condition, and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Social-cognitive determinants of the tick check: a cross-sectional study on self-protective behavior in combatting Lyme disease.

    PubMed

    van der Heijden, Amy; Mulder, Bob C; Poortvliet, P Marijn; van Vliet, Arnold J H

    2017-11-25

    Performing a tick check after visiting nature is considered the most important preventive measure to avoid contracting Lyme disease. Checking the body for ticks after visiting nature is the only measure that can establish with certainty whether one has been bitten by a tick, and it provides the opportunity to remove the tick as soon as possible, thereby greatly reducing the chance of contracting Lyme disease. However, compliance with performing the tick check is low. In addition, most previous studies on determinants of preventive measures to avoid Lyme disease lack a clear definition and/or operationalization of the term "preventive measures". Those that do distinguish multiple behaviors, including the tick check, fail to describe the systematic steps that should be followed in order to perform the tick check effectively. Hence, the purpose of this study was to identify determinants of systematically performing the tick check, based on social cognitive theory. A cross-sectional self-administered survey questionnaire was filled out online by 508 respondents (mean age = 51.7 years, SD = 16.0; 50.2% men; 86.4% daily or weekly nature visitors). Bivariate correlations and multivariate regression analyses were conducted to identify associations between socio-cognitive determinants (i.e. concepts related to humans' intrinsic and extrinsic motivation to perform certain behavior) and the tick check, and between socio-cognitive determinants and the proximal goal of doing the tick check. The full regression model explained 28% of the variance in doing the tick check. Results showed that performing the tick check was associated with proximal goal (β = .23, p < 0.01), self-efficacy (β = .22, p < 0.01), self-evaluative outcome expectations (β = .21, p < 0.01), descriptive norm (β = .16, p < 0.01), and experience (β = .13, p < 0.01). Our study is among the first to examine the determinants of systematic performance of the tick check, using an extended version of social cognitive theory to identify determinants. Based on the results, a number of practical recommendations can be made to promote the performance of the tick check.

  11. Time dependence of correlation functions following a quantum quench.

    PubMed

    Calabrese, Pasquale; Cardy, John

    2006-04-07

    We show that the time dependence of correlation functions in an extended quantum system in d dimensions, which is prepared in the ground state of some Hamiltonian and then evolves without dissipation according to some other Hamiltonian, may be extracted using methods of boundary critical phenomena in d + 1 dimensions. For d = 1 particularly powerful results are available using conformal field theory. These are checked against those available from solvable models. They may be explained in terms of a picture, valid more generally, whereby quasiparticles, entangled over regions of the order of the correlation length in the initial state, then propagate classically through the system.

  12. Surface disinfection by exposure to germicidal UV light.

    PubMed

    Katara, G; Hemvani, N; Chitnis, S; Chitnis, V; Chitnis, D S

    2008-01-01

    The present study aimed to design a simple model for checking the efficacy of a germicidal UV tube, to standardise the position, distance, and exposure time for the UV light, and to determine its efficacy against medically important bacteria, bacterial spores and fungi. The microbial cultures tested included gram-positive and gram-negative bacteria, bacterial spores and fungal spores. The microbes streaked on solid media were exposed to UV light. Inactivation of the order of four logs was observed for bacteria. UV light inactivated bacteria efficiently up to a distance of eight feet on either side, and an exposure time of 30 minutes was adequate.

  13. Optical near-field analysis of spherical metals: Application of the FDTD method combined with the ADE method.

    PubMed

    Yamaguchi, Takashi; Hinata, Takashi

    2007-09-03

    The time-average energy density of the optical near-field generated around a metallic sphere is computed using the finite-difference time-domain method. To check the accuracy, the numerical results are compared with the rigorous solutions given by Mie theory. The Lorentz-Drude model, which is coupled with Maxwell's equations via the motion equations of an electron, is applied to simulate the dispersion relation of metallic materials. The distributions of the optical near-field generated around a metallic hemisphere and a metallic spheroid are also computed, and strong optical near-fields are obtained at their rims.

  14. Heartbeat-based error diagnosis framework for distributed embedded systems

    NASA Astrophysics Data System (ADS)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

    Distributed Embedded Systems have significant applications in automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real time system. We use heartbeat monitoring, check pointing and model based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosis and shutting down of faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.

  15. Prediction Interval Development for Wind-Tunnel Balance Check-Loading

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.

    2014-01-01

    Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use is provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
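
    For a single calibration fit, the textbook prediction interval for a new check-load response can be sketched as below (the NASA method accounts for further variance components from check-load application; the calibration data here are synthetic):

        import numpy as np
        from scipy import stats

        def prediction_interval(X, y, x0, confidence=0.95):
            """Two-sided prediction interval for a new response at x0 from
            an ordinary least-squares calibration fit."""
            n, p = X.shape
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            s2 = resid @ resid / (n - p)                  # residual variance
            se = np.sqrt(s2 * (1.0 + x0 @ np.linalg.inv(X.T @ X) @ x0))
            t = stats.t.ppf(0.5 + confidence / 2.0, n - p)
            y0 = x0 @ beta
            return y0 - t * se, y0 + t * se

        # Hypothetical one-component balance calibration: load vs. bridge output
        rng = np.random.default_rng(1)
        load = np.linspace(0, 100, 25)
        output = 0.02 * load + rng.normal(0, 0.01, load.size)
        X = np.column_stack([np.ones_like(load), load])
        print(prediction_interval(X, output, np.array([1.0, 50.0])))

    A check-load is then confirmed if the measured response falls inside the interval at the stated confidence level.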

  16. User's guide for MAGIC-Meteorologic and hydrologic genscn (generate scenarios) input converter

    USGS Publications Warehouse

    Ortel, Terry W.; Martin, Angel

    2010-01-01

    Meteorologic and hydrologic data used in watershed modeling studies are collected by various agencies and organizations, and stored in various formats. Data may be in a raw, unprocessed format with little or no quality control, or may be checked for validity before being made available. Flood-simulation systems require data in near real-time so that adequate flood warnings can be made. Additionally, forecasted data are needed to operate flood-control structures to potentially mitigate flood damages. Because real-time data are of a provisional nature, missing data may need to be estimated for use in flood-simulation systems. The Meteorologic and Hydrologic GenScn (Generate Scenarios) Input Converter (MAGIC) can be used to convert data from selected formats into the Hydrologic Simulation System-Fortran hourly-observations format for input to a Watershed Data Management database, for use in hydrologic modeling studies. MAGIC also can reformat the data to the Full Equations model time-series format, for use in hydraulic modeling studies. Examples of the application of MAGIC for use in the flood-simulation system for Salt Creek in northeastern Illinois are presented in this report.

  17. A comparison/validation of a fractional derivative model with an empirical model of non-linear shock waves in swelling shales

    NASA Astrophysics Data System (ADS)

    Droghei, Riccardo; Salusti, Ettore

    2013-04-01

    Control of drilling parameters, such as fluid pressure, mud weight, and salt concentration, is essential to avoid instabilities when drilling through shale sections. To investigate shale deformation, fundamental for deep oil drilling and hydraulic fracturing for gas extraction ("fracking"), a non-linear model of mechanical and chemo-poroelastic interactions among fluid, solute and the solid matrix is here discussed. The two equations of this model describe the isothermal evolution of fluid pressure and solute density in a fluid-saturated porous rock. Their solutions are quick non-linear Burgers solitary waves, potentially destructive for deep operations. In such analysis the effect of diffusion, which can play a particular role in fracking, is investigated. Then, following Civan (1998), both diffusive and shock waves are applied to fine-particle filtration due to such quick transients, their effect on the adjacent rocks and the resulting time-delayed evolution. Notice how time delays in simple porous media dynamics have recently been analyzed using a fractional derivative approach. To make a tentative comparison of these two deeply different methods, in our model we insert fractional time derivatives, i.e., a kind of time-average of the fluid-rock interactions. The delaying effect of fine-particle filtration is then compared with the time delays of the fractional model. All this can be seen as an empirical check of these fractional models.

  18. Reconstruction of the sediment flow regime in a semi-arid Mediterranean catchment using check dam sediment information.

    NASA Astrophysics Data System (ADS)

    Bussi, G.; Rodríguez, X.; Francés, F.; Benito, G.; Sánchez-Moya, Y.; Sopeña, A.

    2012-04-01

    When using hydrological and sedimentological models, lack of historical records is often one of the main problems to face, since observed data are essential for model validation. If gauged data are poor or absent, a source of additional proxy data may be the slack-water deposits accumulated behind check dams. The aim of this work is to present the reconstruction of the recent hydrological and sediment yield regime of a semi-arid Mediterranean catchment (Rambla del Poyo, Spain, 184 square km) by coupling palaeoflood techniques with a distributed hydrological and sediment cycle model, using as proxy data the sandy slack-water deposits accumulated upstream of a small check dam (reservoir volume 2,500 cubic m) located in the headwater basin (drainage area 13 square km). The solid volume trapped in the reservoir was estimated using differential GPS data and an interpolation technique. Afterwards, the total solid volume was disaggregated into various layers (flood units) by means of a stratigraphical description of a depositional sequence in a 3.5 m trench cut across the reservoir sediment deposit, taking care to identify all flood units; the separation between flood units is indicated by a break in deposition. The sedimentary sequence shows evidence of 15 flood events that occurred after the dam construction (early 1990s). Not all events up to the present are included; for the most recent ones, the stream velocity and energy conditions for generating slack-water deposits were not fulfilled because the reservoir had filled. The volume of each flood unit was estimated under the hypothesis that layers have a simple pyramidal (or wedge) shape; every volume represents an estimate of the sediment trapped in the reservoir by the corresponding flood event. The obtained results were compared with the results of modeling a 20-year time series (1990-2009) with the distributed conceptual hydrological and sediment yield model TETIS-SED, in order to assign a date to every flood unit. The TETIS-SED model provides the sediment yield series divided into textural fractions (sand, silt and clay). In order to determine the amount of sediment trapped in the ponds, the trap efficiency of each check dam was computed using the STEP model (Sediment Trap Efficiency model for small Ponds; Verstraeten and Poesen, 2001). Sediment dry bulk density was calculated according to the Lane and Koelzer (1943) formulae. To improve the reliability of the flood reconstruction, distributed historical fire data were also used for dating carbon layers found in the depositional sequence. Finally, a date was assigned to every flood unit, corresponding to an extreme rainfall event; the result is a sediment volume series from 1990 to 2009, which may be very helpful for validating both hydrological and sediment yield models and can improve our understanding of erosion and sediment yield in this catchment.

  19. Quantitative assessment of locomotive syndrome by the loco-check questionnaire in older Japanese females

    PubMed Central

    Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi

    2017-01-01

    [Purpose] Locomotive syndrome (LS) is a condition by which older people may come to require care services because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community-dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that "yes" was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in the participants experiencing falling, fracture, and lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003

  1. 14 CFR 121.343 - Flight data recorders.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... recorder must be installed at the next heavy maintenance check after May 26, 1994, but no later than May 26, 1995. A heavy maintenance check is considered to be any time an aircraft is scheduled to be out of..., 1995, compliance date for all aircraft on that list. (3) After May 26, 1994, any aircraft that is...

  2. 14 CFR 121.343 - Flight data recorders.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... recorder must be installed at the next heavy maintenance check after May 26, 1994, but no later than May 26, 1995. A heavy maintenance check is considered to be any time an aircraft is scheduled to be out of..., 1995, compliance date for all aircraft on that list. (3) After May 26, 1994, any aircraft that is...

  3. 14 CFR 121.343 - Flight data recorders.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... recorder must be installed at the next heavy maintenance check after May 26, 1994, but no later than May 26, 1995. A heavy maintenance check is considered to be any time an aircraft is scheduled to be out of..., 1995, compliance date for all aircraft on that list. (3) After May 26, 1994, any aircraft that is...

  4. 14 CFR 121.343 - Flight data recorders.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... recorder must be installed at the next heavy maintenance check after May 26, 1994, but no later than May 26, 1995. A heavy maintenance check is considered to be any time an aircraft is scheduled to be out of..., 1995, compliance date for all aircraft on that list. (3) After May 26, 1994, any aircraft that is...

  5. 14 CFR 121.343 - Flight data recorders.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... recorder must be installed at the next heavy maintenance check after May 26, 1994, but no later than May 26, 1995. A heavy maintenance check is considered to be any time an aircraft is scheduled to be out of..., 1995, compliance date for all aircraft on that list. (3) After May 26, 1994, any aircraft that is...

  6. 41 CFR 301-51.200 - For what expenses may I receive a travel advance?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... personal check, or travelers check) Any time you are on official travel. (1) M&IE covered by the per diem... receive a travel advance? 301-51.200 Section 301-51.200 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES ARRANGING FOR TRAVEL SERVICES, PAYING...

  7. 41 CFR 301-51.200 - For what expenses may I receive a travel advance?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... personal check, or travelers check) Any time you are on official travel. (1) M&IE covered by the per diem... receive a travel advance? 301-51.200 Section 301-51.200 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES ARRANGING FOR TRAVEL SERVICES, PAYING...

  8. 41 CFR 301-51.200 - For what expenses may I receive a travel advance?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... personal check, or travelers check) Any time you are on official travel. (1) M&IE covered by the per diem... receive a travel advance? 301-51.200 Section 301-51.200 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES ARRANGING FOR TRAVEL SERVICES, PAYING...

  9. 41 CFR 301-51.200 - For what expenses may I receive a travel advance?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... personal check, or travelers check) Any time you are on official travel. (1) M&IE covered by the per diem... receive a travel advance? 301-51.200 Section 301-51.200 Public Contracts and Property Management Federal Travel Regulation System TEMPORARY DUTY (TDY) TRAVEL ALLOWANCES ARRANGING FOR TRAVEL SERVICES, PAYING...

  10. Validation of daily increments and a marine-entry check in the otoliths of sockeye salmon Oncorhynchus nerka post-smolts.

    PubMed

    Freshwater, C; Trudel, M; Beacham, T D; Neville, C-E; Tucker, S; Juanes, F

    2015-07-01

    Juvenile sockeye salmon Oncorhynchus nerka that were reared and smolted in laboratory conditions were found to produce otolith daily increments, as well as a consistently visible marine-entry check formed during their transition to salt water. Field-collected O. nerka post-smolts of an equivalent age also displayed visible checks; however, microchemistry estimates of marine-entry date using Sr:Ca ratios differed from visual estimates by c. 9 days, suggesting that microstructural and microchemical processes occur on different time scales. © 2015 The Fisheries Society of the British Isles.

  11. Low-Density Parity-Check Code Design Techniques to Simplify Encoding

    NASA Astrophysics Data System (ADS)

    Perez, J. M.; Andrews, K.

    2007-11-01

    This work describes a method for encoding low-density parity-check (LDPC) codes based on the accumulate-repeat-4-jagged-accumulate (AR4JA) scheme, using the low-density parity-check matrix H instead of the dense generator matrix G. The use of the H matrix to encode allows a significant reduction in memory consumption and provides the encoder design a great flexibility. Also described are new hardware-efficient codes, based on the same kind of protographs, which require less memory storage and area, allowing at the same time a reduction in the encoding delay.
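
    The AR4JA protograph structure itself is not reproduced here, but the underlying trick, computing parity bits directly from H = [H1 | H2] by solving H2 p = H1 m over GF(2), can be shown on a toy code (a sketch; real encoders exploit the sparsity and structure of H rather than generic elimination):

        import numpy as np

        def gf2_solve(A, b):
            """Solve A x = b over GF(2) by Gaussian elimination (A square)."""
            A, b = A.copy() % 2, b.copy() % 2
            n = A.shape[0]
            for col in range(n):
                pivot = next(r for r in range(col, n) if A[r, col])
                A[[col, pivot]] = A[[pivot, col]]
                b[[col, pivot]] = b[[pivot, col]]
                for r in range(n):
                    if r != col and A[r, col]:
                        A[r] ^= A[col]
                        b[r] ^= b[col]
            return b

        def encode_with_H(H, msg):
            """Systematic encoding from the parity-check matrix alone:
            split H = [H1 | H2], then parity p solves H2 p = H1 m mod 2."""
            m = H.shape[0]                    # number of parity bits
            H1, H2 = H[:, :-m], H[:, -m:]
            return np.concatenate([msg, gf2_solve(H2, H1 @ msg % 2)])

        # Toy (7,4) code as a stand-in for a real AR4JA protograph code
        H = np.array([[1,1,0,1,1,0,0],
                      [1,0,1,1,0,1,0],
                      [0,1,1,1,0,0,1]], dtype=np.uint8)
        cw = encode_with_H(H, np.array([1,0,1,1], dtype=np.uint8))
        print(cw, (H @ cw) % 2)               # syndrome should be all zeros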

  12. Design and scheduling for periodic concurrent error detection and recovery in processor arrays

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Chung, Pi-Yu; Fuchs, W. Kent

    1992-01-01

    Periodic application of time-redundant error checking provides the trade-off between error detection latency and performance degradation. The goal is to achieve high error coverage while satisfying performance requirements. We derive the optimal scheduling of checking patterns in order to uniformly distribute the available checking capability and maximize the error coverage. Synchronous buffering designs using data forwarding and dynamic reconfiguration are described. Efficient single-cycle diagnosis is implemented by error pattern analysis and direct-mapped recovery cache. A rollback recovery scheme using start-up control for local recovery is also presented.

  13. Mandatory Identification Bar Checks: How Bouncers Are Doing Their Job

    ERIC Educational Resources Information Center

    Monk-Turner, Elizabeth; Allen, John; Casten, John; Cowling, Catherine; Gray, Charles; Guhr, David; Hoofnagle, Kara; Huffman, Jessica; Mina, Moises; Moore, Brian

    2011-01-01

    The behavior of bouncers at on site establishments that served alcohol was observed. Our aim was to better understand how bouncers went about their job when the bar had a mandatory policy to check identification of all customers. Utilizing an ethnographic decision model, we found that bouncers were significantly more likely to card customers that…

  14. Enhancing Classroom Management Using the Classroom Check-up Consultation Model with In-Vivo Coaching and Goal Setting Components

    ERIC Educational Resources Information Center

    Kleinert, Whitney L.; Silva, Meghan R.; Codding, Robin S.; Feinberg, Adam B.; St. James, Paula S.

    2017-01-01

    Classroom management is essential to promote learning in schools, and as such it is imperative that teachers receive adequate support to maximize their competence implementing effective classroom management strategies. One way to improve teachers' classroom managerial competence is through consultation. The Classroom Check-Up (CCU) is a structured…

  15. 76 FR 18964 - Airworthiness Directives; Costruzioni Aeronautiche Tecnam srl Model P2006T Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-06

    ... Landing Gear retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on... condition for the specified products. The MCAI states: During Landing Gear retraction/extension ground... retraction/extension ground checks performed on the P2006T, a loose Seeger ring was found on the nose landing...

  16. 78 FR 69987 - Airworthiness Directives; Erickson Air-Crane Incorporated Helicopters (Type Certificate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... to require recurring checks of the Blade Inspection Method (BIM) indicator on each blade to determine whether the BIM indicator is signifying that the blade pressure may have been compromised by a blade crack... check procedures for BIM blades installed on the Model S-64E and S-64F helicopters. Several blade spars...

  17. Motivational Interviewing for Effective Classroom Management: The Classroom Check-Up. Practical Intervention in the Schools Series

    ERIC Educational Resources Information Center

    Reinke, Wendy M.; Herman, Keith C.; Sprick, Randy

    2011-01-01

    Highly accessible and user-friendly, this book focuses on helping K-12 teachers increase their use of classroom management strategies that work. It addresses motivational aspects of teacher consultation that are essential, yet often overlooked. The Classroom Check-Up is a step-by-step model for assessing teachers' organizational, instructional,…

  18. 75 FR 63045 - Airworthiness Directives; BAE SYSTEMS (OPERATIONS) LIMITED Model BAe 146 and Avro 146-RJ Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-14

    ... the fitting and wing structure. Checking the nuts with a suitable torque spanner to the specifications in the torque figures shown in Table 2. of the Accomplishment Instructions of BAE SYSTEMS (OPERATIONS... installed, and Doing either an ultrasonic inspection for damaged bolts or torque check of the tension bolts...

  19. 76 FR 13069 - Airworthiness Directives; BAE Systems (Operations) Limited Model ATP Airplanes; BAE Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    ..., an operator found an aileron trim tab hinge pin that had migrated sufficiently to cause a rubbing.... Recently, during a walk round check, an operator found an aileron trim tab hinge pin that had migrated... walk round check, an operator found an aileron trim tab hinge pin that had migrated sufficiently to...

  1. Chemotherapy Order Entry by a Clinical Support Pharmacy Technician in an Outpatient Medical Day Unit.

    PubMed

    Neville, Heather; Broadfield, Larry; Harding, Claudia; Heukshorst, Shelley; Sweetapple, Jennifer; Rolle, Megan

    2016-01-01

    Pharmacy technicians are expanding their scope of practice, often in partnership with pharmacists. In oncology, such a shift in responsibilities may lead to workflow efficiencies, but may also cause concerns about patient risk and medication errors. The primary objective was to compare the time spent on order entry and order-entry checking before and after training of a clinical support pharmacy technician (CSPT) to perform chemotherapy order entry. The secondary objectives were to document workflow interruptions and to assess medication errors. This before-and-after observational study investigated chemotherapy order entry for ambulatory oncology patients. Order entry was performed by pharmacists before the process change (phase 1) and by 1 CSPT after the change (phase 2); order-entry checking was performed by a pharmacist during both phases. The tasks were timed by an independent observer using a personal digital assistant. A convenience sample of 125 orders was targeted for each phase. Data were exported to Microsoft Excel software, and timing differences for each task were tested with an unpaired t test. Totals of 143 and 128 individual orders were timed for order entry during phase 1 (pharmacist) and phase 2 (CSPT), respectively. The mean total time to perform order entry was greater during phase 1 (1:37 min versus 1:20 min; p = 0.044). Totals of 144 and 122 individual orders were timed for order-entry checking (by a pharmacist) in phases 1 and 2, respectively, and there was no difference in mean total time for order-entry checking (1:21 min versus 1:20 min; p = 0.69). There were 33 interruptions not related to order entry (totalling 39:38 min) during phase 1 and 25 interruptions (totalling 30:08 min) during phase 2. Three errors were observed during order entry in phase 1 and one error during order-entry checking in phase 2; the errors were rated as having no effect on patient care. Chemotherapy order entry by a trained CSPT appeared to be just as safe and efficient as order entry by a pharmacist. Changes in pharmacy technicians' scope of practice could increase the amount of time available for pharmacists to provide direct patient care in the oncology setting.
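
    The timing comparison reduces to an unpaired t test on task durations; a sketch with synthetic data shaped like the study's (means of roughly 1:37 versus 1:20 with n = 143 and 128; the spreads are invented, and the Welch variant is used here rather than assuming equal variances):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Hypothetical order-entry durations in seconds
        pharmacist = rng.normal(97, 35, 143)   # phase 1, n = 143
        technician = rng.normal(80, 30, 128)   # phase 2, n = 128

        t_stat, p_value = stats.ttest_ind(pharmacist, technician, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")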

  2. Stochastic Local Search for Core Membership Checking in Hedonic Games

    NASA Astrophysics Data System (ADS)

    Keinänen, Helena

    Hedonic games have emerged as an important tool in economics and show promise as a useful formalism to model multi-agent coalition formation in AI as well as group formation in social networks. We consider a coNP-complete problem of core membership checking in hedonic coalition formation games. No previous algorithms to tackle the problem have been presented. In this work, we overcome this by developing two stochastic local search algorithms for core membership checking in hedonic games. We demonstrate the usefulness of the algorithms by showing experimentally that they find solutions efficiently, particularly for large agent societies.
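
    A sketch of the approach for additively separable preferences (an assumption made here for illustration; the paper's algorithms and game encodings may differ): hill-climb with restarts over candidate coalitions, looking for one whose members all strictly prefer it to their current coalitions.

        import random

        def score(S, v, u_part):
            """Members of S who strictly prefer S to their current coalition;
            a score of |S| means S blocks the partition."""
            return sum(1 for i in S
                       if sum(v[i][j] for j in S if j != i) > u_part[i])

        def sls_blocking_coalition(n, v, u_part, restarts=100, steps=500):
            """Stochastic local search for a blocking coalition. Finding one
            refutes core membership of the partition; not finding one is
            evidence, not proof, of core stability."""
            agents = list(range(n))
            for _ in range(restarts):
                S = set(random.sample(agents, random.randint(2, n)))
                for _ in range(steps):
                    if S and score(S, v, u_part) == len(S):
                        return sorted(S)            # blocking coalition found
                    i = random.choice(agents)       # candidate one-agent flip
                    T = S ^ {i}
                    if T and (score(T, v, u_part) >= score(S, v, u_part)
                              or random.random() < 0.1):  # noise escapes optima
                        S = T
            return None

        # Toy 4-agent game: mutual friends 0-1 and 2-3, currently split badly
        v = [[0, 5, -2, -2], [5, 0, -2, -2], [-2, -2, 0, 5], [-2, -2, 5, 0]]
        u_part = [-2, -2, -2, -2]   # utilities under partition {0,2},{1,3}
        print(sls_blocking_coalition(4, v, u_part))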

  3. Nonlinearities of heart rate variability in animal models of impaired cardiac control: contribution of different time scales.

    PubMed

    Silva, Luiz Eduardo Virgilio; Lataro, Renata Maria; Castania, Jaci Airton; Silva, Carlos Alberto Aguiar; Salgado, Helio Cesar; Fazan, Rubens; Porta, Alberto

    2017-08-01

    Heart rate variability (HRV) has been extensively explored by traditional linear approaches (e.g., spectral analysis); however, several studies have pointed to the presence of nonlinear features in HRV, suggesting that linear tools might fail to account for the complexity of the HRV dynamics. Even though the prevalent notion is that HRV is nonlinear, the actual presence of nonlinear features is rarely verified. In this study, the presence of nonlinear dynamics was checked as a function of time scales in three experimental models of rats with different impairment of the cardiac control: namely, rats with heart failure (HF), spontaneously hypertensive rats (SHRs), and sinoaortic denervated (SAD) rats. Multiscale entropy (MSE) and refined MSE (RMSE) were chosen as the discriminating statistic for the surrogate test utilized to detect nonlinearity. Nonlinear dynamics is less present in HF animals at both short and long time scales compared with controls. A similar finding was found in SHR only at short time scales. SAD increased the presence of nonlinear dynamics exclusively at short time scales. Those findings suggest that a working baroreflex contributes to linearize HRV and to reduce the likelihood to observe nonlinear components of the cardiac control at short time scales. In addition, an increased sympathetic modulation seems to be a source of nonlinear dynamics at long time scales. Testing nonlinear dynamics as a function of the time scales can provide a characterization of the cardiac control complementary to more traditional markers in time, frequency, and information domains. NEW & NOTEWORTHY Although heart rate variability (HRV) dynamics is widely assumed to be nonlinear, nonlinearity tests are rarely used to check this hypothesis. By adopting multiscale entropy (MSE) and refined MSE (RMSE) as the discriminating statistic for the nonlinearity test, we show that nonlinear dynamics varies with time scale and the type of cardiac dysfunction. Moreover, as complexity metrics and nonlinearities provide complementary information, we strongly recommend using the test for nonlinearity as an additional index to characterize HRV. Copyright © 2017 the American Physiological Society.
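
    The discriminating statistic can be sketched directly: coarse-grain the series at each scale and compute sample entropy (a naive O(n^2) implementation; the tolerance r and embedding dimension m follow the common 0.2*SD and m = 2 conventions, and the series below is white noise standing in for real R-R intervals):

        import numpy as np

        def sample_entropy(x, m=2, r_frac=0.2):
            """SampEn(m, r) of a 1-D series (naive pairwise version)."""
            r = r_frac * np.std(x)
            def count_matches(mm):
                templ = np.array([x[i:i+mm] for i in range(len(x) - mm)])
                d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
                return (np.sum(d <= r) - len(templ)) / 2  # drop self-matches
            B, A = count_matches(m), count_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        def multiscale_entropy(x, max_scale=10):
            """Coarse-grain by non-overlapping means at each scale, then
            compute sample entropy per scale, as in the MSE procedure."""
            mse = []
            for tau in range(1, max_scale + 1):
                n = len(x) // tau
                cg = x[:n * tau].reshape(n, tau).mean(axis=1)
                mse.append(sample_entropy(cg))
            return mse

        # Hypothetical R-R interval series (white-noise stand-in for HRV data)
        rr = np.random.default_rng(0).normal(0.8, 0.05, 1000)
        print(multiscale_entropy(rr, max_scale=5))

    The surrogate test then compares such a statistic between the original series and phase-randomized surrogates at each scale.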

  4. Robust check loss-based variable selection of high-dimensional single-index varying-coefficient model

    NASA Astrophysics Data System (ADS)

    Song, Yunquan; Lin, Lu; Jian, Ling

    2016-07-01

    Single-index varying-coefficient model is an important mathematical modeling method to model nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric components and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, due to the robustness of the check loss function to outliers in the finite samples, our proposed variable selection method is more robust than the ones based on the least squares criterion. Finally, the method is illustrated with numerical simulations.
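
    The robustness argument rests on the shape of the check (pinball) loss, which grows linearly rather than quadratically in the residual; a small sketch (the residual values are invented):

        import numpy as np

        def check_loss(residual, tau=0.5):
            """Koenker-Bassett check loss: rho_tau(u) = u * (tau - 1{u < 0}).
            tau = 0.5 gives half the absolute loss, hence the robustness to
            outliers; least squares would square the residual instead."""
            u = np.asarray(residual)
            return np.sum(u * (tau - (u < 0)))

        # Compare sensitivity to a single outlier
        clean = np.array([0.3, -0.5, 0.2, -0.1])
        dirty = np.append(clean, 25.0)
        print(check_loss(clean), check_loss(dirty))    # grows linearly
        print(np.sum(clean**2), np.sum(dirty**2))      # grows quadratically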

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakaguchi, Yuji, E-mail: nkgc2003@yahoo.co.jp; Ono, Takeshi; Onitsuka, Ryota

    COMPASS system (IBA Dosimetry, Schwarzenbruck, Germany) and ArcCHECK with 3DVH software (Sun Nuclear Corp., Melbourne, FL) are commercial quasi-3-dimensional (3D) dosimetry arrays. Cross-validation to compare them under the same conditions, such as a treatment plan, allows for clear evaluation of such measurement devices. In this study, we evaluated the accuracy of reconstructed dose distributions from the COMPASS system and ArcCHECK with 3DVH software using Monte Carlo simulation (MC) for multi-leaf collimator (MLC) test patterns and clinical VMAT plans. In a phantom study, ArcCHECK 3DVH showed clear differences from COMPASS, measurement and MC due to the detector resolution and the dose reconstruction method. In particular, ArcCHECK 3DVH showed a 7% difference from MC for the heterogeneous phantom. ArcCHECK 3DVH only corrects the 3D dose distribution of the treatment planning system (TPS) using the ArcCHECK measurement, and therefore the accuracy of ArcCHECK 3DVH depends on the TPS. In contrast, COMPASS showed good agreement with MC for all cases. However, the COMPASS system requires many complicated installation procedures, such as beam modeling, and appropriate commissioning is needed. In terms of clinical cases, there were no large differences between the QA devices. The accuracy of the COMPASS and ArcCHECK 3DVH systems for phantoms and clinical cases was compared. Both systems have advantages and disadvantages for clinical use, and consideration of the operating environment is important. The choice of QA system depends on the purpose and workflow of each hospital.

  6. The remote sensing of ocean primary productivity - Use of a new data compilation to test satellite algorithms

    NASA Technical Reports Server (NTRS)

    Balch, William; Evans, Robert; Brown, Jim; Feldman, Gene; Mcclain, Charles; Esaias, Wayne

    1992-01-01

    Global pigment and primary productivity algorithms based on a new data compilation of over 12,000 stations occupied mostly in the Northern Hemisphere, from the late 1950s to 1988, were tested. The results showed high variability of the fraction of total pigment contributed by chlorophyll, which is required for subsequent predictions of primary productivity. Two models, which predict pigment concentration normalized to an attenuation length of euphotic depth, were checked against 2,800 vertical profiles of pigments. Phaeopigments consistently showed maxima at about one optical depth below the chlorophyll maxima. CZCS data coincident with the sea truth data were also checked. A regression of satellite-derived pigment vs ship-derived pigment had a coefficient of determination. The satellite underestimated the true pigment concentration in mesotrophic and oligotrophic waters and overestimated the pigment concentration in eutrophic waters. The error in the satellite estimate showed no trends with time between 1978 and 1986.

  7. An Envelope Based Feedback Control System for Earthquake Early Warning: Reality Check Algorithm

    NASA Astrophysics Data System (ADS)

    Heaton, T. H.; Karakus, G.; Beck, J. L.

    2016-12-01

    Earthquake early warning systems are, in general, designed to be open-loop control systems, in the sense that the output, i.e., the warning messages, depends only on the input, i.e., recorded ground motions, up to the moment when the message is issued in real time. We propose an algorithm, called the Reality Check Algorithm (RCA), that assesses the accuracy of issued warning messages and feeds the outcome of the assessment back into the system, which then modifies its messages if necessary. That is, we propose to convert earthquake early warning systems into feedback control systems by integrating them with RCA. RCA works by continuously monitoring the observed ground-motion envelopes and comparing them to the envelopes predicted by the Virtual Seismologist (Cua 2005). The accuracy of the system's magnitude and location (both spatial and temporal) estimates is assessed separately by probabilistic classification models, which are trained by a Sparse Bayesian Learning technique called the Automatic Relevance Determination prior.

  8. Repeatability Modeling for Wind-Tunnel Measurements: Results for Three Langley Facilities

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Houlden, Heather P.

    2014-01-01

    Data from extensive check standard tests of seven measurement processes in three NASA Langley Research Center wind tunnels are statistically analyzed to test a simple model previously presented in 2000 for characterizing short-term, within-test and across-test repeatability. The analysis is intended to support process improvement and development of uncertainty models for the measurements. The analysis suggests that the repeatability can be estimated adequately as a function of only the test section dynamic pressure over a two-orders-of-magnitude dynamic pressure range. As expected for low instrument loading, short-term coefficient repeatability is determined by the resolution of the instrument alone (air off). However, as previously pointed out, for the highest dynamic pressure range the coefficient repeatability appears to be independent of dynamic pressure, thus setting a floor for the standard deviation for all three time frames. The simple repeatability model is shown to be adequate for all of the cases presented and for all three time frames.
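
    The abstract does not reproduce the repeatability model itself. One plausible two-term form consistent with its description (an instrument-resolution term that falls off with dynamic pressure q, plus a q-independent floor) is

        sigma_dC(q) = sqrt( (a/q)^2 + b^2 ),

    where a and b are fitting constants: a captures the fixed instrument resolution expressed in coefficient units, and b the floor observed at the highest dynamic pressures. This is an illustrative reconstruction, not necessarily the paper's exact formula.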

  9. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data are generally generated for each LSI product according to its specific process design. Scribe frame data are designed from definition tables for scanner alignment marks, wafer inspection marks, and customer-specified marks, and at the end of the design we verify that the scribe frame conforms to the alignment- and inspection-mark specifications. Recently, in COT (customer owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a long time. We therefore established a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), as used in device verification. We describe the scheme of this DRC-based scribe frame data verification. First, verification rules are created from the scanner, inspection, and other specifications, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, compared with conventional visual inspection of marks, the new method yields speed improvements of more than 12 percent, and that inspection time can be reduced to less than 5 percent if multi-CPU processing is used. The method delivers both short processing time and excellent accuracy when checking many marks, is easy to maintain, and provides an easy way for COT customers to use original marks. We believe that this DRC verification method for scribe frame data is indispensable and mutually beneficial.
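
    A toy sketch of the two-stage flow described above, under invented data structures: marks are axis-aligned rectangles (x, y, w, h), the mark library holds expected geometries, and the only design rule is a minimum spacing between marks. Real scribe frame verification runs inside a commercial DRC engine; none of the names or values below come from the paper.

    from itertools import combinations

    MARK_LIBRARY = {"ALIGN": (40, 40), "INSPECT": (20, 20)}  # expected (w, h)

    def pattern_match(mark):
        kind, (x, y, w, h) = mark
        return MARK_LIBRARY.get(kind) == (w, h)  # geometry matches library?

    def spacing_ok(a, b, min_space=10):
        (_, (ax, ay, aw, ah)), (_, (bx, by, bw, bh)) = a, b
        dx = max(bx - (ax + aw), ax - (bx + bw), 0)  # edge-to-edge gap in x
        dy = max(by - (ay + ah), ay - (by + bh), 0)  # edge-to-edge gap in y
        return max(dx, dy) >= min_space

    def verify(marks):
        errors = [f"unknown geometry: {m}" for m in marks if not pattern_match(m)]
        errors += [f"spacing violation: {a[0]}/{b[0]}"
                   for a, b in combinations(marks, 2) if not spacing_ok(a, b)]
        return errors

    marks = [("ALIGN", (0, 0, 40, 40)), ("INSPECT", (60, 0, 20, 20))]
    print(verify(marks))  # -> [] when geometry and spacing both pass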

  10. Scaling with System Size of the Lyapunov Exponents for the Hamiltonian Mean Field Model

    NASA Astrophysics Data System (ADS)

    Manos, Thanos; Ruffo, Stefano

    2011-12-01

    The Hamiltonian Mean Field model is a prototype for systems with long-range interactions. It describes the motion of N particles moving on a ring, coupled with an infinite-range potential. The model has a second-order phase transition at the energy density Uc = 3/4, and its dynamics is exactly described by the Vlasov equation in the N → ∞ limit. Its chaotic properties have been investigated in the past, but the determination of the scaling with N of the Lyapunov Spectrum (LS) of the model remains a challenging open problem. Here we show that the N^(-1/3) scaling of the Maximal Lyapunov Exponent (MLE), found in previous numerical and analytical studies, extends to the full LS; the scaling is "precocious" for the LS, meaning that it becomes manifest for a much smaller number of particles than is needed to check the scaling for the MLE. Besides that, the N^(-1/3) scaling appears to be valid not only for U > Uc, as suggested by theoretical approaches based on a random matrix approximation, but also below a threshold energy Ut ≈ 0.2. Using a recently proposed method (GALI), devised to rapidly check the chaotic or regular nature of an orbit, we find that Ut is also the energy at which a sharp transition from weak to strong chaos occurs in the phase space of the model. Around this energy the phase of the vector order parameter of the model becomes strongly time dependent, inducing a significant untrapping of particles from a nonlinear resonance.

  11. Experiences of General Practitioners and Practice Support Staff Using a Health and Lifestyle Screening App in Primary Health Care: Implementation Case Study

    PubMed Central

    Wadley, Greg; Sanci, Lena Amanda

    2018-01-01

    Background Technology-based screening of young people for mental health disorders and health compromising behaviors in general practice increases the disclosure of sensitive health issues and improves patient-centered care. However, few studies investigate how general practitioners (GPs) and practice support staff (receptionists and practice managers) integrate screening technology into their routine work, including the problems that arise and how the staff surmount them. Objective The aim of this study was to investigate the implementation of a health and lifestyle screening app, Check Up GP, for young people aged 14 to 25 years attending an Australian general practice. Methods We conducted an in-depth implementation case study of Check Up GP in one general practice clinic, with methodology informed by action research. Semistructured interviews and focus groups were conducted with GPs and support staff at the end of the implementation period. Data were thematically analyzed and mapped to normalization process theory constructs. We also analyzed the number of times we supported staff, the location where young people completed Check Up GP, and whether they felt they had sufficient privacy and received a text messaging (short message service, SMS) link at the time of taking their appointment. Results A total of 4 GPs and 10 support staff at the clinic participated in the study, with all except 3 receptionists participating in the final interviews and focus groups. During the 2-month implementation period, the technology and administration of Check Up GP was iterated through 4 major quality improvement cycles in response to the needs of the staff. This resulted in a reduction in the average time taken to complete Check Up GP from 14 min to 10 min, improved SMS text messaging for young people, and a more consistent description of the app by receptionists to young people. In the first weeks of implementation, researchers needed to regularly support staff with the app’s administration; however, this support decreased over time, even as usage rose slightly. The majority of young people (73/87, 84%) completed Check Up GP in the waiting room, with less than half (35/80, 44%) having received an SMS from the clinic with a link to the tool. Participating staff valued Check Up GP, particularly its facilitation of youth-friendly practice. However, there was at first a lack of organizational systems and capacity to implement the app and also initially a reliance on researchers to facilitate the process. Conclusions The implementation of a screening app in the dynamic and time-restricted general practice setting presents a range of technical and administrative challenges. Successful implementation of a screening app is possible but requires adequate time and intensive facilitation. More resources, external to staff, are needed to drive and support sustainable technology innovation and implementation in general practice settings. PMID:29691209

  12. Experiences of General Practitioners and Practice Support Staff Using a Health and Lifestyle Screening App in Primary Health Care: Implementation Case Study.

    PubMed

    Webb, Marianne Julie; Wadley, Greg; Sanci, Lena Amanda

    2018-04-24

    Technology-based screening of young people for mental health disorders and health compromising behaviors in general practice increases the disclosure of sensitive health issues and improves patient-centered care. However, few studies investigate how general practitioners (GPs) and practice support staff (receptionists and practice managers) integrate screening technology into their routine work, including the problems that arise and how the staff surmount them. The aim of this study was to investigate the implementation of a health and lifestyle screening app, Check Up GP, for young people aged 14 to 25 years attending an Australian general practice. We conducted an in-depth implementation case study of Check Up GP in one general practice clinic, with methodology informed by action research. Semistructured interviews and focus groups were conducted with GPs and support staff at the end of the implementation period. Data were thematically analyzed and mapped to normalization process theory constructs. We also analyzed the number of times we supported staff, the location where young people completed Check Up GP, and whether they felt they had sufficient privacy and received a text messaging (short message service, SMS) link at the time of taking their appointment. A total of 4 GPs and 10 support staff at the clinic participated in the study, with all except 3 receptionists participating in the final interviews and focus groups. During the 2-month implementation period, the technology and administration of Check Up GP was iterated through 4 major quality improvement cycles in response to the needs of the staff. This resulted in a reduction in the average time taken to complete Check Up GP from 14 min to 10 min, improved SMS text messaging for young people, and a more consistent description of the app by receptionists to young people. In the first weeks of implementation, researchers needed to regularly support staff with the app's administration; however, this support decreased over time, even as usage rose slightly. The majority of young people (73/87, 84%) completed Check Up GP in the waiting room, with less than half (35/80, 44%) having received an SMS from the clinic with a link to the tool. Participating staff valued Check Up GP, particularly its facilitation of youth-friendly practice. However, there was at first a lack of organizational systems and capacity to implement the app and also initially a reliance on researchers to facilitate the process. The implementation of a screening app in the dynamic and time-restricted general practice setting presents a range of technical and administrative challenges. Successful implementation of a screening app is possible but requires adequate time and intensive facilitation. More resources, external to staff, are needed to drive and support sustainable technology innovation and implementation in general practice settings. ©Marianne Julie Webb, Greg Wadley, Lena Amanda Sanci. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 24.04.2018.

  13. Transient finite element analysis of electric double layer using Nernst-Planck-Poisson equations with a modified Stern layer.

    PubMed

    Lim, Jongil; Whitcomb, John; Boyd, James; Varghese, Julian

    2007-01-01

    A finite element implementation of the transient nonlinear Nernst-Planck-Poisson (NPP) and Nernst-Planck-Poisson-modified Stern (NPPMS) models is presented. The NPPMS model uses multipoint constraints to account for finite ion size, resulting in realistic ion concentrations even at high surface potential. The Poisson-Boltzmann equation is used to provide a limited check of the transient models for low surface potential and dilute bulk solutions. The effects of the surface potential and bulk molarity on the electric potential and ion concentrations as functions of space and time are studied. The ability of the models to predict realistic energy storage capacity is investigated. The predicted energy is much more sensitive to surface potential than to bulk solution molarity.
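
    For reference, the transient Nernst-Planck-Poisson system solved here takes the standard form (written generically; the multipoint-constraint treatment of the modified Stern layer is not shown):

        ∂c_i/∂t = ∇·( D_i ∇c_i + (z_i F D_i / (R T)) c_i ∇φ )
        -∇·( ε ∇φ ) = F Σ_i z_i c_i,

    where c_i, D_i and z_i are the concentration, diffusivity and valence of ion species i, φ is the electric potential, F the Faraday constant, R the gas constant, T the temperature and ε the permittivity.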

  14. Finite element implementation of state variable-based viscoplasticity models

    NASA Technical Reports Server (NTRS)

    Iskovitz, I.; Chang, T. Y. P.; Saleeb, A. F.

    1991-01-01

    The implementation of state variable-based viscoplasticity models is made in a general purpose finite element code for structural applications of metals deformed at elevated temperatures. Two constitutive models, Walker's and Robinson's, are studied in conjunction with two implicit integration methods: the trapezoidal rule with Newton-Raphson iterations and an asymptotic integration algorithm. A comparison is made between the two integration methods, and the latter appears to be computationally more appealing in terms of numerical accuracy and CPU time. However, to make the asymptotic algorithm robust, it is necessary to include a self-adaptive scheme with subincremental step control and error checking of the Jacobian matrix at the integration points. Three examples are given to illustrate the numerical aspects of the integration methods tested.
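
    A minimal sketch of the first integrator mentioned above: the implicit trapezoidal rule with Newton-Raphson iterations, applied to a generic stiff scalar rate equation de/dt = f(e). The rate function is an invented stand-in, not Walker's or Robinson's constitutive model.

    def f(e):             # toy stiff rate law, not a real constitutive model
        return -50.0 * (e - 0.01)

    def dfde(e, h=1e-7):  # numerical derivative of f
        return (f(e + h) - f(e - h)) / (2.0 * h)

    def trapezoidal_step(e_n, dt, tol=1e-10, max_iter=20):
        """One implicit step: solve r(e) = e - e_n - dt/2*(f(e_n) + f(e)) = 0
        by Newton-Raphson iterations."""
        e = e_n
        for _ in range(max_iter):
            r = e - e_n - 0.5 * dt * (f(e_n) + f(e))
            if abs(r) < tol:
                break
            e -= r / (1.0 - 0.5 * dt * dfde(e))
        return e

    e, dt = 0.0, 0.01
    for _ in range(100):
        e = trapezoidal_step(e, dt)
    print(round(e, 6))  # converges toward the equilibrium value 0.01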

  15. The friable sponge model of a cometary nucleus

    NASA Technical Reports Server (NTRS)

    Horanyi, M.; Gombosi, T. I.; Korosmezey, A.; Kecskemety, K.; Szego, K.; Cravens, T. E.; Nagy, A. F.

    1984-01-01

    The mantle/core model of cometary nuclei, first suggested by Whipple and subsequently developed by Mendis and Brin, is modified and extended. New terms are added to the heat conduction equation for the mantle, which is solved in order to obtain the temperature distribution in the mantle and the gas production rate as a function of mantle thickness and heliocentric distance. These results are then combined with some specific assumptions about the mantle structure (the friable sponge model) in order to make predictions for the variation of gas production rate and mantle thickness as functions of heliocentric distance for different comets. A solution of the time-dependent heat conduction equation is presented in order to check some of the assumptions.
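
    The abstract does not list the new terms added to the mantle heat-conduction equation. Generically, the one-dimensional time-dependent equation being solved takes the form

        ρ c ∂T/∂t = ∂/∂x( k ∂T/∂x ) + S(x, t),

    where ρ, c and k are the mantle density, specific heat and thermal conductivity, and S(x, t) collects source and sink terms (e.g., sublimation and radiative exchange). The authors' additional terms would enter through S and the boundary conditions; this generic form is given only for orientation.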

  16. PKIX Certificate Status in Hybrid MANETs

    NASA Astrophysics Data System (ADS)

    Muñoz, Jose L.; Esparza, Oscar; Gañán, Carlos; Parra-Arnau, Javier

    Certificate status validation is a hard problem in general, but it is particularly complex in Mobile Ad-hoc Networks (MANETs), because solutions must manage both the lack of fixed infrastructure inside the MANET and the possible absence of connectivity to trusted authorities when certificate validation has to be performed. Certificate acquisition is usually assumed to be an initialization phase, but certificate validation is a critical operation, since a node needs to check the validity of certificates in real time, that is, when a particular certificate is about to be used. In such MANET environments, the node may be placed in a part of the network that is disconnected from the source of status data at the moment the status check is required. Proposals in the literature suggest caching mechanisms so that the node itself or a neighbour node holds some status-checking material (typically on-line status responses or lists of revoked certificates). However, to the best of our knowledge, the only criterion used to evaluate the cached (possibly obsolete) material is time. In this paper, we analyse how to deploy a certificate status checking PKI service for hybrid MANETs, and we propose a new criterion, based on risk, to evaluate cached status data; this criterion is more appropriate than time alone because it takes the revocation process into account.
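
    As a toy illustration of a risk-based criterion (not the paper's actual model), one can treat revocations as a Poisson process and accept cached status data only while the probability that a revocation has occurred since it was fetched stays within a budget. The rate and budget below are invented figures.

    import math

    def stale_risk(age_hours, revocation_rate_per_hour=1e-4):
        """Probability that at least one revocation occurred since the cached
        status data was fetched, under a Poisson revocation process."""
        return 1.0 - math.exp(-revocation_rate_per_hour * age_hours)

    def accept_cached(age_hours, risk_budget=0.01):
        return stale_risk(age_hours) <= risk_budget

    print(accept_cached(24))    # -> True  (risk ~ 0.0024 under these numbers)
    print(accept_cached(2400))  # -> False (risk ~ 0.21)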

  17. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer so that corrective actions can be taken when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis would therefore be a useful threshold for detecting an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values for the subsequent 2 years with the actual data for 2011 and 2012. For the modeling, periodicities were first checked using the Fast Fourier Transform, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of the minimum Akaike information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of wholly or partially condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from three producers with the highest rates of condemnation due to mycobacteriosis.
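
    A sketch of the modeling pipeline described above (FFT periodicity check, weekly ensemble average, ARIMA on the residual), reproduced on synthetic data. The fixed ARIMA order (1, 0, 1) is a placeholder, whereas the paper selects orders by minimum AIC; statsmodels is assumed to be available.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    days = 8 * 365
    counts = 5 + 2 * np.sin(2 * np.pi * np.arange(days) / 7) + rng.normal(0, 1, days)

    # 1) weekly periodicity via FFT (peak near 1/7 cycles per day)
    freqs = np.fft.rfftfreq(days, d=1.0)
    power = np.abs(np.fft.rfft(counts - counts.mean()))
    print("dominant period [days]:", round(1.0 / freqs[power.argmax()], 2))

    # 2) weekly ensemble average profile and residual
    weekly = np.array([counts[d::7].mean() for d in range(7)])
    residual = counts - weekly[np.arange(days) % 7]

    # 3) ARIMA on the residual; expected value = forecast + weekly profile
    fit = ARIMA(residual, order=(1, 0, 1)).fit()
    expected = fit.forecast(steps=14) + weekly[np.arange(days, days + 14) % 7]
    print(expected.round(2))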

  18. Analysis and stochastic modelling of Intensity-Duration-Frequency relationship from 88 years of 10 min rainfall data in North Spain

    NASA Astrophysics Data System (ADS)

    Delgado, Oihane; Campo-Bescós, Miguel A.; López, J. Javier

    2017-04-01

    When solving certain hydrological engineering problems, it is often necessary to know rainfall intensity values associated with a specific probability or return period, T. From analyses of extreme rainfall events at different aggregation time scales, we can derive the Intensity-Duration-Frequency (IDF) relationships that are widely used in hydraulic infrastructure design. However, the lack of long rainfall-intensity time series at small time steps (minutes or hours) forces the use of mathematical expressions to characterize and extend these curves. One way to derive them is from synthetic rainfall time series generated by stochastic models, which is the approach evaluated in this work. Starting from rainfall accumulations recorded every 10 min at the Igueldo pluviograph (San Sebastian, Spain) for the period 1927-2005, the homogeneity of the series was checked and possible statistically significant increasing or decreasing trends were examined. Subsequently, two models were calibrated: the Bartlett-Lewis model and a Markov chain model, both based on sequences of storms composed of series of rainfall events separated by short time intervals. Finally, synthetic ten-minute rainfall time series were generated, which allow detailed IDF curves to be estimated and compared with the IDF curves estimated from the recorded data.
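
    A minimal sketch of one family of models mentioned above: a first-order two-state (wet/dry) Markov chain rainfall generator at a 10-minute step, with exponentially distributed wet-interval depths. Transition probabilities and the mean depth are invented for illustration; the paper calibrates its models to the Igueldo record.

    import numpy as np

    rng = np.random.default_rng(1)
    P_WET = {0: 0.02, 1: 0.60}   # P(wet at t+1 | state at t): dry->wet, wet->wet
    MEAN_DEPTH_MM = 0.4          # mean rainfall per wet 10-min interval

    def simulate(n_steps, state=0):
        depths = np.zeros(n_steps)
        for t in range(n_steps):
            state = int(rng.random() < P_WET[state])
            if state:
                depths[t] = rng.exponential(MEAN_DEPTH_MM)
        return depths

    series = simulate(6 * 24 * 365)  # one year of synthetic 10-min values
    print("annual total [mm]:", round(series.sum(), 1))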

  19. Firearm Acquisition Without Background Checks: Results of a National Survey.

    PubMed

    Miller, Matthew; Hepburn, Lisa; Azrael, Deborah

    2017-02-21

    In 1994, 40% of U.S. gun owners who had recently acquired a firearm did so without a background check. No contemporary estimates exist. To estimate the proportion of current U.S. gun owners who acquired their most recent firearm without a background check, by time since and manner of acquisition, for the nation as a whole and separately in states with and without legislation regulating private sales. Probability-based online survey. United States, 2015. 1613 adult gun owners. Current gun owners were asked where and when they acquired their last firearm; if they purchased the firearm; and whether, as part of that acquisition, they had a background check (or were asked to show a firearm license or permit). 22% (95% CI, 16% to 27%) of gun owners who reported obtaining their most recent firearm within the previous 2 years reported doing so without a background check. For firearms purchased privately within the previous 2 years (that is, other than from a store or pawnshop, including sales between individuals in person, online, or at gun shows), 50% (CI, 35% to 65%) were obtained without a background check. This percentage was 26% (CI, 5% to 47%) for owners residing in states regulating private firearm sales and 57% (CI, 40% to 75%) for those living in states without regulations on private firearm sales. Potential inaccuracies due to recall and social desirability bias. 22% of current U.S. gun owners who acquired a firearm within the past 2 years did so without a background check. Although this represents a smaller proportion of gun owners obtaining firearms without background checks than in the past, millions of U.S. adults continue to acquire guns without background checks, especially in states that do not regulate private firearm sales. Fund for a Safer Future and the Joyce Foundation.

  20. Population pharmacokinetic modeling and simulation of huperzine A in elderly Chinese subjects

    PubMed Central

    Sheng, Lei; Qu, Yi; Yan, Jing; Liu, Gang-yi; Wang, Wei-liang; Wang, Yi-jun; Wang, Hong-yi; Zhang, Meng-qi; Lu, Chuan; Liu, Yun; Jia, Jing-yin; Hu, Chao-ying; Li, Xue-ning; Yu, Chen; Xu, Hong-rong

    2016-01-01

    Aim: Our preliminary results show that huperzine A, an acetylcholinesterase inhibitor used to treat Alzheimer's disease (AD) patients in China, exhibits different pharmacokinetic features in elderly and young healthy subjects. However, its pharmacokinetic data in elderly subjects remain unavailable to date. Thus, we developed a population pharmacokinetic (PPK) model of huperzine A in elderly Chinese people, and identified the covariates affecting its pharmacokinetics for optimal individual administration. Methods: A total of 341 serum huperzine A concentration records were obtained from 2 completed clinical trials (14 elderly healthy subjects in a phase I pharmacokinetic study; 35 elderly AD patients in a phase II study). Population pharmacokinetic analysis was performed using the non-linear mixed-effect modeling software Phoenix NLME 1.1.1. The effects of age, gender, body weight, height, creatinine, endogenous creatinine clearance rate, as well as drugs administered concomitantly, were analyzed. Bootstrap and visual predictive checks were used simultaneously to validate the final population pharmacokinetic models. Results: The plasma concentration-time profile of huperzine A was best described by a one-compartment model with first-order absorption and elimination. Age was identified as the covariate having a significant influence on huperzine A clearance. The final PPK model of huperzine A was: CL (L/h) = 2.4649 × (age/86)^(-3.3856), Ka = 0.6750 h^(-1), V (L) = 104.216. The final PPK model was demonstrated to be suitable and effective by the bootstrap and visual predictive checks. Conclusion: A PPK model of huperzine A in elderly Chinese subjects is established, which can be used to predict PPK parameters of huperzine A in the treatment of elderly AD patients. PMID:27180987
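
    Plugging ages into the reported covariate relationship shows how steeply the model predicts clearance to fall with age (a simple evaluation of the published equation, nothing more):

    def clearance_L_per_h(age_years):
        """Reported final model: CL = 2.4649 * (age/86)^(-3.3856)."""
        return 2.4649 * (age_years / 86.0) ** (-3.3856)

    for age in (70, 86, 95):
        print(age, round(clearance_L_per_h(age), 2))  # ~4.95, 2.46, 1.76 L/h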

  1. Model Checking Real Time Java Using Java PathFinder

    NASA Technical Reports Server (NTRS)

    Lindstrom, Gary; Mehlitz, Peter C.; Visser, Willem

    2005-01-01

    The Real Time Specification for Java (RTSJ) is an augmentation of Java for real time applications of various degrees of hardness. The central features of RTSJ are real time threads; user-defined schedulers; asynchronous events, handlers, and control transfers; a priority-inheritance-based default scheduler; non-heap memory areas such as immortal and scoped; and non-heap real time threads whose execution is not impeded by garbage collection. The Robust Software Systems group at NASA Ames Research Center has JAVA PATHFINDER (JPF) under development, a Java model checker. JPF at its core is a state-exploring JVM which can examine alternative paths in a Java program (e.g., via backtracking) by trying all nondeterministic choices, including thread scheduling order. This paper describes our implementation of an RTSJ profile (subset) in JPF, including requirements, design decisions, and current implementation status. Two examples are analyzed: jobs on a multiprogramming operating system, and a complex resource contention example involving autonomous vehicles crossing an intersection. The utility of JPF in finding logic and timing errors is illustrated, and the remaining challenges in supporting all of RTSJ are assessed.

  2. [Design and implementation of data checking system for Chinese materia medica resources survey].

    PubMed

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To support data rechecking, reduce the internal workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica designed a data checking system for the Chinese materia medica resources survey based on J2EE technology, the Java language, and an Oracle database, in accordance with an SOA framework. It provides single-record checks, check scoring, content management, and both manual and automatic checking of survey data covering nine aspects, 20 classes, and 175 indicators (census implementation plans, key research information, general survey information, medicinal material cultivation information, germplasm resources information, medicine information, market research information, traditional knowledge information, and specimen information) with respect to both quantity and quality. The established system helps ensure data consistency and accuracy and prompts county survey teams to complete data entry and arrangement in a timely manner, so as to improve the integrity, consistency, and accuracy of the survey data and ensure that the data are valid and usable, laying a foundation for accurate data support of the national CMMR survey results summary, display, and sharing. Copyright© by the Chinese Pharmaceutical Association.

  3. Improving treatment plan evaluation with automation

    PubMed Central

    Covington, Elizabeth L.; Chen, Xiaoping; Younge, Kelly C.; Lee, Choonik; Matuszak, Martha M.; Kessler, Marc L.; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M.; Filpansick, Stephanie E.

    2016-01-01

    The goal of this work is to evaluate the effectiveness of the Plan‐Checker Tool (PCT), which was created to improve first‐time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with the data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.‐x, 87.55.N‐, 87.55.Qr, 87.55.tm, 89.20.Bb PMID:27929478
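
    A toy version of such an automated plan-readiness check: compare fields exported from the TPS against the TMS and report mismatches. Field names and tolerances are invented; the real PCT queries both systems through an API.

    TOLERANCES = {"dose_per_fraction_Gy": 0.01, "n_fractions": 0}

    def check_plan(tps_record, tms_record):
        """Return the list of fields where TPS and TMS disagree beyond tolerance."""
        failures = []
        for field, tol in TOLERANCES.items():
            if abs(tps_record[field] - tms_record[field]) > tol:
                failures.append(field)
        return failures

    tps = {"dose_per_fraction_Gy": 2.00, "n_fractions": 30}
    tms = {"dose_per_fraction_Gy": 2.00, "n_fractions": 25}
    print(check_plan(tps, tms))  # -> ['n_fractions']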

  4. 40 CFR 60.3041 - What is the minimum amount of monitoring data I must collect with my continuous emission...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before.... An operating day is any day the unit combusts any municipal or institutional solid waste. (d) If you... malfunction or when repairs, calibration checks, or zero and span checks keep you from collecting the minimum...

  5. 40 CFR 60.3041 - What is the minimum amount of monitoring data I must collect with my continuous emission...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before.... An operating day is any day the unit combusts any municipal or institutional solid waste. (d) If you... malfunction or when repairs, calibration checks, or zero and span checks keep you from collecting the minimum...

  6. 40 CFR 60.3041 - What is the minimum amount of monitoring data I must collect with my continuous emission...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before.... An operating day is any day the unit combusts any municipal or institutional solid waste. (d) If you... malfunction or when repairs, calibration checks, or zero and span checks keep you from collecting the minimum...

  7. Enhanced and updated American Heart Association heart-check front-of-package symbol: efforts to help consumers identify healthier food choices

    USDA-ARS?s Scientific Manuscript database

    A variety of nutrition symbols and rating systems are in use on the front of food packages. They are intended to help consumers make healthier food choices. One system, the American Heart Association Heart (AHA) Heart-Check Program, has evolved over time to incorporate current science-based recommen...

  8. The instant sequencing task: Toward constraint-checking a complex spacecraft command sequence interactively

    NASA Technical Reports Server (NTRS)

    Horvath, Joan C.; Alkalaj, Leon J.; Schneider, Karl M.; Amador, Arthur V.; Spitale, Joseph N.

    1993-01-01

    Robotic spacecraft are controlled by sets of commands called 'sequences.' These sequences must be checked against mission constraints. Making our existing constraint checking program faster would enable new capabilities in our uplink process. Therefore, we are rewriting this program to run on a parallel computer. To do so, we had to determine how to run constraint-checking algorithms in parallel and create a new method of specifying spacecraft models and constraints. This new specification gives us a means of representing flight systems and their predicted response to commands which could be used in a variety of applications throughout the command process, particularly during anomaly or high-activity operations. This commonality could reduce operations cost and risk for future complex missions. Lessons learned in applying some parts of this system to the TOPEX/Poseidon mission will be described.

  9. The KATE shell: An implementation of model-based control, monitor and diagnosis

    NASA Technical Reports Server (NTRS)

    Cornell, Matthew

    1987-01-01

    The conventional control and monitor software currently used by the Space Center for Space Shuttle processing has many limitations, such as high maintenance costs and limited diagnostic and simulation capabilities. These limitations motivated the development of a knowledge-based (or model-based) shell to generically control and monitor electro-mechanical systems. The knowledge base describes the system's structure and function and is used by a software shell to do real-time constraint checking, low-level control of components, diagnosis of detected faults, sensor validation, automatic generation of schematic diagrams, and automatic recovery from failures. This approach is more versatile and more powerful than the conventional hard-coded approach and offers many advantages over it, although, for systems that require high-speed reaction times or are not well understood, knowledge-based control and monitor systems may not be appropriate.

  10. Assessment of hemoglobin responsiveness to epoetin alfa in patients on hemodialysis using a population pharmacokinetic pharmacodynamic model.

    PubMed

    Wu, Liviawati; Mould, Diane R; Perez Ruixo, Juan Jose; Doshi, Sameer

    2015-10-01

    A population pharmacokinetic pharmacodynamic (PK/PD) model describing the effect of epoetin alfa on hemoglobin (Hb) response in hemodialysis patients was developed. Epoetin alfa pharmacokinetics was described using a linear 2-compartment model. PK parameter estimates were similar to previously reported values. A maturation-structured cytokinetic model consisting of 5 compartments linked in a catenary fashion by first-order cell transfer rates following a zero-order input process described the Hb time course. The PD model described 2 subpopulations, one whose Hb response reflected epoetin alfa dosing and a second whose response was unrelated to epoetin alfa dosing. Parameter estimates from the PK/PD model were physiologically reasonable and consistent with published reports. Numerical and visual predictive checks using data from 2 studies were performed. The PK and PD of epoetin alfa were well described by the model. © 2015, The American College of Clinical Pharmacology.
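
    For reference, a linear two-compartment model of the kind mentioned is conventionally written as

        dA1/dt = In(t) − (CL/V1 + Q/V1)·A1 + (Q/V2)·A2
        dA2/dt = (Q/V1)·A1 − (Q/V2)·A2,        C = A1/V1,

    where A1 and A2 are drug amounts in the central and peripheral compartments, In(t) is the input (dosing) rate, CL the clearance, Q the intercompartmental clearance, and V1, V2 the compartment volumes. This is the generic form only; the paper's parameter values and its maturation-structured cytokinetic PD chain are not reproduced here.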

  11. Evaluating and Improving a Learning Trajectory for Linear Measurement in Elementary Grades 2 and 3: A Longitudinal Study

    ERIC Educational Resources Information Center

    Barrett, Jeffrey E.; Sarama, Julie; Clements, Douglas H.; Cullen, Craig; McCool, Jenni; Witkowski-Rumsey, Chepina; Klanderman, David

    2012-01-01

    We examined children's development of strategic and conceptual knowledge for linear measurement. We conducted teaching experiments with eight students in grades 2 and 3, based on our hypothetical learning trajectory for length to check its coherence and to strengthen the domain-specific model for learning and teaching. We checked the hierarchical…

  12. 75 FR 39185 - Airworthiness Directives; The Boeing Company Model 747-100, 747-100B, 747-100B SUD, 747-200B, 747...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-08

    ... and torque checks of the hanger fittings and strut forward bulkhead of the forward engine mount and... requires repetitive inspections and torque checks of the hanger fittings and strut forward bulkhead of the... corrective actions are replacing the fasteners; removing loose fasteners; tightening all Group A...

  13. Clients' reasons for prenatal ultrasonography in Ibadan, South West of Nigeria

    PubMed Central

    Enakpene, Christopher A; Morhason-Bello, Imran O; Marinho, Anthony O; Adedokun, Babatunde O; Kalejaiye, Adegoke O; Sogo, Kayode; Gbadamosi, Sikiru A; Awoyinka, Babatunde S; Enabor, Obehi O

    2009-01-01

    Background Prenatal ultrasonography has remained a universal tool, but little is known, especially in developing countries, about clients' reasons for desiring it. The aim was to determine why pregnant women desire a prenatal ultrasound. Methods It was a cross-sectional survey of 222 consecutive women at 2 different ultrasonography facilities in Ibadan, South-west Nigeria. Results The mean age of the respondents was 30.1 ± 4.5 years. The commonest reason for requesting prenatal ultrasound scans was to check for fetal viability, in 144 women (64.7% of the respondents), followed by fetal gender determination, in 50 women (22.6%). Other reasons were to check the number of fetuses, fetal age, and placental location. Factors such as younger age, artisan occupation, and low level of education significantly influenced the decision to check for fetal viability on bivariate analysis, but none remained significant on multivariate analysis. Concerning fetal gender determination, older age, Christianity, occupation, and gravidity were significant on bivariate analysis; however, only gravidity and occupation remained significant independent predictors in the logistic regression model. Women with fewer than 3 previous pregnancies were about 4 times more likely to request fetal sex determination than women with more than 3 previous pregnancies (OR 3.8, 95% CI 1.52-9.44). Professionals were 7 times more likely than artisans to request to find out their fetal sex (OR 7.0, 95% CI 1.47-333.20). Conclusion This study shows that Nigerian pregnant women desired prenatal ultrasonography mostly for fetal viability, followed by fetal gender determination. These preferences were influenced by their biosocial variables. PMID:19426518

  14. A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks

    PubMed Central

    Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos

    2016-01-01

    Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance, and during operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool ("ADVISES") to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. At runtime, we can check the behavior of the WSN according to the results obtained at design time, and we can detect sudden and unexpected failures in order to trigger recovery procedures. The effectiveness of the methodology is shown in two case studies, as proofs of concept, illustrating how the tool helps drive design choices and check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568

  15. Analysis of quality control data of eight modern radiotherapy linear accelerators: the short- and long-term behaviours of the outputs and the reproducibility of quality control measurements

    NASA Astrophysics Data System (ADS)

    Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu

    2006-07-01

    Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (<=1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after the commissioning but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
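
    A sketch of the drift-model fitting described above: a single-exponential trend fitted to synthetic monthly output QC measurements, followed by an estimate of the time to reach a 2% drift. Parameter values are invented; the paper fits such models to real linac QC data.

    import numpy as np
    from scipy.optimize import curve_fit

    def drift(t, a, b, tau):
        """Relative output [%] vs time t [months]: single-exponential model."""
        return a + b * (1.0 - np.exp(-t / tau))

    t = np.arange(0, 36, 1.0)
    rng = np.random.default_rng(2)
    y = drift(t, 0.0, 2.5, 12.0) + rng.normal(0, 0.2, t.size)  # synthetic QC data

    popt, _ = curve_fit(drift, t, y, p0=(0.0, 2.0, 10.0))
    a, b, tau = popt
    months_to_2pct = -tau * np.log(1.0 - (2.0 - a) / b)  # solve drift(t) = 2%
    print("fitted (a, b, tau):", popt.round(2))
    print("2% drift reached at ~", round(months_to_2pct, 1), "months")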

  16. DYNA3D/ParaDyn Regression Test Suite Inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Jerry I.

    2016-09-01

    The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of "feature" has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except for problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds, compilers change, and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcome to submit documented problems for inclusion in the test suite, especially if they heavily exercise, and depend upon, features that are currently underrepresented.

  17. Towards component-based validation of GATE: aspects of the coincidence processor.

    PubMed

    Moraes, Eder R; Poon, Jonathan K; Balakrishnan, Karthikayan; Wang, Wenli; Badawi, Ramsey D

    2015-02-01

    GATE is public domain software widely used for Monte Carlo simulation in emission tomography. Validations of GATE have primarily been performed on a whole-system basis, leaving the possibility that errors in one sub-system may be offset by errors in others. We assess the accuracy of the GATE PET coincidence generation sub-system in isolation, focusing on the options most closely modeling the majority of commercially available scanners. Independent coincidence generators were coded by teams at Toshiba Medical Research Unit (TMRU) and UC Davis. A model similar to the Siemens mCT scanner was created in GATE. Annihilation photons interacting with the detectors were recorded. Coincidences were generated using GATE, TMRU and UC Davis code and results compared to "ground truth" obtained from the history of the photon interactions. GATE was tested twice, once with every qualified single event opening a time window and initiating a coincidence check (the "multiple window method"), and once where a time window is opened and a coincidence check initiated only by the first single event to occur after the end of the prior time window (the "single window method"). True, scattered and random coincidences were compared. Noise equivalent count rates were also computed and compared. The TMRU and UC Davis coincidence generators agree well with ground truth. With GATE, reasonable accuracy can be obtained if the single window method option is chosen and random coincidences are estimated without use of the delayed coincidence option. However in this GATE version, other parameter combinations can result in significant errors. Copyright © 2014 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
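
    A minimal sketch of the "single window method" described above: a time window is opened only by the first single event after the previous window closes, and a coincidence is emitted when exactly two singles fall inside it. Event tuples, the window width, and the pairs-only policy are illustrative choices, not GATE's API.

    def single_window_coincidences(singles, window_ns=4.1):
        """singles: list of (time_ns, detector_id), sorted by time."""
        coincidences, i = [], 0
        while i < len(singles):
            t0 = singles[i][0]
            group = [s for s in singles[i:] if s[0] - t0 <= window_ns]
            if len(group) == 2:    # pairs only in this toy; multiples rejected
                coincidences.append((group[0], group[1]))
            i += len(group)        # the next window opens after this one closes
        return coincidences

    events = [(0.0, 1), (2.0, 7), (10.0, 3), (100.0, 2), (103.0, 9)]
    print(single_window_coincidences(events))  # two pairs; the lone single is dropped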

  18. Model Checker for Java Programs

    NASA Technical Reports Server (NTRS)

    Visser, Willem

    2007-01-01

    Java Pathfinder (JPF) is a verification and testing environment for Java that integrates model checking, program analysis, and testing. JPF consists of a custom-made Java Virtual Machine (JVM) that interprets bytecode, combined with a search interface to allow the complete behavior of a Java program to be analyzed, including interleavings of concurrent programs. JPF is implemented in Java, and its architecture is highly modular to support rapid prototyping of new features. JPF is an explicit-state model checker, because it enumerates all visited states and, therefore, suffers from the state-explosion problem inherent in analyzing large programs. It is suited to analyzing programs less than 10kLOC, but has been successfully applied to finding errors in concurrent programs up to 100kLOC. When an error is found, a trace from the initial state to the error is produced to guide the debugging. JPF works at the bytecode level, meaning that all of Java can be model-checked. By default, the software checks for all runtime errors (uncaught exceptions), assertions violations (supports Java s assert), and deadlocks. JPF uses garbage collection and symmetry reductions of the heap during model checking to reduce state-explosion, as well as dynamic partial order reductions to lower the number of interleavings analyzed. JPF is capable of symbolic execution of Java programs, including symbolic execution of complex data such as linked lists and trees. JPF is extensible as it allows for the creation of listeners that can subscribe to events during searches. The creation of dedicated code to be executed in place of regular classes is supported and allows users to easily handle native calls and to improve the efficiency of the analysis.

  19. The DD Check App for prevention and control of digital dermatitis in dairy herds.

    PubMed

    Tremblay, Marlène; Bennett, Tom; Döpfer, Dörte

    2016-09-15

    Digital dermatitis (DD) is the most important infectious claw disease in the cattle industry causing outbreaks of lameness. The clinical course of disease can be classified using 5 clinical stages. M-stages represent not only different disease severities but also unique clinical characteristics and outcomes. Monitoring the proportions of cows per M-stage is needed to better understand and address DD and factors influencing risks of DD in a herd. Changes in the proportion of cows per M-stage over time or between groups may be attributed to differences in management, environment, or treatment and can have impact on the future claw health of the herd. Yet trends in claw health regarding DD are not intuitively noticed without statistical analysis of detailed records. Our specific aim was to develop a mobile application (app) for persons with less statistical training, experience or supporting programs that would standardize M-stage records, automate data analysis including trends of M-stages over time, the calculation of predictions and assignments of Cow Types (i.e., Cow Types I-III are assigned to cows without active lesions, single and repeated cases of active DD lesions, respectively). The predictions were the stationary distributions of transitions between DD states (i.e., M-stages or signs of chronicity) in a class-structured multi-state Markov chain population model commonly used to model endemic diseases. We hypothesized that the app can be used at different levels of record detail to discover significant trends in the prevalence of M-stages that help to make informed decisions to prevent and control DD on-farm. Four data sets were used to test the flexibility and value of the DD Check App. The app allows easy recording of M-stages in different environments and is flexible in terms of the users' goals and the level of detail used. Results show that this tool discovers trends in M-stage proportions, predicts potential outbreaks of DD, and makes comparisons among Cow Types, signs of chronicity, scorers or pens. The DD Check App also provides a list of cows that should be treated augmented by individual Cow Types to help guide treatment and determine prognoses. Producers can be proactive instead of reactive in controlling DD in a herd by using this app. The DD Check App serves as an example of how technology makes knowledge and advice of veterinary epidemiology widely available to monitor, control and prevent this complex disease. Copyright © 2016 Elsevier B.V. All rights reserved.
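
    A sketch of the prediction step described above: the long-run (stationary) distribution of a Markov chain over DD states. The 3-state transition matrix below is invented for illustration; the app estimates transitions among M-stages from herd records.

    import numpy as np

    P = np.array([[0.90, 0.08, 0.02],   # rows: from-state, cols: to-state
                  [0.30, 0.50, 0.20],
                  [0.10, 0.30, 0.60]])

    # stationary distribution: left eigenvector of P for eigenvalue 1
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi /= pi.sum()
    print("stationary proportions per state:", pi.round(3))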

  20. A Semiautomated Journal Check-In and Binding System; or Variations on a Common Theme

    PubMed Central

    Livingston, Frances G.

    1967-01-01

    The journal check-in project described here, though based on a computerized system, uses only unit-record equipment and is designed for the medium-sized library. The frequency codes used are based on the date printed on the journal rather than on the expected date of receipt, which allows for more stability in the coding scheme. The journal's volume number and issue number, which in other systems are usually predetermined by a computer, are inserted at the time of check-in. Routine claiming of overdue issues and a systematic binding schedule have also been developed as by-products. PMID:6041836

  1. Method of validating measurement data of a process parameter from a plurality of individual sensor inputs

    DOEpatents

    Scarola, Kenneth; Jamison, David S.; Manazir, Richard M.; Rescorl, Robert L.; Harmon, Daryl L.

    1998-01-01

    A method for generating a validated measurement of a process parameter at a point in time by using a plurality of individual sensor inputs from a scan of said sensors at said point in time. The sensor inputs from said scan are stored, and a first validation pass is initiated by computing an initial average of all stored sensor inputs. Each sensor input is deviation-checked by comparing each input, including a preset tolerance, against the initial average. If the first deviation check is unsatisfactory, the sensor which produced the unsatisfactory input is flagged as suspect. It is then determined whether at least two of the inputs have not been flagged as suspect and are therefore considered good inputs. If two or more inputs are good, a second validation pass is initiated by computing a second average of all the good sensor inputs, and deviation-checking the good inputs by comparing each good input, including a preset tolerance, against the second average. If the second deviation check is satisfactory, the second average is displayed as the validated measurement and the suspect sensor is flagged as bad. A validation fault occurs if at least two inputs are not considered good, or if the second deviation check is not satisfactory. In the latter situation the inputs from all the sensors are compared against the last validated measurement, and the value from the sensor input that deviates the least from the last valid measurement is displayed.
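
    A direct sketch of the two-pass validation just described, in Python. The tolerance and readings are illustrative; the patent applies a preset tolerance per sensor.

    def validate(inputs, tol, last_valid=None):
        """inputs: sensor readings from one scan. Returns (validated_value,
        bad_sensor_indices), or falls back to the reading closest to the
        last validated measurement on a validation fault."""
        first_avg = sum(inputs) / len(inputs)
        good = [i for i, x in enumerate(inputs) if abs(x - first_avg) <= tol]
        if len(good) >= 2:
            second_avg = sum(inputs[i] for i in good) / len(good)
            if all(abs(inputs[i] - second_avg) <= tol for i in good):
                bad = [i for i in range(len(inputs)) if i not in good]
                return second_avg, bad
        # validation fault: use the input closest to the last valid measurement
        if last_valid is not None:
            nearest = min(range(len(inputs)), key=lambda i: abs(inputs[i] - last_valid))
            return inputs[nearest], []
        raise ValueError("validation fault and no prior valid measurement")

    print(validate([10.1, 10.0, 9.9, 14.2], tol=1.2))  # -> (10.0, [3])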

  2. A comparison of two tools to screen potentially inappropriate medication in internal medicine patients.

    PubMed

    Blanc, A-L; Spasojevic, S; Leszek, A; Théodoloz, M; Bonnabry, P; Fumeaux, T; Schaad, N

    2018-04-01

    Potentially inappropriate medication (PIM) is an important issue for inpatient management; it has been associated with safety problems, such as increases in adverse drug events, and with longer hospital stays and higher healthcare costs. To compare two PIM-screening tools, STOPP/START and PIM-Check, applied to internal medicine patients. A second objective was to compare the use of PIMs in readmitted and non-readmitted patients. A retrospective observational study, in the general internal medicine ward of a Swiss non-university hospital. We analysed a random sample of 50 patients, hospitalized in 2013, whose readmission within 30 days of discharge had been potentially preventable, and compared them to a sample of 50 sex- and age-matched patients who were not readmitted. PIMs were screened using the STOPP/START tool, developed for geriatric patients, and the PIM-Check tool, developed for internal medicine patients. The time needed to perform each patient's analysis was measured. A clinical pharmacist counted and evaluated each PIM detected, based on its clinical relevance to the individual patient's case. The rates of screened and validated PIMs involving readmitted and non-readmitted patients were compared. Across the whole population, PIM-Check and STOPP/START detected 1348 and 537 PIMs, respectively, representing 13.5 and 5.4 PIMs per patient. Screening time was substantially shorter with PIM-Check than with STOPP/START (4 vs 10 minutes, respectively). The clinical pharmacist judged that 45% and 42% of the PIMs detected using PIM-Check and STOPP/START, respectively, were clinically relevant to individual patients' cases. No significant differences in the rates of detected and clinically relevant PIMs were found between readmitted and non-readmitted patients. Internal medicine patients are frequently prescribed PIMs. PIM-Check's PIM detection rate was three times higher than STOPP/START's, and its screening time was shorter thanks to its electronic interface. Nearly half of the PIMs detected were judged to be non-clinically relevant, however, potentially overalerting the prescriber. These tools can, nevertheless, be considered useful in daily practice. Furthermore, the relevance of any PIM detected by these tools should always be carefully evaluated within the clinical context surrounding the individual patient. © 2017 John Wiley & Sons Ltd.

  3. SU-G-201-17: Verification of Dose Distributions From High-Dose-Rate Brachytherapy Ir-192 Source Using a Multiple-Array-Diode-Detector (MapCheck2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harpool, K; De La Fuente Herman, T; Ahmad, S

    Purpose: To investigate quantitatively the accuracy of dose distributions for the Ir-192 high-dose-rate (HDR) brachytherapy source calculated by the Brachytherapy-Planning system (BPS) and measured using a multiple-array-diode-detector in a heterogeneous medium. Methods: A two-dimensional diode-array-detector system (MapCheck2) was scanned with a catheter, and the CT images were loaded into the Varian Brachytherapy-Planning system, which uses the TG-43 formalism for dose calculation. Treatment plans were calculated for different combinations of one dwell position with varying irradiation times, and different dwell positions with fixed irradiation time, with the source placed 12 mm from the diode-array plane. The calculated dose distributions were compared to the doses measured with MapCheck2, delivered by an Ir-192 source from a Nucletron Microselectron-V2 remote afterloader. The linearity of MapCheck2 was tested for a range of dwell times (2-600 seconds). The angular effect was tested with 30 seconds of irradiation delivered to the central diode and then moving the source away in increments of 10 mm. Results: Large differences were found between calculated and measured dose distributions. These differences are mainly due to the absence of heterogeneity corrections in the dose calculation and diode artifacts in the measurements. The dose differences between measured and calculated doses due to heterogeneity ranged from 5%-12%, depending on the position of the source relative to the diodes in MapCheck2 and the different heterogeneities in the beam path. The linearity test of the diode detector showed 3.98%, 2.61%, and 2.27% over-response at short irradiation times of 2, 5, and 10 seconds, respectively, and within 2% for 20 to 600 seconds (p-value=0.05), which depends strongly on MapCheck2 noise. The angular dependency was more pronounced at acute angles, ranging up to 34% at 5.7 degrees. Conclusion: Large deviations between measured and calculated dose distributions for HDR brachytherapy with Ir-192 may be reduced by considering medium heterogeneity and the dose artifacts of the diodes. This study demonstrates that multiple-array-diode-detectors provide a practical and accurate dosimeter to verify doses delivered from the brachytherapy Ir-192 source.

  4. Neighborhood social capital is associated with participation in health checks of a general population: a multilevel analysis of a population-based lifestyle intervention - the Inter99 study.

    PubMed

    Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta

    2015-07-22

    Participation in population-based preventive health checks has declined over the past decades, and more research is needed to determine the factors that enhance participation. The objective of this study was to examine the association between two measures of neighborhood-level social capital and participation in the health check phase of a population-based lifestyle intervention. The study population comprised 12,568 residents of 73 Danish neighborhoods in the intervention group of a large population-based lifestyle intervention study, the Inter99. Two measures of social capital were applied: informal socializing and voting turnout. In a multilevel analysis adjusting only for age and sex, a higher level of neighborhood social capital was associated with a higher probability of participating in the health check. Inclusion of both individual socioeconomic position and neighborhood deprivation in the model attenuated the coefficients for informal socializing, while voting turnout became non-significant. In sum, a higher level of neighborhood social capital was associated with a higher probability of participating in the health check phase of a population-based lifestyle intervention, and most of this association can be explained by differences in individual socioeconomic position and level of neighborhood deprivation. Nonetheless, there seems to be some residual association between social capital and health check participation, suggesting that activating social relations in the community may be an avenue for boosting participation rates in population-based health checks. ClinicalTrials.gov (registration no. NCT00289237).
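
    A minimal sketch of the regression idea follows, using a logistic model with neighborhood-clustered standard errors as a simple stand-in for the paper's full multilevel model. All data below are synthetic and the variable names are invented.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n, n_hoods = 2000, 73
    df = pd.DataFrame({
        "neighborhood": rng.integers(0, n_hoods, n),
        "age": rng.uniform(30, 60, n),
        "sex": rng.integers(0, 2, n),
    })
    # Neighborhood-level social capital (e.g., an informal-socializing score).
    capital = rng.normal(0, 1, n_hoods)
    df["informal_socializing"] = capital[df["neighborhood"]]
    logit_p = -0.5 + 0.3 * df["informal_socializing"] + 0.01 * (df["age"] - 45)
    df["participated"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # Logistic regression with standard errors clustered on neighborhood,
    # approximating the multilevel adjustment for age and sex.
    fit = smf.logit("participated ~ informal_socializing + age + sex", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["neighborhood"]}, disp=False
    )
    print(fit.summary())
    ```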

  5. Microscopic analysis and simulation of check-mark stain on the galvanized steel strip

    NASA Astrophysics Data System (ADS)

    So, Hongyun; Yoon, Hyun Gi; Chung, Myung Kyoon

    2010-11-01

    When galvanized steel strip is produced through a continuous hot-dip galvanizing process, the thickness of the adhered zinc film is controlled by a plane impinging gas jet referred to as an "air-knife system". In this gas-jet wiping process, a check-mark (or sag-line) stain frequently appears: a defect of non-uniform zinc coating showing oblique patterns such as "W", "V" or "X" on the coated surface. The present paper analyses the cause of check-mark formation and numerically simulates the sag lines using data produced by a Large Eddy Simulation (LES) of the three-dimensional compressible turbulent flow field around the air-knife system. It was found that there are alternating plane-wise vortices near the impinging stagnation region, and that these alternating vortices move almost periodically to the right and left along the stagnation line due to the jet flow instability. To simulate the check-mark formation, a novel perturbation model was developed to predict the variation of coating thickness along the transverse direction. Finally, the three-dimensional zinc coating surface was obtained with the present perturbation model. The sag-line formation is determined by the combination of the instantaneous coating-thickness distribution along the transverse direction near the stagnation line and the feed speed of the steel strip.
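
    The mechanism lends itself to a toy numerical sketch: a periodically wandering stagnation line modulates the instantaneous transverse coating thickness, and the strip feed speed maps that time history into oblique stripes. All names and parameters below are invented for illustration, not taken from the paper's model.

    ```python
    import numpy as np

    width, length = 200, 400            # grid points across / along the strip
    y = np.linspace(0, 1, width)        # normalized transverse position
    feed_speed, freq, amp = 1.0, 8.0, 0.15

    t = np.arange(length) / feed_speed  # time each strip row passed the jet
    phase = np.sin(2 * np.pi * freq * t)  # periodic left-right jet wander
    # 2D coating map: transverse thickness profile shifted by the jet wander,
    # swept along the strip; plotting it shows W/V/X-like oblique patterns.
    thickness = 1.0 + amp * np.sin(2 * np.pi * 3 * y[None, :] + 2.5 * phase[:, None])
    print(f"coating map {thickness.shape}, min {thickness.min():.2f}, max {thickness.max():.2f}")
    ```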

  6. Predictors of Health Service Utilization Among Older Men in Jamaica.

    PubMed

    Willie-Tyndale, Douladel; McKoy Davis, Julian; Holder-Nevins, Desmalee; Mitchell-Fearon, Kathryn; James, Kenneth; Waldron, Norman K; Eldemire-Shearer, Denise

    2018-01-03

    To determine the relative influence of sociodemographic, socioeconomic, psychosocial, and health variables on health service utilization in the last 12 months. Data were analyzed for 1,412 men ≥60 years old from a 2012 nationally representative community-based survey in Jamaica. Associations between six health service utilization variables and several explanatory variables were explored. Logistic regression models were used to identify independent predictors of each utilization measure and to determine the strengths of associations. More than 75% of respondents reported having health visits and blood pressure checks. Blood sugar (69.6%) and cholesterol (63.1%) checks were less common, and a prostate check (35.1%) was the least utilized service. Adjusted models confirmed that the presence of chronic diseases and health insurance most strongly predicted utilization. Having a daughter or son as the main source of financial support (vs self) doubled or tripled, respectively, the odds of routine doctors' visits. Compared with primary or lower education, tertiary education more than doubled the odds of a blood pressure check (OR 2.37, 95% CI 1.12-4.95). Regular attendance at club/society/religious organizations' meetings increased the odds of having a prostate check by 45%. Although need and financial resources most strongly influenced health service utilization, psychosocial variables may be particularly influential for underutilized services. © The Author(s) 2018. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. A computational study of liposome logic: towards cellular computing from the bottom up

    PubMed Central

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc. as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution in this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of “bottom-up” liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT gates and NAND gates to SR-Latches, D Flip-Flops all the way to 3 bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscaled pipeline composed of dissipative particle dynamics (DPD) simulator and Gillespie’s stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings to bear formal specifications, DPD, SSA and model checking to the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed thus for the first time suggesting a potentially realistic physiochemical implementation for membrane computing from the bottom-up. PMID:21886681
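
    To make the SSA step concrete, here is a minimal Gillespie simulation of a toy transcriptional NOT gate: an output protein is produced only when an input repressor is absent, and degrades at a constant per-molecule rate. The reaction scheme and rates are illustrative, not the paper's actual models.

    ```python
    import numpy as np

    def gillespie_not_gate(repressor, t_end=500.0, seed=0):
        """Gillespie SSA for a toy NOT gate: protein P is produced at rate
        k_on when the repressor is absent and degrades at rate k_deg * P."""
        rng = np.random.default_rng(seed)
        k_on, k_deg = 2.0, 0.05
        t, p, history = 0.0, 0, [(0.0, 0)]
        while t < t_end:
            a_prod = 0.0 if repressor else k_on
            a_deg = k_deg * p
            a_tot = a_prod + a_deg
            if a_tot == 0:
                break                       # no reaction can fire
            t += rng.exponential(1.0 / a_tot)  # time to next reaction
            p += 1 if rng.random() < a_prod / a_tot else -1
            history.append((t, p))
        return history

    print(gillespie_not_gate(repressor=False)[-1])  # high output: logic 1
    print(gillespie_not_gate(repressor=True)[-1])   # stays at 0: logic 0
    ```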

  8. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing shows large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based mask process correction (MPC), model-based MPC and, eventually, model-based MDP. The new MDP methods may place shot edges slightly off-target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for assist features. Such an alteration generally produces better masks, but traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities, and a meaningful simulation-based mask check needs a good mask process model. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting an appropriate detection threshold. In this paper, we demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification practical.
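
    The core of an edge-placement-error (EPE) check is simple to state: compare simulated contour positions against design targets and flag sites beyond tolerance. The sketch below is a toy version of that idea only; the edge values and the 3 nm tolerance are invented, and this is in no way D2S's actual implementation.

    ```python
    import numpy as np

    target_edges = np.array([100.0, 160.0, 220.0, 280.0])     # design targets (nm)
    simulated_edges = np.array([101.2, 158.1, 220.4, 284.9])  # from a mask model (nm)

    epe = simulated_edges - target_edges
    tolerance = 3.0  # nm, illustrative
    for i in np.flatnonzero(np.abs(epe) > tolerance):
        print(f"EPE hotspot at edge {i}: {epe[i]:+.1f} nm")
    ```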

  9. Vehicular traffic noise prediction using soft computing approach.

    PubMed

    Singh, Daljeet; Nigam, S P; Agrawal, V P; Kumar, Maneek

    2016-12-01

    A new approach to the development of vehicular traffic noise prediction models is presented. Four different soft computing methods, namely Generalized Linear Models, Decision Trees, Random Forests and Neural Networks, were used to develop models predicting the hourly equivalent continuous sound pressure level, Leq, at different locations in the city of Patiala, India. The input variables are the traffic volume per hour, the percentage of heavy vehicles, and the average vehicle speed. The performance of the four models is compared on the basis of the coefficient of determination, mean square error, and accuracy. Ten-fold cross-validation is performed to check the stability of the Random Forest model, which gave the best results, and a t-test is performed to check the fit of the model to the field data. Copyright © 2016 Elsevier Ltd. All rights reserved.
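
    A minimal sketch of the Random Forest stability check described above, using scikit-learn's 10-fold cross-validation. The data here are synthetic stand-ins for the field measurements, with an invented noise-generating formula.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in: volume/h, % heavy vehicles, avg speed -> Leq (dBA).
    rng = np.random.default_rng(7)
    n = 500
    X = np.column_stack([
        rng.uniform(100, 3000, n),   # vehicles per hour
        rng.uniform(0, 40, n),       # % heavy vehicles
        rng.uniform(20, 70, n),      # average speed (km/h)
    ])
    leq = 55 + 8 * np.log10(X[:, 0]) + 0.1 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 1.5, n)

    # 10-fold cross-validation; per-fold R^2 indicates model stability.
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, leq, cv=10, scoring="r2")
    print(f"R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```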

  10. A generalized statistical model for the size distribution of wealth

    NASA Astrophysics Data System (ADS)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2012-12-01

    In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. To check the validity of the proposed model, we analyze US household wealth distributions from 1984 to 2009 and find excellent agreement with the data, superior to that of any other model known in the literature.
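
    For orientation, the building block of this family is the Kaniadakis κ-exponential, and the earlier income model's CDF takes the form F(x) = 1 - exp_κ(-βx^α). Treat the exact parameterization below as an assumption based on the authors' earlier income paper, not a statement of this paper's wealth model.

    ```python
    import numpy as np

    def exp_kappa(u, kappa):
        """Kaniadakis kappa-exponential; reduces to exp(u) as kappa -> 0."""
        return (np.sqrt(1 + kappa**2 * u**2) + kappa * u) ** (1 / kappa)

    def kappa_generalized_cdf(x, alpha, beta, kappa):
        """Assumed CDF of the kappa-generalized income distribution:
        F(x) = 1 - exp_kappa(-beta * x**alpha). Parameters illustrative."""
        return 1 - exp_kappa(-beta * x**alpha, kappa)

    x = np.linspace(0.01, 5, 5)
    print(kappa_generalized_cdf(x, alpha=2.0, beta=0.5, kappa=0.7))
    ```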

  11. Engineered plant biomass feedstock particles

    DOEpatents

    Dooley, James H [Federal Way, WA; Lanning, David N [Federal Way, WA; Broderick, Thomas F [Lake Forest Park, WA

    2011-10-11

    A novel class of flowable biomass feedstock particles with unusually large surface areas is disclosed; the particles can be manufactured in remarkably uniform sizes using low-energy comminution techniques. The feedstock particles are roughly parallelepiped in shape and characterized by a length dimension (L) aligned substantially with the grain direction and defining a substantially uniform distance along the grain, a width dimension (W) normal to L and aligned cross grain, and a height dimension (H) normal to W and L. The particles exhibit a disrupted grain structure with prominent end and surface checks that greatly enhances their skeletal surface area as compared to their envelope surface area. The L×H dimensions define a pair of substantially parallel side surfaces characterized by substantially intact longitudinally arrayed fibers. The W×H dimensions define a pair of substantially parallel end surfaces characterized by crosscut fibers and end checking between fibers. The L×W dimensions define a pair of substantially parallel top surfaces characterized by some surface checking between longitudinally arrayed fibers. The feedstock particles are manufactured from a variety of plant biomass materials, including wood, crop residues, plantation grasses, hemp, bagasse, and bamboo.

  12. Spatial distribution of angular momentum inside the nucleon

    NASA Astrophysics Data System (ADS)

    Lorcé, Cédric; Mantovani, Luca; Pasquini, Barbara

    2018-01-01

    We discuss in detail the spatial distribution of angular momentum inside the nucleon. We show that the discrepancies between different definitions originate from terms that integrate to zero. Even though these terms can safely be dropped at the integrated level, they have to be taken into account when discussing distributions. Using the scalar diquark model, we illustrate our results and, for the first time, check explicitly that the equivalence between kinetic and canonical orbital angular momentum persists at the level of distributions, as expected in a system without gauge degrees of freedom.

  13. Path integral pricing of Wasabi option in the Black-Scholes model

    NASA Astrophysics Data System (ADS)

    Cassagnes, Aurelien; Chen, Yu; Ohashi, Hirotada

    2014-11-01

    In this paper, using path integral techniques, we derive a formula for a propagator arising in the study of occupation-time derivatives. Using this result we derive a fair price for the cumulative Parisian option. After confirming the validity of the derived result with a Monte Carlo simulation, we investigate a new type of heavily path-dependent derivative product, derive an approximation for the fair price of this so-called Wasabi option, and check the accuracy of the result against a Monte Carlo simulation.
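
    The Monte Carlo check is straightforward to sketch. Below is a minimal pricer for a cumulative Parisian knock-out call under Black-Scholes: the option dies if the total time the path spends below a barrier exceeds an allowance D. All parameters are illustrative, and the contract convention (knock-out on cumulative occupation time) is an assumption for demonstration.

    ```python
    import numpy as np

    rng = np.random.default_rng(123)
    s0, k, barrier, r, sigma, T = 100.0, 100.0, 95.0, 0.02, 0.2, 1.0
    D = 0.1                       # allowed occupation time below barrier (years)
    n_paths, n_steps = 20_000, 252
    dt = T / n_steps

    # Exact GBM log-increments, cumulated along each path.
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    s = s0 * np.exp(log_paths)

    occupation = (s < barrier).sum(axis=1) * dt   # time spent below the barrier
    alive = occupation <= D                       # knocked out otherwise
    payoff = np.where(alive, np.maximum(s[:, -1] - k, 0.0), 0.0)
    print(f"price ~ {np.exp(-r * T) * payoff.mean():.3f}")
    ```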

  14. Extended Poisson process modelling and analysis of grouped binary data.

    PubMed

    Faddy, Malcolm J; Smith, David M

    2012-05-01

    A simple extension of the Poisson process results in binomially distributed counts of events in a time interval. A further extension generalises this to probability distributions that are under- or over-dispersed relative to the binomial distribution. Substantial levels of under-dispersion are possible with this modelling, but only modest levels of over-dispersion, up to Poisson-like variation. Although simple analytical expressions for the moments of these probability distributions are not available, approximate expressions for the mean and variance are derived and used to re-parameterise the models. The modelling is applied in the analysis of two published data sets, one showing under-dispersion and the other over-dispersion. More appropriate assessment of the precision of estimated parameters and reliable model-checking diagnostics follow from this more general modelling of these data sets. © 2012 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
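
    A quick way to see under- versus over-dispersion in grouped binary counts is to compare the empirical variance with the binomial variance n·p·(1-p). The diagnostic below is a generic illustration of that idea, not the paper's method; the hypergeometric sample stands in for under-dispersed data.

    ```python
    import numpy as np

    def dispersion_index(counts, group_size):
        """Ratio of empirical variance to binomial variance n*p*(1-p);
        < 1 indicates under-dispersion, > 1 over-dispersion."""
        p_hat = counts.mean() / group_size
        return counts.var(ddof=1) / (group_size * p_hat * (1 - p_hat))

    rng = np.random.default_rng(0)
    under = rng.hypergeometric(ngood=50, nbad=50, nsample=10, size=1000)  # under-dispersed
    binom = rng.binomial(n=10, p=0.5, size=1000)                          # reference
    print(dispersion_index(under, 10), dispersion_index(binom, 10))
    ```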

  15. Primordial black holes for the LIGO events in the axionlike curvaton model

    NASA Astrophysics Data System (ADS)

    Ando, Kenta; Inomata, Keisuke; Kawasaki, Masahiro; Mukaida, Kyohei; Yanagida, Tsutomu T.

    2018-06-01

    We review primordial black hole (PBH) formation in the axionlike curvaton model and investigate whether PBHs formed in this model could be the origin of the gravitational wave (GW) signals detected by Advanced LIGO. In this model, small-scale curvature perturbations with large amplitude are generated, which is essential for PBH formation. On the other hand, large curvature perturbations also source primordial GWs through second-order effects, and severe constraints are imposed on such GWs by pulsar timing array (PTA) experiments. We therefore also check the consistency of the model with these constraints. In this analysis, it is important to take into account the effect of non-Gaussianity, which is easily generated in the curvaton model. We find that, in the presence of non-Gaussianities, a fixed abundance of PBHs can be produced with a smaller amplitude of the primordial power spectrum.

  16. Location contexts of user check-ins to model urban geo life-style patterns.

    PubMed

    Hasan, Samiul; Ukkusuri, Satish V

    2015-01-01

    Geo-location data from social media offer us new ways to understand people's attitudes and interests through their activity choices. In this paper, we explore the idea of inferring individual life-style patterns from the activity-location choices revealed in social media. We present a model to understand life-style patterns using the contextual information (e.g., location categories) of user check-ins. Probabilistic topic models are developed to infer individual geo life-style patterns from two perspectives: i) characterizing the patterns of user interests in different types of places, and ii) characterizing the patterns of user visits to different neighborhoods. The method is applied to a dataset of Foursquare check-ins by users from New York City. The co-existence of several location contexts, and their corresponding probabilities in a given pattern, provide useful information about user interests and choices. We find that geo life-style patterns contain similar items: either nearby neighborhoods or similar location categories. The semantic and geographic proximity of the items in a pattern reflects the hidden regularity in user preferences and location choice behavior.
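
    A toy version of this idea is easy to demonstrate with an off-the-shelf topic model: treat each user as a "document" whose "words" are the location categories of their check-ins, and let LDA recover latent life-style patterns. The data below are invented, and this is a generic LDA sketch rather than the paper's exact model.

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Each string is one user's check-in category history.
    users = [
        "coffee office gym coffee office salad_bar",
        "bar club late_night_food bar club",
        "playground grocery school grocery playground",
        "office coffee office gym coffee",
        "club bar late_night_food club bar",
    ]
    X = CountVectorizer().fit_transform(users)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    print(lda.transform(X).round(2))  # per-user mixture over life-style patterns
    ```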

  17. Constraint-Based Abstract Semantics for Temporal Logic: A Direct Approach to Design and Implementation

    NASA Astrophysics Data System (ADS)

    Banda, Gourinath; Gallagher, John P.

    Abstract interpretation provides a practical approach to verifying properties of infinite-state systems. We apply the framework of abstract interpretation to derive an abstract semantic function for the modal μ-calculus, which is the basis for abstract model checking. The abstract semantic function is constructed directly from the standard concrete semantics together with a Galois connection between the concrete state-space and an abstract domain. There is no need for mixed or modal transition systems to abstract arbitrary temporal properties, as in previous work in the area of abstract model checking. Using the modal μ-calculus to implement CTL, the abstract semantics gives an over-approximation of the set of states in which an arbitrary CTL formula holds. We then show that this leads directly to an effective implementation of an abstract model checking algorithm for CTL using abstract domains based on linear constraints. The implementation of the abstract semantic function makes use of an SMT solver. We describe an implemented system for proving properties of linear hybrid automata and give some experimental results.
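
    At the concrete level, the semantics of a CTL property like EF(goal) is a least fixpoint over the transition relation; the paper lifts exactly this kind of iteration to constraint-based abstract domains. The sketch below computes that fixpoint concretely on a tiny finite Kripke structure (the structure is invented for illustration).

    ```python
    # States 0..3; transitions[s] is the set of successors of s.
    transitions = {0: {1}, 1: {2}, 2: {2}, 3: {0}}
    goal = {2}

    def ef(goal, transitions):
        """Least fixpoint: all states from which some path reaches 'goal'."""
        reach = set(goal)
        changed = True
        while changed:
            changed = False
            for s, succs in transitions.items():
                if s not in reach and succs & reach:
                    reach.add(s)       # s has a successor already in the fixpoint
                    changed = True
        return reach

    print(sorted(ef(goal, transitions)))  # -> [0, 1, 2, 3]
    ```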

  18. Inkjet 3D printed check microvalve

    NASA Astrophysics Data System (ADS)

    Walczak, Rafał; Adamski, Krzysztof; Lizanets, Danylo

    2017-04-01

    3D printing enables fast and relatively easy fabrication of various microfluidic structures, including microvalves. A check microvalve is the simplest valve for controlling fluid flow in microchannels. Proper operation of a check valve is ensured by a movable element that seals the valve seat during backward flow and allows free flow under forward pressure. Knowledge of the mechanical properties of the movable element is therefore crucial for the optimal design and operation of the valve. In this paper, we present for the first time the results of investigations of the basic mechanical properties of the building material used in multijet 3D printing. The measured mechanical properties were used in the design and fabrication of two types of check microvalve, with a deflecting or hinge-fixed microflap of 200 µm or 300 µm thickness. Numerical simulations of the microflap deflection were compared with experimental data. The valves were successfully 3D printed and characterised, and their opening/closing characteristics for forward and backward pressures were determined, confirming proper operation of the check microvalves so developed.
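
    A back-of-the-envelope estimate of flap stiffness can use small-deflection cantilever beam theory, δ = F·L³/(3·E·I). The flap thickness below matches the 200-300 µm range above, but the dimensions, load, and modulus are illustrative stand-ins for the measured print-material properties, and this simple formula is not the paper's numerical model.

    ```python
    def flap_tip_deflection(force_n, length_m, width_m, thickness_m, e_pa):
        """Cantilever tip deflection delta = F*L^3 / (3*E*I)."""
        i = width_m * thickness_m**3 / 12.0   # second moment of area
        return force_n * length_m**3 / (3.0 * e_pa * i)

    delta = flap_tip_deflection(
        force_n=1e-3, length_m=2e-3, width_m=1e-3, thickness_m=200e-6, e_pa=1.5e9
    )
    print(f"tip deflection ~ {delta * 1e6:.1f} um")
    ```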

  19. iss049e040733

    NASA Image and Video Library

    2016-10-19

    ISS049e040733 (10/19/2016) --- NASA astronaut Kate Rubins is pictured inside of the Soyuz MS-01 spacecraft while conducting routine spacesuit checks. Rubins, suited up in a Russian Sokol Launch and Entry suit, was conducting leak checks in advance of her upcoming landing along with Japanese astronaut Takuya Onishi and Russian cosmonaut Anatoly Ivanishin. The trio are scheduled to land Oct. 29, U.S. time.

  20. Finite-Time Performance of Local Search Algorithms: Theory and Application

    DTIC Science & Technology

    2010-06-10

    security devices deployed at airport security checkpoints are used to detect prohibited items (e.g., guns, knives, explosives). Each security device...security devices are deployed, the practical issue of determining how to optimally use them can be difficult. For an airport security system design...checked baggage), explosive detection systems (designed to detect explosives in checked baggage), and detailed hand search by an airport security official
